Case No. 3:25-cv-07924 (N.D. Cal.)

Artificial Intelligence Industry Association Inc. v. Ceres AI Inc.

Key Event: Amended Complaint

I. Executive Summary and Procedural Information

  • Parties: Plaintiff Artificial Intelligence Industry Association Inc.; Defendant Ceres AI Inc.
  • Case Identification: 3:25-cv-07924, N.D. Cal., 12/10/2025
  • Venue Allegations: Venue is based on Defendant maintaining a principal office and "regular and established place of business" in Emeryville, California, within the district, and committing alleged acts of infringement in the district.
  • Core Dispute: Plaintiff alleges that Defendant’s aerial imaging and AI-powered data analytics platforms for precision agriculture infringe a patent related to embedding time-synchronized calibration metadata into stereoscopic video files.
  • Technical Context: The technology addresses the challenge of synchronizing sensor data (e.g., camera parameters, location) with corresponding video frames, which is critical for accurate 3D rendering and data analysis in fields like virtual reality and agricultural monitoring.
  • Key Procedural History: The complaint states that prior to filing suit, Plaintiff sent Defendant a formal demand letter identifying the patent-in-suit and its alleged infringement, and also offered a license. This notification is cited as a basis for the willful infringement allegation.

**Case Timeline**

| Date | Event |
|------------|------------------------------------------|
| 2015-04-29 | U.S. Patent No. 10,075,693 Priority Date |
| 2018-09-11 | U.S. Patent No. 10,075,693 Issued |
| 2025-12-10 | First Amended Complaint Filed |

II. Technology and Patent(s)-in-Suit Analysis

U.S. Patent No. 10,075,693 - *"Embedding Calibration Metadata Into Stereoscopic Video Files"*

  • Patent Identification: U.S. Patent No. 10,075,693, titled “Embedding Calibration Metadata Into Stereoscopic Video Files,” issued September 11, 2018 (‘693 Patent).

The Invention Explained

  • Problem Addressed: When playing back stereoscopic 3D video, the playback device requires specific camera and sensor parameters to properly render the images. These parameters can vary between cameras and may even change during a single recording session (e.g., due to sensor readings from a gyroscope). The patent describes the difficulty of ensuring this essential metadata remains perfectly synchronized with the corresponding video frames, especially when video from different sources is combined ('693 Patent, col. 1:41-65).
  • The Patented Solution: The invention proposes a system to embed "camera, sensor, and processing parameters" directly into the video file in real-time during capture ('693 Patent, col. 1:53-56). The core technical approach involves encoding the metadata into time-sequenced fields within the video file format, such as subtitle or closed-captioning tracks, which are inherently synchronized with the video frames, thereby preserving the precise timing alignment between the data and the video ('693 Patent, FIG. 11; col. 9:18-23). A minimal illustrative sketch of this encoding approach appears after this list.
  • Technical Importance: This method provides a standardized way to package sensor and calibration data with video, which may simplify the development of applications that rely on accurate, time-aligned metadata for functions like 3D rendering in virtual reality or data-driven analysis of video content ('693 Patent, col. 2:7-14).
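
To make the claimed mechanism concrete, the following is a minimal sketch of the general technique the specification describes: serializing per-frame calibration metadata into time-coded subtitle cues so that each record inherits the timing of the cue that carries it. The choice of WebVTT as the concrete subtitle format, the field names, the frame rate, and the file name are illustrative assumptions; this is not a reconstruction of the patentee's or Defendant's actual software.

```python
# Illustrative sketch only: per-frame calibration metadata written as WebVTT
# subtitle cues, so each record is carried in a time-coded field of the kind
# the claim recites. Field names, values, and file names are hypothetical.
import json

def vtt_timestamp(seconds: float) -> str:
    """Format seconds as a WebVTT HH:MM:SS.mmm timestamp."""
    hours, rem = divmod(seconds, 3600)
    minutes, secs = divmod(rem, 60)
    return f"{int(hours):02d}:{int(minutes):02d}:{secs:06.3f}"

def write_calibration_track(frames, path="calibration.vtt", fps=30.0):
    """Write one subtitle cue per frame; the cue timing carries the metadata timing."""
    with open(path, "w", encoding="utf-8") as vtt:
        vtt.write("WEBVTT\n\n")
        for i, meta in enumerate(frames):
            start, end = i / fps, (i + 1) / fps
            vtt.write(f"{vtt_timestamp(start)} --> {vtt_timestamp(end)}\n")
            vtt.write(json.dumps(meta) + "\n\n")

# Hypothetical calibration records captured contemporaneously with two frames.
write_calibration_track([
    {"focal_length_mm": 4.2, "baseline_mm": 65.0, "gyro_rad_s": [0.01, -0.02, 0.00]},
    {"focal_length_mm": 4.2, "baseline_mm": 65.0, "gyro_rad_s": [0.02, -0.01, 0.00]},
])
```

A container muxer could then carry such a track as an ordinary subtitle stream alongside the video, which is the synchronization-by-construction the patent attributes to subtitle and closed-captioning fields.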

Key Claims at a Glance

  • The complaint asserts at least independent Claim 1 (Compl. ¶25).
  • The essential elements of Claim 1 are:
    • A computer store containing a stereoscopic video feed and contemporaneous metadata feeds, including calibration metadata.
    • A computer processor programmed to obtain both the video and metadata feeds.
    • The processor is programmed to embed the metadata into the video feed in real-time as it is recorded.
    • The embedding is performed by encoding the metadata into subtitle or closed-captioning metadata fields of the video file format.
    • This encoding method ensures the timing of the metadata fields conveys the timing of the metadata itself (see the read-back sketch following this list).
  • The complaint does not explicitly reserve the right to assert dependent claims.
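
As referenced above, a short read-back sketch illustrates how cue timing can convey metadata timing: parsing the hypothetical WebVTT track from the earlier sketch recovers each calibration record at the start time of the cue that carried it. The parsing logic and file name rest on the same illustrative assumptions as before.

```python
# Illustrative read-back of the hypothetical WebVTT track from the prior sketch:
# each metadata record is recovered with the start time of the cue that carried
# it, i.e., the subtitle timing conveys the timing of the metadata feed.
import json
import re

CUE_TIMING = re.compile(r"(\d+):(\d{2}):(\d{2}\.\d{3}) --> (\d+):(\d{2}):(\d{2}\.\d{3})")

def read_calibration_track(path="calibration.vtt"):
    records = []
    with open(path, encoding="utf-8") as vtt:
        lines = iter(vtt.read().splitlines())
        for line in lines:
            match = CUE_TIMING.match(line)
            if match:
                start = int(match[1]) * 3600 + int(match[2]) * 60 + float(match[3])
                records.append((start, json.loads(next(lines))))  # cue body is JSON
    return records

for start, meta in read_calibration_track():
    print(f"{start:.3f}s -> {meta}")
```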

III. The Accused Instrumentality

Product Identification

  • The accused instrumentalities are Defendant’s "aerial imaging systems, AI-powered data analytics platforms, computer vision technologies, and machine learning models" used for precision agriculture and crop health monitoring (Compl. ¶1, ¶17).

Functionality and Market Context

  • The complaint alleges that the accused products utilize high-resolution aerial imagery captured from aircraft or drones, incorporating "stereoscopic or 3D imaging technologies" (Compl. ¶17). These systems are alleged to capture image data using multispectral cameras and apply "rigorous calibration of imagery (using sensor metadata and environmental data)" to produce "calibrated data layers" for analysis (Compl. ¶20, ¶21).
  • The functionality is directed at the agricultural market, where these systems are used to detect crop issues, model yields, and generate analytics on irrigation efficiency and crop health (Compl. ¶1, ¶7, ¶18). The complaint alleges Defendant serves major agricultural entities (Compl. ¶7).
  • No probative visual evidence provided in complaint.

IV. Analysis of Infringement Allegations

'693 Patent Infringement Allegations

| Claim Element (from Independent Claim 1) | Alleged Infringing Functionality | Complaint Citation | Patent Citation |
|---|---|---|---|
| A computer store containing a stereoscopic video feed of two or more stereoscopic images from a stereoscopic video capture device, wherein the stereoscopic video feed comprises a plurality of contemporaneous metadata feeds, wherein a metadata feed comprises...metadata... | Defendant's systems allegedly capture image data "using multispectral stereoscopic or 3D imaging technologies" and contemporaneously generate "associated calibrated data layers" and other sensor-derived information, which are stored and processed by the system (Compl. ¶21, Ex. B). | ¶21, Ex. B | col. 2:18-24 |
| a computer processor in the stereoscopic video capture device, which computer processor is...programmed to: obtain the stereoscopic video feed from the computer store, obtain the plurality of contemporaneous metadata feeds from the computer store, and... | The complaint alleges Defendant's data-processing pipelines and workflows obtain the captured imagery and contemporaneous "sensor metadata and environmental data" for subsequent processing and embedding (Compl. ¶20-21). | ¶20-21, Ex. B | col. 2:25-29 |
| embedding, in real-time as the stereoscopic video feed is recorded, the stereoscopic video feed with the plurality of contemporaneous metadata feeds... | The complaint alleges Defendant's workflows involve "embedding calibrated data into the imagery layers as part of Ceres's real-time or near-real-time agricultural imaging system" (Compl. Ex. B). | Ex. B | col. 2:29-31 |
| wherein the plurality of contemporaneous metadata feeds is embedded into the stereoscopic video feed by encoding the contemporaneous metadata feeds into the subtitles or closed captioning metadata fields of the video file format, such that the timing of the subtitle or closed captioning metadata conveys the timing of the metadata feeds. | The complaint alleges that Defendant's embedding of calibrated data into video files is performed in a manner "consistent with the encoding and timing-alignment requirements recited in the claim," maintaining alignment between the metadata and the video feed (Compl. Ex. B, p. 55). | Ex. B, p. 55 | col. 9:18-23 |
  • Identified Points of Contention:
    • Scope Questions: A central question may be whether the "calibrated data layers" and analytical outputs (e.g., "water-stress indices") generated by Defendant's agricultural platform can be construed to meet the "stereoscopic video feed" limitation of the patent, which is described in the context of 3D video for visual display ('693 Patent, col. 1:15-20).
    • Technical Questions: The complaint alleges infringement of the "subtitle or closed captioning" limitation in a conclusory manner (Compl. Ex. B, p. 55). A key factual dispute may arise over the specific technical mechanism Defendant uses to associate metadata with its imagery: whether Defendant's systems actually encode data into subtitle/closed-captioning fields, or whether they use an alternative, non-infringing method of data association within their files (the two patterns are contrasted in the sketch below).
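
To frame that factual question concretely, the sketch below contrasts the two association patterns using hypothetical data: metadata serialized as a time-coded subtitle/caption cue of the kind a muxer would place in a container's caption track, versus metadata kept in a sidecar file keyed to frame index with no use of caption fields at all. Neither pattern is asserted to reflect how Ceres AI's systems actually work; the values and file names are invented for illustration.

```python
# Hypothetical contrast of the two association patterns in dispute.
import json

meta = {"frame": 0, "reflectance_gain": 1.07, "gps": [38.59, -121.49]}  # hypothetical

# Pattern A (tracks the claim language): metadata serialized as a time-coded
# subtitle/caption cue of the sort a muxer would place in a caption track.
caption_cue = "00:00:00.000 --> 00:00:00.033\n" + json.dumps(meta)

# Pattern B (one possible alternative): metadata kept in a sidecar file keyed
# by frame index, with no use of subtitle or closed-captioning fields at all.
with open("frame_metadata.jsonl", "w", encoding="utf-8") as sidecar:
    sidecar.write(json.dumps(meta) + "\n")

print(caption_cue)
```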

V. Key Claim Terms for Construction

  • The Term: "subtitle or closed captioning metadata fields"

    • Context and Importance: This term defines the specific technical mechanism for embedding the metadata. The infringement analysis for Claim 1 may depend entirely on whether Defendant's accused systems use this exact method. Practitioners may focus on this term because it appears to be a highly specific and potentially distinguishing feature of the claimed invention.
    • Intrinsic Evidence for Interpretation:
      • Evidence for a Broader Interpretation: A party could argue that this phrase is exemplary of any time-coded metadata track within a standard video file format that achieves the same goal of synchronization. However, the claim language uses "by encoding," which suggests the method is integral, not just an example.
      • Evidence for a Narrower Interpretation: The claim language is explicit and recites a specific, well-understood technical feature of video file formats. The claim itself reinforces this by requiring that the timing of these fields "conveys the timing of the metadata feeds" (Compl. Ex. B, p. 55, quoting claim). This suggests the term should be limited to its plain and ordinary meaning.
  • The Term: "embedding, in real-time"

    • Context and Importance: This limitation defines the timing of the embedding process. Defendant may argue that its process is a near-real-time or post-capture batch process, rather than occurring "in real-time as the stereoscopic video feed is recorded," as required by the claim. A sketch contrasting these two timing patterns appears at the end of this section.
    • Intrinsic Evidence for Interpretation:
      • Evidence for a Broader Interpretation: The complaint alleges Defendant's system is a "real-time or near-real-time" system, suggesting Plaintiff may argue that "near-real-time" falls within the scope of the claim (Compl. Ex. B).
      • Evidence for a Narrower Interpretation: The patent’s description of aggregating outputs from sensors and video "in realtime" during the capture process (e.g., '693 Patent, FIG. 8, step 806) suggests a process that happens concurrently with recording, not subsequent to it.
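
The timing distinction at issue can be illustrated with a minimal stand-in: one function writes each metadata cue while the corresponding frame is being recorded, the other attaches metadata only after capture completes. The Recorder class and the data are hypothetical; the sketch illustrates only the claim-construction distinction, not any accused workflow.

```python
# Hypothetical illustration of "embedding, in real-time ... as recorded" versus
# a post-capture (batch) association; Recorder and the data are stand-ins.

class Recorder:
    """Minimal stand-in that logs the order in which frames and cues are written."""
    def __init__(self):
        self.log = []
    def write_frame(self, index):
        self.log.append(f"frame {index}")
    def write_metadata_cue(self, meta):
        self.log.append(f"cue {meta}")

def realtime_embedding(frames, recorder):
    # Cue written during recording, interleaved with the frames (narrower reading).
    for index, meta in frames:
        recorder.write_frame(index)
        recorder.write_metadata_cue(meta)

def post_capture_embedding(frames, recorder):
    # All frames recorded first; metadata attached afterwards (batch/near-real-time).
    for index, _ in frames:
        recorder.write_frame(index)
    for _, meta in frames:
        recorder.write_metadata_cue(meta)

frames = [(0, {"gain": 1.0}), (1, {"gain": 1.1})]  # hypothetical capture output
for embed in (realtime_embedding, post_capture_embedding):
    recorder = Recorder()
    embed(frames, recorder)
    print(embed.__name__, recorder.log)
```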

VI. Other Allegations

  • Indirect Infringement: The complaint alleges inducement of infringement based on Defendant's "detailed technical documentation, tutorials, and customer support services" available on its website, which allegedly instruct customers to use the products in an infringing manner (Compl. ¶2). It also alleges contributory infringement, stating Defendant sells software products that are "material components of the patented invention" with "no substantial non-infringing use" (Compl. ¶3).
  • Willful Infringement: The willfulness allegation is based on alleged pre-suit knowledge. The complaint states that Plaintiff sent Defendant a "formal demand letter" and offered a license prior to filing the complaint, but Defendant "has continued to promote and, upon information and belief, expanded sales of the accused products" (Compl. ¶4).

VII. Analyst’s Conclusion: Key Questions for the Case

  • A central issue will be one of technical implementation: what is the precise mechanism used by Ceres AI to associate calibration data with its aerial imagery, and does that mechanism involve encoding data into "subtitle or closed captioning metadata fields" as explicitly required by the asserted claim?
  • A second key question will be one of claim scope: can the term “stereoscopic video feed,” which the patent describes in the context of 3D visual playback, be construed to read on the analytical "calibrated data layers" produced by Defendant’s precision agriculture platform, or is there a fundamental mismatch in the nature and purpose of the accused data product?