
6:23-cv-00194

Electronic Scripting Products Inc v. Eq3 Ltd


I. Executive Summary and Procedural Information

  • Parties: Electronic Scripting Products Inc (Plaintiff) v. Eq3 Ltd (Defendant)
  • Case Identification: 6:23-cv-00194, W.D. Tex., 03/14/2023
  • Venue Allegations: Plaintiff alleges venue is proper because Defendant is a foreign corporation, and further alleges that Defendant has a regular and established place of business in the district.
  • Core Dispute: Plaintiff alleges that Defendant’s augmented reality feature for visualizing furniture infringes patents related to determining an object's position and orientation using optical sensors and environmental features.
  • Technical Context: The technology enables devices like smartphones to understand their position and orientation in a physical space, a foundational capability for augmented reality applications in e-commerce and other fields.
  • Key Procedural History: The complaint does not mention any prior litigation, inter partes review proceedings, or licensing history related to the patents-in-suit.

Case Timeline

Date | Event
2004-01-30 | Earliest Priority Date for ’641 and ’559 Patents
2010-11-02 | U.S. Patent No. 7,826,641 Issued
2019-01-29 | U.S. Patent No. 10,191,559 Issued
2023-03-14 | Complaint Filed

II. Technology and Patent(s)-in-Suit Analysis

U.S. Patent No. 10,191,559 - "Computer Interface For Manipulated Objects With An Absolute Pose Detection Component"

  • Patent Identification: U.S. Patent No. 10,191,559, "Computer Interface For Manipulated Objects With An Absolute Pose Detection Component," issued January 29, 2019.

The Invention Explained

  • Problem Addressed: The patent's background section describes a need for a "low-cost, robust and accurate apparatus for absolute motion capture" to interface with the digital world. It notes that prior art systems relying on relative motion sensors (like inertial devices alone) suffer from accumulating "gradual drift," while systems using external cameras are often complex and computationally expensive (’559 Patent, col. 1:47-59, 5:9-19).
  • The Patented Solution: The invention is a manipulated object, such as a phone or tablet, containing an on-board "photodetector" (e.g., a camera) that detects "high optical contrast features" in the surrounding physical environment. A controller on the object processes data from the photodetector to determine the object's position and orientation (its "pose"). This optical data can be supplemented with data from auxiliary sensors, like an inertial measurement unit, to create a more robust system (’559 Patent, Abstract; col. 6:31-44).
  • Technical Importance: This approach allows a handheld device to determine its absolute position in a known environment without relying on complex external tracking systems or suffering from the drift inherent in purely inertial-based systems (’559 Patent, col. 1:21-31).
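The drift problem the patent describes, and the benefit of supplementing inertial data with absolute optical fixes, can be illustrated with a toy one-dimensional simulation. This is a hypothetical sketch, not the patent's claimed method: the 0.8/0.2 blend weight, noise levels, and update cadence are all invented for illustration.

```python
import random

# Toy 1-D example: integrating relative (inertial) steps alone
# accumulates drift from sensor bias; blending in a periodic absolute
# optical fix keeps the estimate anchored to the true position.
random.seed(0)
true_pos = 0.0
inertial_est = 0.0   # dead reckoning from relative steps only
fused_est = 0.0      # the same steps, corrected by optical fixes

for step in range(1000):
    true_pos += 0.01
    measured_step = 0.01 + random.gauss(0.001, 0.002)  # biased IMU step
    inertial_est += measured_step
    fused_est += measured_step
    if step % 10 == 0:  # periodic absolute fix from the camera
        optical_fix = true_pos + random.gauss(0.0, 0.005)
        fused_est = 0.8 * fused_est + 0.2 * optical_fix

drift_inertial = abs(inertial_est - true_pos)
drift_fused = abs(fused_est - true_pos)
print(f"inertial-only error: {drift_inertial:.3f}")
print(f"fused error:         {drift_fused:.3f}")
```

The inertial-only estimate drifts roughly in proportion to elapsed time (bias times step count), while the fused estimate's error stays bounded, which is the qualitative behavior the specification attributes to combining the photodetector with an auxiliary motion detector.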

Key Claims at a Glance

  • The complaint asserts independent Claim 1 (Compl. ¶9).
  • Claim 1 of the ’559 Patent recites a manipulated object comprising:
    • a photodetector to detect high optical contrast features and generate data about their positions;
    • a controller to identify a "derivative pattern" from that data, which is indicative of the photodetector's position; and
    • at least one other component, such as an auxiliary motion detector (e.g., an IMU), an active illumination component, or a scanning component.
  • The complaint also asserts dependent claims 6, 7, 10, 15, 16, 19, 24, and 25 (Compl. ¶19).

U.S. Patent No. 7,826,641 - "Apparatus And Method For Determining An Absolute Pose Of A Manipulated Object In A Real Three-Dimensional Environment With Invariant Features"

  • Patent Identification: U.S. Patent No. 7,826,641, "Apparatus And Method For Determining An Absolute Pose Of A Manipulated Object In A Real Three-Dimensional Environment With Invariant Features," issued November 2, 2010.

The Invention Explained

  • Problem Addressed: The patent addresses the same technical challenge as the ’559 Patent: the difficulty in achieving "one-to-one motion mapping between space and cyberspace" without a system that can digitize the absolute pose of a manipulated object. It contrasts this with prior art that relies on relative motion, which is subject to error and drift (’641 Patent, col. 1:47-2:2).
  • The Patented Solution: The invention is an apparatus that determines its absolute pose using an on-board "optical measuring means" to observe at least one "invariant feature" in the environment. A processor prepares this pose data and a communication link transmits it to an application, enabling the object's real-world movements to be translated into the digital realm (’641 Patent, Abstract; col. 2:48-67).
  • Technical Importance: The patent describes a self-contained system on a manipulated object for determining its absolute position and orientation by referencing features in the external world, forming a basis for direct and intuitive user interfaces (’641 Patent, col. 1:20-27).

Key Claims at a Glance

  • The complaint asserts independent Claim 1 (Compl. ¶14).
  • Claim 1 of the ’641 Patent recites an apparatus comprising:
    • at least one invariant feature in the environment;
    • an on-board optical measuring means for inferring the apparatus's absolute pose using the invariant feature;
    • a processor for preparing and identifying a subset of the absolute pose data; and
    • a communication link for transmitting that subset to an application.
  • The complaint also asserts dependent claim 29 (Compl. ¶32).

III. The Accused Instrumentality

Product Identification

  • The "Accused Products" are the augmented reality (AR) and 3D features on Defendant EQ3's website and mobile applications (Compl. ¶9, ¶15).

Functionality and Market Context

  • The accused functionality allows a user to visualize EQ3's furniture products within their own physical space using a smartphone or tablet (Compl. ¶9). The complaint alleges this is accomplished by using the device's camera and underlying AR frameworks, such as Apple's ARKit or Google's ARCore (Compl. ¶10). These frameworks are alleged to recognize "notable features in the scene image," such as "feature points and planes," to build a model of the device's position and motion within the real-world environment (Compl. ¶10). The complaint includes a screenshot of an EQ3 product page showing a QR code that initiates the AR experience on a mobile device (Compl. p. 4). Another screenshot shows the AR feature prompting the user to "Move iPhone to start," indicating an initialization phase where the device scans the environment (Compl. p. 9).
  • The complaint suggests this Web-AR feature is a key part of creating an "immersive customer experience" and unifying the shopping journey between desktop and mobile platforms (Compl. p. 6).
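The complaint's description of ARKit/ARCore recognizing "notable features in the scene image" refers to the general computer-vision idea that high-contrast points (edges, corners) stand out as large local intensity gradients. The following is a hypothetical sketch of that idea on a synthetic image; it is not EQ3's, Apple's, or Google's actual code.

```python
import numpy as np

# Synthetic 8x8 grayscale frame: a bright square (say, a table top)
# on a dark background, so its edges are high optical contrast features.
frame = np.zeros((8, 8))
frame[2:6, 2:6] = 1.0

# Gradient magnitude via finite differences: large where intensity
# changes sharply (edges), zero in flat regions.
gy, gx = np.gradient(frame)
magnitude = np.hypot(gx, gy)

# Keep the strongest responses as candidate feature points.
threshold = 0.25
feature_points = list(zip(*np.nonzero(magnitude > threshold)))

print(f"{len(feature_points)} candidate feature points")
# Every detected point lies on or next to the square's boundary;
# the flat interior and background produce no response.
```

A real AR framework layers far more on top (corner descriptors, tracking across frames, plane fitting), but the starting point is the same: features are wherever the image contrast is high.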

IV. Analysis of Infringement Allegations

’559 Patent Infringement Allegations

Claim Element (from Independent Claim 1) | Alleged Infringing Functionality | Complaint Citation | Patent Citation
A manipulated object cooperating with a first plurality of high optical contrast features disposed in a real three-dimensional environment, said manipulated object comprising: | A mobile device (iPhone or Android phone) that cooperates with high contrast features in the environment, such as edges of a table or QR codes. | ¶10 | col. 9:4-10
a) a photodetector configured to detect said first plurality of high optical contrast features and generate photodetector data... | The camera of the iPhone or Android phone, which detects optical features in the environment. | ¶10 | col. 43:45-48
b) a controller configured to identify a derivative pattern of said first plurality of high optical contrast features from said photodetector data, wherein said derivative pattern is indicative of the position of said photodetector; | The processing unit(s) of the iPhone or Android phone, which allegedly identify a derivative pattern from the camera data to determine the phone's position. | ¶10 | col. 6:31-38
c) at least one component selected from the group consisting of an auxiliary motion detection component... | The auxiliary motion detection components of the iPhone or Android phone, such as an Inertial Measurement Unit (IMU). | ¶10 | col. 36:10-14
  • Identified Points of Contention:
    • Scope Questions: A central question may be the construction of the term "derivative pattern." The analysis will likely focus on whether the processing performed by ARKit or ARCore on environmental feature points constitutes the specific type of "derivative pattern" contemplated by the patent, which the specification links to perspective distortion transformations (’559 Patent, col. 6:35-44).
    • Technical Questions: What evidence does the complaint provide that the accused AR system identifies a pattern that is specifically "indicative of the position of said photodetector," as claimed, versus a more general model of the device's six-degree-of-freedom pose (position and orientation)?

’641 Patent Infringement Allegations

Claim Element (from Independent Claim 1) | Alleged Infringing Functionality | Complaint Citation | Patent Citation
An apparatus for processing absolute pose data derived from an absolute pose of a manipulated object in a real three-dimensional environment, said apparatus comprising: | A mobile device (iPhone or Android phone) that processes absolute pose data. | ¶16 | col. 2:48-52
a) at least one invariant feature in said real three-dimensional environment; | Special markings such as QR codes or other features in the environment. | ¶16 | col. 9:36-40
b) an optical measuring means for optically inferring said absolute pose from on-board said manipulated object using said at least one invariant feature... | The camera of the iPhone or Android phone, which is used to infer the device's absolute pose. | ¶16 | col. 9:31-36
c) a processor for preparing said absolute pose data and identifying a subset of said absolute pose data; | The processor of the smartphone, which prepares pose data and identifies a subset. | ¶16 | col. 9:51-55
d) a communication link for transmitting said subset to an application. | The internal communication link within the smartphone that transmits data to an application. | ¶16 | col. 10:1-5
  • Identified Points of Contention:
    • Scope Questions: The definition of "invariant feature" will be critical. The court may need to decide if this term can read on dynamically identified "feature points and planes" detected by ARCore/ARKit, or if it is limited to pre-defined, static features like the specific markings and light sources described in the patent's embodiments (’641 Patent, col. 9:36-43).
    • Technical Questions: Does the accused system's use of ARKit/ARCore meet the limitation of "optically inferring said absolute pose... using said at least one invariant feature"? The analysis may explore whether the AR frameworks' simultaneous localization and mapping (SLAM) approach constitutes "using" a specific "invariant feature" in the manner required by the claim.

V. Key Claim Terms for Construction

’559 Patent: "derivative pattern"

  • The Term: "derivative pattern"
  • Context and Importance: This term appears in Claim 1(b) and is the core of the controller's claimed function. The case may turn on whether the complex calculations performed by modern AR frameworks on dynamically detected environmental features can be characterized as identifying a "derivative pattern" as understood in the patent.
  • Intrinsic Evidence for Interpretation:
    • Evidence for a Broader Interpretation: The specification states the controller is configured to "identify a derivative pattern of light sources from the photodetector data" and that this pattern is "indicative of the asymmetric and generally linear pattern" of the light sources in the environment (’559 Patent, col. 6:31-35). This could suggest any mathematically derived pattern that correlates to the real-world feature layout.
    • Evidence for a Narrower Interpretation: The specification explains that as the photodetector's pose changes, the real-world pattern "undergoes a well-understood transformation (i.e., perspective distortion...)." The patent states that "Knowledge of this transformation enables one to correlate the... pattern to the derivative pattern" (’559 Patent, col. 6:35-42). This language may support an argument that the term is limited to patterns derived specifically from known geometric transformations like perspective distortion, rather than any arbitrary computational result.
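The perspective-distortion idea underlying "derivative pattern" can be made concrete with a minimal pinhole-projection sketch. This is illustrative only, with a rotation-free camera and invented coordinates, not the patent's disclosed implementation: the same fixed layout of environment features projects to a different 2-D pattern depending on where the camera sits, which is why the observed pattern can be indicative of the photodetector's position.

```python
import numpy as np

def project(points_3d, camera_pos, focal=1.0):
    """Project world points through a pinhole camera at camera_pos,
    looking down the +z axis (no rotation, for simplicity)."""
    rel = points_3d - camera_pos
    return focal * rel[:, :2] / rel[:, 2:3]

# Three fixed, asymmetrically arranged features on a wall at z = 5.
features = np.array([[0.0, 0.0, 5.0],
                     [1.0, 0.0, 5.0],
                     [0.0, 2.0, 5.0]])

pattern_a = project(features, np.array([0.0, 0.0, 0.0]))
pattern_b = project(features, np.array([1.0, 0.0, 2.0]))  # camera moved

print(pattern_a)
print(pattern_b)
# The projected patterns differ even though the world features did not
# move, so a controller that knows the world-space layout can work
# backwards from the observed pattern toward the camera position.
```

The claim-construction dispute sketched above is whether "derivative pattern" is limited to patterns related to the world layout by this kind of well-understood perspective transformation, or reaches any computational result that correlates with the feature layout.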

’641 Patent: "invariant feature"

  • The Term: "invariant feature"
  • Context and Importance: This term from Claim 1(a) defines the external reference points the invention uses. The complaint alleges this term covers QR codes and environmental features like table edges. The viability of the infringement theory depends on whether this term can encompass the ephemeral "feature points" that ARKit and ARCore detect on-the-fly.
  • Intrinsic Evidence for Interpretation:
    • Evidence for a Broader Interpretation: The patent defines invariant features as "high optical contrast features such as edges of objects, special markings, or light sources" (’641 Patent, col. 9:38-40). This open-ended list including "edges of objects" could support a reading that covers naturally occurring, high-contrast points in any environment.
    • Evidence for a Narrower Interpretation: The embodiments and figures consistently depict a system where the invariant features are discrete, known, and often intentionally placed objects like light sources (beacons) or markings in a defined arrangement (’641 Patent, Fig. 4; col. 15:46-54). This could support a narrower construction limited to features whose properties and locations are known to the system beforehand, as opposed to features that are dynamically and transiently identified from a video stream.

VI. Other Allegations

  • Indirect Infringement: The complaint alleges inducement of infringement for both patents. The factual basis is that EQ3 allegedly designed, marketed, and provided instructions for the Accused Products with the specific intent that end-users would operate them in an infringing manner (Compl. ¶¶24-27, 37-40).
  • Willful Infringement: The complaint alleges willful infringement based on knowledge of the patents and the alleged infringement "since at least the date of the filing of this Complaint" (Compl. ¶22, ¶35). This frames the allegation as one of post-suit willfulness.

VII. Analyst’s Conclusion: Key Questions for the Case

  1. A core issue will be one of definitional scope: can claim terms rooted in the patent's context of using known, often pre-arranged environmental markers—such as "invariant feature" (’641) and "derivative pattern" (’559)—be construed to cover the dynamic, on-the-fly environmental mapping and feature point detection performed by modern AR frameworks like ARKit and ARCore?
  2. A key evidentiary question will be one of technical operation: what evidence will demonstrate that the general-purpose localization algorithms of the accused AR platforms perform the specific functions recited in the claims, such as using a discrete "invariant feature" to infer absolute pose or identifying a specific "derivative pattern" that is indicative of the camera's position?