2:23-cv-00567
Electronic Scripting Products Inc v. Viar Inc
I. Executive Summary and Procedural Information
- Parties & Counsel:
- Plaintiff: Electronic Scripting Products, Inc. (Delaware)
- Defendant: VIAR Inc. (Washington)
- Plaintiff’s Counsel: Banie & Ishimoto LLP
- Case Identification: 2:23-cv-00567, W.D. Wash., 04/13/2023
- Venue Allegations: Venue is alleged to be proper in the Western District of Washington because Defendant VIAR Inc. maintains a regular and established place of business in the district.
- Core Dispute: Plaintiff alleges that Defendant’s Viar360 virtual reality (VR) training platform infringes three patents related to determining the absolute position and orientation (pose) of manipulated objects in a three-dimensional environment.
- Technical Context: The technology at issue involves methods of computer vision and motion tracking that enable devices like smartphones and VR headsets to understand their location and orientation in real space, a foundational capability for immersive training, gaming, and augmented reality applications.
- Key Procedural History: The complaint does not mention any prior litigation, licensing history, or post-grant proceedings. The asserted patents derive from a long and interconnected prosecution history, with the ’559 and ’641 patents sharing a 2006 priority date and the ’540 patent claiming priority to a 2004 application.
Case Timeline
| Date | Event |
|---|---|
| 2004-01-30 | U.S. Patent No. 9,229,540 Priority Date |
| 2006-03-08 | U.S. Patent No. 10,191,559 Priority Date |
| 2006-03-08 | U.S. Patent No. 7,826,641 Priority Date |
| 2010-11-02 | U.S. Patent No. 7,826,641 Issued |
| 2016-01-05 | U.S. Patent No. 9,229,540 Issued |
| 2019-01-29 | U.S. Patent No. 10,191,559 Issued |
| 2023-04-13 | Complaint Filed |
II. Technology and Patent(s)-in-Suit Analysis
U.S. Patent No. 10,191,559 - "Computer Interface For Manipulated Objects With An Absolute Pose Detection Component"
- Patent Identification: U.S. Patent No. 10,191,559, "Computer Interface For Manipulated Objects With An Absolute Pose Detection Component," issued January 29, 2019 (the "’559 Patent"). (Compl. ¶7).
The Invention Explained
- Problem Addressed: The patent’s background section describes the need for a "rapid, low-cost method and apparatus for one-to-one motion mapping between real space and cyberspace" for hand-held objects, noting that prior art methods often suffer from drift, position error, and high computational expense. (’559 Patent, col. 1:32-37, 2:8-17).
- The Patented Solution: The invention is a manipulated object, such as a smartphone, containing an on-board photodetector (e.g., a camera) that detects high optical contrast features in the surrounding environment. A controller then analyzes changes in the pattern of these features (a "derivative pattern") to determine the object's position and orientation. This optical data can be supplemented by information from auxiliary sensors, such as an inertial measurement unit (IMU). (’559 Patent, Abstract; col. 6:3-17). A conceptual sketch of this processing style appears after this list.
- Technical Importance: This "inside-out" tracking approach allows a device to determine its own pose without relying on external cameras or tracking systems, a key technological enabler for mass-market mobile augmented and virtual reality. (’559 Patent, col. 1:18-24).
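To ground the claim language, the following is a minimal, illustrative sketch of this style of processing using OpenCV: detect high-contrast features, track them across frames, and treat the resulting displacement pattern as position-indicative data that an auxiliary IMU could supplement. It is a sketch of the general technique only, not the patent's disclosed embodiment and not the accused products' actual code.

```python
import cv2
import numpy as np

def track_feature_pattern(prev_gray, curr_gray):
    """Detect high-contrast features in one frame, track them into the next,
    and return per-feature displacements -- a stand-in for a pattern
    "indicative of the position" of the detector, per Claim 1(b)."""
    # Shi-Tomasi corners stand in for the "high optical contrast features".
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=8)
    if pts is None:
        return None
    # Lucas-Kanade optical flow tracks each feature across video frames.
    nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    good = status.reshape(-1) == 1
    displacements = (nxt[good] - pts[good]).reshape(-1, 2)
    # Readings from the claimed "auxiliary motion detection component"
    # (e.g., an IMU) could be fused here to reject outlier tracks or to
    # bridge frames with too few visible features.
    return displacements
```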
Key Claims at a Glance
- The complaint asserts independent Claim 1. (Compl. ¶9).
- The essential elements of Claim 1 are:
- A manipulated object that cooperates with a plurality of high optical contrast features in a real three-dimensional environment.
- A photodetector configured to detect the features and generate data representative of their positions.
- A controller configured to identify a "derivative pattern" of the features from the photodetector data, with the derivative pattern being indicative of the photodetector's position.
- At least one component selected from the group consisting of an auxiliary motion detection component, an active illumination component, and a scanning component.
- The complaint also asserts infringement of dependent Claims 6, 7, 10, 15, 16, 19, 24, and 25 and reserves the right to assert others. (Compl. ¶¶11, 26).
U.S. Patent No. 7,826,641 - "Apparatus And Method For Determining An Absolute Pose Of A Manipulated Object In A Real Three-Dimensional Environment With Invariant Features"
- Patent Identification: U.S. Patent No. 7,826,641, "Apparatus And Method For Determining An Absolute Pose Of A Manipulated Object In A Real Three-Dimensional Environment With Invariant Features," issued November 2, 2010 (the "’641 Patent"). (Compl. ¶12).
The Invention Explained
- Problem Addressed: The patent addresses the problem that many contemporary interface devices for cyberspace are limited to relative motion tracking, which is insufficient for "one-to-one motion mapping between space and cyberspace" and suffers from drift and accumulating position error. (’641 Patent, col. 1:43-67).
- The Patented Solution: The invention is an apparatus where an "optical measuring means" (e.g., a camera) is placed on-board the manipulated object. This sensor optically infers the object's absolute pose (position and orientation) by observing at least one "invariant feature" in the environment. A processor then prepares this pose data and sends a subset of it via a communication link to a software application. (’641 Patent, Abstract; col. 5:10-24). A minimal code sketch of this pipeline follows this list.
- Technical Importance: The patent describes a self-contained system for absolute pose determination using fixed environmental features, a foundational architecture for modern untethered VR and AR devices that track their position relative to the real world. (’641 Patent, col. 1:32-38).
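For concreteness, a minimal sketch of the claimed pipeline might look like the following: infer absolute pose (φ, θ, ψ, x, y, z) from the imaged positions of known invariant features, then identify and transmit a subset of that pose data. The marker coordinates, the ZYX Euler convention, and the `link` object are illustrative assumptions; the patent's own embodiments and Euler convention may differ.

```python
import cv2
import numpy as np

# Known 3D positions (meters) of an invariant feature set -- e.g., the
# corners of a predefined marker fixed in the environment. Hypothetical values.
FEATURE_POINTS_3D = np.array([[0, 0, 0], [0.1, 0, 0],
                              [0.1, 0.1, 0], [0, 0.1, 0]], dtype=np.float32)

def absolute_pose(image_points_2d, camera_matrix, dist_coeffs):
    """Infer absolute pose (phi, theta, psi, x, y, z) from the imaged
    positions of the invariant features, in the claim's pose-data format."""
    ok, rvec, tvec = cv2.solvePnP(FEATURE_POINTS_3D, image_points_2d,
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    # ZYX Tait-Bryan angles; the patent's own Euler convention may differ.
    phi = np.arctan2(R[2, 1], R[2, 2])   # roll
    theta = -np.arcsin(R[2, 0])          # pitch
    psi = np.arctan2(R[1, 0], R[0, 0])   # yaw
    x, y, z = tvec.reshape(3)
    return {"phi": phi, "theta": theta, "psi": psi, "x": x, "y": y, "z": z}

def send_pose_subset(pose, link):
    """The claimed processor identifies a subset of the pose data, and the
    communication link transmits it to an application (link is hypothetical)."""
    subset = {k: pose[k] for k in ("x", "y", "psi")}  # e.g., planar position + yaw
    link.send(subset)
```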
Key Claims at a Glance
- The complaint asserts independent Claim 1. (Compl. ¶14).
- The essential elements of Claim 1 are:
- An apparatus for processing absolute pose data from a manipulated object.
- At least one "invariant feature" in the real three-dimensional environment.
- An on-board "optical measuring means" for optically inferring the absolute pose using the invariant feature and expressing it with specified pose data (Euler angles and coordinates).
- A processor for preparing the absolute pose data and identifying a subset of it.
- A communication link for transmitting the subset to an application.
- The complaint also asserts infringement of dependent Claim 29 and reserves the right to assert others. (Compl. ¶¶17, 39).
U.S. Patent No. 9,229,540 - "Deriving Input From Six Degrees Of Freedom Interfaces"
- Patent Identification: U.S. Patent No. 9,229,540, "Deriving Input From Six Degrees Of Freedom Interfaces," issued January 5, 2016 (the "’540 Patent"). (Compl. ¶18).
Technology Synopsis
The patent describes an interface for deriving input from the absolute pose of an item, such as a VR headset, in a 3D environment. It addresses the technical challenge of establishing a stable reference frame for tracking. The solution involves an on-board unit (e.g., cameras) that receives "non-collinear optical inputs" from stationary objects, with processing electronics using a "computer vision algorithm using a homography" to recover the item's full six-degree-of-freedom pose from this visual data. (’540 Patent, Abstract; col. 2:1-25).
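As an illustration of what a "computer vision algorithm using a homography" can look like in practice, the sketch below estimates a homography from corresponding image points and decomposes it into candidate rotations and translations. It uses OpenCV for readability rather than the Qualcomm FastCV library the complaint references, and it asserts nothing about how the accused headsets actually compute pose.

```python
import cv2
import numpy as np

def pose_from_homography(ref_pts, obs_pts, K):
    """Recover candidate rotations/translations from a planar homography.
    ref_pts/obs_pts: Nx2 float32 arrays of corresponding image points on a
    stationary planar object seen from two views; K: 3x3 intrinsic matrix."""
    H, mask = cv2.findHomography(ref_pts, obs_pts, cv2.RANSAC, 3.0)
    if H is None:
        return None
    # Decompose H into up to four (R, t, n) solutions; disambiguating among
    # them (e.g., via visibility constraints) is omitted from this sketch.
    num, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
    return rotations, translations
```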
Asserted Claims
The complaint asserts independent Claim 1 and dependent Claims 2, 11-19, 25, 32, 33, 34, 36, 37, 39, 40, and 44-49. (Compl. ¶¶20, 23, 52).
Accused Features
The complaint accuses VR/XR headsets that use the Viar360 platform of infringement. Specifically, it alleges that the dual front-facing cameras on such headsets receive non-collinear optical inputs from the environment, and an on-board processor (e.g., a Qualcomm Snapdragon) employs a computer vision algorithm based on a homography (specifically referencing FASTCV_API) to perform 6DoF motion tracking. (Compl. ¶¶14-16, 22).
III. The Accused Instrumentality
Product Identification
The accused products are the "Viar360 platform and associated software and products," which the complaint defines as the "Accused Products." (Compl. ¶10).
Functionality and Market Context
The Viar360 platform is described as an authoring and publishing tool for creating immersive VR training scenarios for various industries. (Compl. ¶¶10, 16). The complaint alleges that when used on mobile devices, the platform leverages built-in augmented reality frameworks. On an iPhone, it allegedly uses Apple ARKit to "recognize[] notable features in the scene image, track[] differences in the positions of those features across video frames, and compare[] that information with motion sensing data." (Compl. ¶10). On an Android phone, it allegedly uses Google ARCore to detect "feature points and planes." (Compl. ¶10). For VR headsets, the complaint alleges the platform supports devices using Qualcomm's Snapdragon reference design, which employ dual front-facing cameras for 6DoF motion tracking. (Compl. ¶22). A screenshot in the complaint shows the Viar360 user interface for adding interactive virtual content to a scene. (Compl. ¶10, Figure 2).
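The quoted ARKit behavior (tracking feature positions across video frames and comparing the result with motion-sensing data) is the classic visual-inertial fusion pattern. Below is a deliberately simplified complementary-filter sketch of that comparison step; the accused frameworks' actual estimators are not public here and are almost certainly more sophisticated, so every name and constant is an assumption.

```python
import numpy as np

def fuse_visual_inertial(visual_pos, imu_accel, prev_pos, prev_vel, dt,
                         alpha=0.98):
    """Complementary-filter sketch of comparing camera-derived position with
    motion-sensing data, in the spirit of the visual-inertial odometry the
    complaint attributes to ARKit/ARCore. Illustrative only."""
    # Dead-reckon from the inertial data (double integration drifts, which
    # is why the optical estimate is needed as an absolute correction).
    vel = prev_vel + imu_accel * dt
    inertial_pos = prev_pos + vel * dt
    # Blend: trust the smooth high-rate inertial prediction, while pulling
    # the estimate toward the drift-free visual fix.
    fused = alpha * inertial_pos + (1 - alpha) * visual_pos
    return fused, vel
```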
IV. Analysis of Infringement Allegations
’559 Patent Infringement Allegations
| Claim Element (from Independent Claim 1) | Alleged Infringing Functionality | Complaint Citation | Patent Citation |
|---|---|---|---|
| a) a photodetector configured to detect said first plurality of high optical contrast features and generate photodetector data representative of the positions of said first plurality... | The camera of an iPhone or Android phone, which uses ARKit or ARCore to detect "notable features" or "feature points" in the environment. | ¶10 | col. 6:3-6 |
| b) a controller configured to identify a derivative pattern of said first plurality of high optical contrast features from said photodetector data, wherein said derivative pattern is indicative of the position of said photodetector | The processing unit of an iPhone or Android phone, which allegedly identifies a derivative pattern by tracking differences in the positions of features across video frames. | ¶10 | col. 6:7-12 |
| c) at least one component selected from the group consisting of an auxiliary motion detection component, an active illumination component and a scanning component. | An auxiliary motion detection component such as the phone's inertial measurement unit (IMU) or motion sensing unit. | ¶10 | col. 6:13-17 |
’641 Patent Infringement Allegations
| Claim Element (from Independent Claim 1) | Alleged Infringing Functionality | Complaint Citation | Patent Citation |
|---|---|---|---|
| a) at least one invariant feature in said real three-dimensional environment | Environmental features such as "special markings" or other objects with high optical contrast visible in the real world. | ¶16 | col. 4:22-23 |
| b) an optical measuring means for optically inferring said absolute pose from on-board said manipulated object using said at least one invariant feature and expressing said inferred absolute pose with absolute pose data (φ, θ, ψ, x, y, z)... | The camera of an iPhone or Android phone, which infers the device's absolute pose from the invariant feature and expresses it using rotation angles and coordinates. | ¶16 | col. 4:24-32 |
| c) a processor for preparing said absolute pose data and identifying a subset of said absolute pose data | The processing unit of an iOS or Android device, which prepares the absolute pose data. | ¶16 | col. 4:33-35 |
| d) a communication link for transmitting said subset to an application | An internal communication link within the mobile device for transmitting the pose data to the Viar360 virtual training application. | ¶16 | col. 4:36-37 |
Identified Points of Contention
- Scope Questions: A primary question may be whether the patented terms "derivative pattern" (’559 Patent) and "invariant feature" (’641 Patent) can be construed to cover the on-the-fly detection and tracking of arbitrary, unmarked features in a general environment by modern software libraries like ARKit and ARCore. The patents' specifications frequently describe these concepts in the context of known, predefined, or structured patterns of features, raising the question of whether the claims are limited to such contexts.
- Technical Questions: For the ’540 patent, the complaint provides a screenshot of API documentation to support its allegation that the accused products use a "homography," a specific requirement of Claim 1. (Compl. ¶22). A technical question will be what evidentiary support exists to demonstrate that the accused VR headsets actually execute this specific computer vision algorithm as part of their 6DoF tracking.
V. Key Claim Terms for Construction
’559 Patent
- The Term: "derivative pattern"
- Context and Importance: This term is the core of the inventive step in Claim 1(b), defining how the controller processes visual data to determine position. The plaintiff's infringement theory equates the tracking of feature positions across video frames with identifying a "derivative pattern." The construction of this term may be dispositive of infringement for the ’559 Patent.
- Intrinsic Evidence for Interpretation:
- Evidence for a Broader Interpretation: The claim language requires only that the derivative pattern be "indicative of the position of said photodetector." (’559 Patent, col. 28:1-3). This could support an interpretation where any derived data that changes predictably with the detector's position meets the limitation.
- Evidence for a Narrower Interpretation: The summary of the invention explains that the "asymmetric and generally linear pattern undergoes a well-understood transformation (i.e., perspective distortion...)," which "enables one to correlate the asymmetric and generally linear pattern to the derivative pattern." (’559 Patent, col. 6:35-42). This language suggests the "derivative pattern" is the result of a transformation applied to a known, predefined source pattern, not just a pattern of change between arbitrary features.
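To make the narrower reading concrete: on that interpretation, the "derivative pattern" is the perspective-distorted image of a known, predefined source pattern, as in the illustrative snippet below (the pattern coordinates and homography values are hypothetical).

```python
import cv2
import numpy as np

# A known, predefined source pattern (the spec's "asymmetric and generally
# linear pattern") -- hypothetical planar coordinates.
source_pattern = np.array([[[0.0, 0.0], [0.3, 0.0], [0.7, 0.0],
                            [1.0, 0.0], [1.0, 0.12]]], dtype=np.float32)

# A homography standing in for the camera's perspective distortion at one pose.
H = np.array([[0.90, 0.05, 20.0],
              [-0.04, 1.10, 35.0],
              [1e-4, 2e-4, 1.0]])

# Under the narrower reading, the "derivative pattern" is this transformed
# image of the known pattern; correlating the two reveals the camera's pose.
derivative_pattern = cv2.perspectiveTransform(source_pattern, H)
```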
’641 Patent
- The Term: "invariant feature"
- Context and Importance: The infringement allegation relies on naturally occurring objects in the environment, such as "furniture and items," qualifying as an "invariant feature." (Compl. ¶22). Practitioners may focus on this term because its construction will determine whether the patent applies to modern "simultaneous localization and mapping" (SLAM) systems that use arbitrary environmental features, or if it is limited to systems using predefined markers or beacons.
- Intrinsic Evidence for Interpretation:
- Evidence for a Broader Interpretation: The detailed description states that "Invariant features... are high optical contrast features such as edges of objects, special markings, or light sources." (’641 Patent, col. 5:41-44). The use of "such as" suggests this list is exemplary, not exhaustive, and could be argued to include any high-contrast edge found in an environment.
- Evidence for a Narrower Interpretation: The embodiments depicted in the patent's figures consistently show distinct, predefined features, such as a cross marking (34) and a light source (36), rather than relying on the ambient environment. (’641 Patent, FIG. 1). This could support an argument that the invention is directed to systems that use specific, known features for tracking. A marketing screenshot included in the complaint shows a VR headset used for "interactive virtual reality based on 360 videos and photos," which might be argued to provide the fixed, invariant features required. (Compl. ¶22).
VI. Other Allegations
- Indirect Infringement: The complaint alleges induced infringement for all three patents, asserting that VIAR knowingly induced its end-users to infringe by providing the Accused Products along with "specific instructions or training regarding the use of those products." (Compl. ¶¶31-33, 44-46, 57-59).
- Willful Infringement: Willfulness is alleged for all three patents. The basis for VIAR's knowledge is alleged to exist "since at least the date of the filing of this Complaint," suggesting a theory of post-suit willfulness. (Compl. ¶¶27, 40, 53).
VII. Analyst’s Conclusion: Key Questions for the Case
- A core issue will be one of definitional scope: can terms like "derivative pattern" and "invariant feature," which the patent specifications often describe in the context of structured or predefined environmental markers, be construed broadly enough to read on the functionality of modern AR/VR systems that perform tracking by identifying and mapping arbitrary, naturally occurring features in an environment?
- A key evidentiary question will be one of functional equivalence: does the accused Viar360 platform, by relying on general-purpose software libraries like ARKit and ARCore, actually perform the specific computer vision algorithms and data processing steps required by the claims (such as using a "homography" for pose recovery as claimed in the ’540 patent), or is there a fundamental mismatch in the technical operation?