DCT
4:24-cv-06567
Samsung Electronics Co Ltd v. CM HK Ltd
I. Executive Summary and Procedural Information
- Parties & Counsel:
  - Plaintiff: Samsung Electronics Co., Ltd. (Republic of Korea) and Samsung Electronics America, Inc. (New York)
  - Defendant: CM HK, Ltd. (Hong Kong) and CyWee Group Ltd. (British Virgin Islands)
  - Plaintiff’s Counsel: Paul Hastings LLP
 
- Case Identification: 4:24-cv-06567, N.D. Cal., 09/18/2024
- Venue Allegations: Plaintiff alleges venue is proper because both defendants are foreign entities and are subject to personal jurisdiction in the district. Plaintiff further alleges that Defendant CyWee has an office in Santa Clara, California, and has directed patent enforcement activities, including prior litigation, within the district.
- Core Dispute: Plaintiff seeks a declaratory judgment that its mobile devices do not infringe four patents owned by Defendants related to motion recognition technology using sensor fusion.
- Technical Context: The technology involves using data from multiple sensors in a mobile device (like accelerometers and gyroscopes) to accurately recognize a user's motion or gestures in three-dimensional space, independent of the device's orientation.
- Key Procedural History: The complaint details a history of litigation and licensing negotiations between the parties. CyWee previously sued Samsung in 2017 in the Eastern District of Texas on related patents, which were subsequently invalidated in inter partes review (IPR) proceedings. Following the IPRs, Defendants allegedly threatened a new lawsuit on the patents-in-suit, which issued after the IPRs, and made a damages demand in excess of $500 million.
Case Timeline
| Date | Event | 
|---|---|
| 2009-07-14 | Earliest Priority Date for all Patents-in-Suit | 
| 2017-02-01 | CyWee files EDTX lawsuit against Samsung on related patents (approximate date) | 
| 2018-06-01 | Google files IPR petitions on related patents (approximate date) | 
| 2018-10-01 | ZTE files IPR petition on a related patent (approximate date) | 
| 2018-12-01 | Google's IPR petitions are instituted (approximate date) | 
| 2019-04-30 | U.S. Patent No. 10,275,038 (’038 Patent) Issued | 
| 2020-10-27 | U.S. Patent No. 10,817,072 (’072 Patent) Issued | 
| 2020-12-01 | U.S. Patent No. 10,852,846 (’846 Patent) Issued | 
| 2023-07-11 | U.S. Patent No. 11,698,687 (’687 Patent) Issued | 
| 2024-04-04 | Federal Circuit affirms IPR decisions invalidating claims of related patents | 
| 2024-07-31 | Samsung files motion to dismiss EDTX action | 
| 2024-08-14 | EDTX action is dismissed | 
| 2024-09-18 | Complaint for Declaratory Judgment filed | 
II. Technology and Patent(s)-in-Suit Analysis
U.S. Patent No. 10,817,072 - Method and Apparatus for Performing Motion Recognition Using Motion Sensor Fusion, and Associated Computer Program Product, Issued Oct. 27, 2020
The Invention Explained
- Problem Addressed: The patent addresses the problem that motion recognition in portable electronic devices using inertial sensors (like accelerometers or "G-sensors") is often unreliable because the sensor data changes based on the device's tilt and orientation, even when the user performs the exact same motion (U.S. Pat. No. 10,817,072, col. 1:53-2:18). For example, a horizontal swing motion will produce different acceleration data depending on whether the device is held flat or upright (U.S. Pat. No. 10,817,072, col. 9:1-24).
- The Patented Solution: The invention solves this by using "sensor fusion" to first determine the device's orientation (its roll, pitch, and yaw angles) relative to a fixed "global coordinate system" associated with the user, and then converting the motion data from the device's own moving coordinate system into that fixed global system; a simplified sketch of this frame conversion follows this list. The conversion allows the device to recognize the user's true motion in 3D space consistently, regardless of how the device is held. (’072 Patent, Abstract; col. 2:48-61).
- Technical Importance: This approach allows for more robust and intuitive gesture-based controls, as users are not required to hold the device in a specific, rigid orientation to have their motions recognized correctly. (’072 Patent, col. 3:25-34).
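For illustration only, the following minimal sketch shows the kind of frame conversion the specification describes; it is not drawn from the patent or the complaint, and the Euler-angle convention, function names, and sample values are assumptions. An acceleration sample measured in the device's own coordinate system is rotated into a fixed global frame using the roll, pitch, and yaw angles produced by sensor fusion.

```python
import numpy as np

def device_to_global(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Build a device-to-global rotation matrix from Euler angles (radians).

    A Z-Y-X (yaw-pitch-roll) convention is assumed purely for illustration;
    the patent is not being characterized as requiring any one convention.
    """
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])  # yaw about Z
    ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])  # pitch about Y
    rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])  # roll about X
    return rz @ ry @ rx

def to_global_frame(accel_device: np.ndarray, roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Express a device-frame acceleration sample in the fixed global frame."""
    return device_to_global(roll, pitch, yaw) @ accel_device

# With the device rolled 90 degrees, acceleration registering on the device's
# Y axis actually corresponds to motion along the global Z axis; the
# conversion recovers that global direction.
sample = np.array([0.0, 1.0, 0.0])  # device-frame acceleration (arbitrary units)
print(to_global_frame(sample, np.pi / 2, 0.0, 0.0))  # ~[0, 0, 1]
```

The design point tracks the specification's observation that recognition becomes orientation-independent once the motion data are expressed in the global frame rather than in the device's moving frame.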
Key Claims at a Glance
- The complaint asserts non-infringement of independent claims 1 and 10. (Compl. ¶71).
- Independent Claim 1 requires a method comprising the steps of:
  - obtaining sensor data from motion sensors;
  - performing sensor fusion to obtain an orientation comprising resultant angles (roll, pitch, yaw) in a global coordinate system;
  - mapping the resultant angles onto a selected predetermined plane to obtain a trajectory; and
  - performing motion recognition based on the trajectory to recognize user motion in 3D space, including at least one character drawn by the user.
 
- Because this is a declaratory judgment action, the complaint does not address the dependent claims individually; the prayer for relief, however, seeks a declaration of non-infringement of "any claims" of the patent. (Compl. p. 14).
U.S. Patent No. 10,275,038 - Method and Apparatus for Performing Motion Recognition Using Motion Sensor Fusion, and Associated Computer Program Product, Issued Apr. 30, 2019
The Invention Explained
- Problem Addressed: This patent addresses the same core problem as the ’072 Patent: conventional motion recognition methods fail because they cannot reliably distinguish between acceleration due to user motion and acceleration due to gravity when the device is tilted or overturned. (U.S. Pat. No. 10,275,038, col. 1:53-2:18).
- The Patented Solution: The claimed method performs 3D character recognition through a specific sequence. Sensor fusion yields both the device's motion data and its orientation. Based on the orientation, the method selects a 2D plane (e.g., a vertical plane in front of the user) and "maps" the 3D motion data onto that plane to create a 2D trajectory; a simplified projection sketch follows this list. Character recognition is then performed on the 2D trajectory, allowing a user to "write" characters in the air. (’038 Patent, Abstract; col. 7:35-53).
- Technical Importance: This method enables a form of "air writing" or gesture input that is more complex than simple directional swipes, by creating a stable virtual surface on which users can draw characters. (’038 Patent, col. 18:45-51; Fig. 33).
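As a purely illustrative sketch under assumed details (the plane choice, function names, and sample data are hypothetical and not taken from the ’038 Patent), mapping a 3D trajectory onto a predetermined plane can be expressed as projecting each 3D point onto two unit vectors that span the plane, yielding a 2D trace that a conventional character recognizer could consume.

```python
import numpy as np

def map_to_plane(points_3d: np.ndarray, u: np.ndarray, v: np.ndarray) -> np.ndarray:
    """Project (N, 3) trajectory points onto the plane spanned by unit vectors u and v.

    Returns an (N, 2) array of in-plane coordinates, i.e., a 2D trajectory.
    """
    basis = np.stack([u, v], axis=1)  # shape (3, 2): columns are the plane's basis vectors
    return points_3d @ basis          # dot each 3D point with each basis vector

# Hypothetical "vertical plane in front of the user": spanned by the global
# X axis (left-right) and Z axis (up-down).
u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 0.0, 1.0])

# Made-up "air writing" samples expressed in the global frame.
trajectory_3d = np.array([[0.00, 0.10, 0.05],
                          [0.20, 0.12, 0.30],
                          [0.40, 0.15, 0.10]])

trajectory_2d = map_to_plane(trajectory_3d, u, v)  # 2D trace of the drawn stroke
print(trajectory_2d)
```

Under the claim language, which plane is selected would depend on the orientation obtained from sensor fusion; a different orientation would simply swap in a different pair of basis vectors.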
Key Claims at a Glance
- The complaint asserts non-infringement of independent claims 1 and 13. (Compl. ¶77).
- Independent Claim 1 requires a method comprising the steps of:
  - obtaining sensor data from inertial motion sensors;
  - performing sensor fusion to obtain motion data and orientation based on a global coordinate system;
  - selecting one of at least one predetermined plane based on the orientation;
  - mapping the motion data onto the selected plane to obtain a trajectory; and
  - performing motion recognition on the trajectory to recognize the user's motion in 3D space and at least one character drawn by the user.
 
- The prayer for relief seeks a declaration of non-infringement of "any claims" of the patent. (Compl. p. 14).
Multi-Patent Capsule: U.S. Patent No. 10,852,846
- Patent Identification: U.S. Patent No. 10,852,846, Electronic Device for use in Motion Detection and Method for Obtaining Resultant Deviation Thereof, Issued Dec. 1, 2020.
- Technology Synopsis: This patent describes a method for more accurately calculating a device's orientation (yaw, pitch, roll) using a nine-axis motion sensor module (rotation sensor, accelerometer, and magnetometer). It uses a comparison model in which "predicted" axial accelerations derived from rotation sensor data are compared with "measured" axial accelerations from the accelerometer to update and correct the device's orientation, represented as a quaternion; a simplified sketch of this predict-and-compare pattern follows this capsule. (’846 Patent, Abstract; col. 7:22-45).
- Asserted Claims: Independent claims 1 and 7. (Compl. ¶83).
- Accused Features: The complaint alleges that Samsung's devices do not infringe because they do not perform the claimed steps of obtaining a quaternion by predicting axial accelerations, comparing them with measured accelerations, and using predicted accelerations converted from measured angular velocities. (Compl. ¶82).
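The synopsis describes a predict-and-compare structure that resembles, at a high level, a complementary-filter-style gravity correction. The sketch below illustrates only that general pattern under stated assumptions; it is not the claimed algorithm, and the quaternion convention, gain, and names are hypothetical. Gravity "predicted" from the current quaternion is compared with the accelerometer's measurement, and the discrepancy nudges the orientation estimate.

```python
import numpy as np

def quat_multiply(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Hamilton product of quaternions given as [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_rotate(q: np.ndarray, v: np.ndarray) -> np.ndarray:
    """Rotate vector v by unit quaternion q = [w, x, y, z]."""
    w, x, y, z = q
    r = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    return r @ v

def correct_orientation(q: np.ndarray, accel_measured: np.ndarray, gain: float = 0.1) -> np.ndarray:
    """One simplified correction step (illustrative only).

    q is assumed to rotate global-frame vectors into the device frame. Gravity
    "predicted" from q is compared with the (gravity-dominated) measured
    acceleration, and a fraction of the rotation needed to reconcile them is
    applied to q.
    """
    gravity_global = np.array([0.0, 0.0, 1.0])
    predicted = quat_rotate(q, gravity_global)                   # predicted device-frame gravity
    measured = accel_measured / np.linalg.norm(accel_measured)   # measured gravity direction
    error = np.cross(predicted, measured)                        # small-angle rotation from predicted to measured
    dq = np.concatenate(([1.0], 0.5 * gain * error))             # small corrective quaternion
    q_new = quat_multiply(dq, q)
    return q_new / np.linalg.norm(q_new)

# Starting from an identity orientation, a tilted accelerometer reading pulls
# the quaternion estimate toward the measured gravity direction.
q = np.array([1.0, 0.0, 0.0, 0.0])
print(correct_orientation(q, np.array([0.0, 0.3, 9.5])))
```

The claims recite a more specific construction than this sketch (predicted axial accelerations converted from measured angular velocities and compared with measured axial accelerations), and it is that specific structure the non-infringement allegations target. (Compl. ¶82).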
Multi-Patent Capsule: U.S. Patent No. 11,698,687
- Patent Identification: U.S. Patent No. 11,698,687, Electronic Device for use in Motion Detection and Method for Obtaining Resultant Deviation Thereof, Issued Jul. 11, 2023.
- Technology Synopsis: This patent is related to the ’846 Patent and discloses a similar method for calculating device orientation. The method involves obtaining quaternions, comparing predicted and measured axial accelerations, and updating the device's orientation state. (’687 Patent, Abstract).
- Asserted Claims: Independent claims 1, 14, and 27. (Compl. ¶89).
- Accused Features: The complaint alleges non-infringement for the same reasons as the ’846 patent: the accused devices do not obtain a quaternion by predicting and comparing axial accelerations as recited in the claims. (Compl. ¶88).
III. The Accused Instrumentality
- Product Identification: Samsung's mobile devices. (Compl. ¶13).
- Functionality and Market Context:
  - The complaint identifies the relevant functionality as features that use "one or more sensors that are capable of determining a mobile device's position and movement." (Compl. ¶59).
  - A specific function identified is character recognition. (Compl. ¶¶70, 76). The complaint alleges that to the extent Samsung's devices perform such character recognition, the operation occurs in "two-dimensional space," not the three-dimensional space required by the claims. (Compl. ¶¶70, 76).
  - No probative visual evidence provided in complaint.
 
IV. Analysis of Infringement Allegations
The complaint, seeking a declaratory judgment of non-infringement, does not provide a traditional infringement claim chart. Instead, it identifies specific claim limitations that Samsung's mobile devices allegedly do not practice. The following tables summarize these non-infringement contentions.
’072 Patent Non-Infringement Contentions
| Claim Element (from Independent Claim 1) | Alleged Non-Infringing Functionality | Complaint Citation | Patent Citation | 
|---|---|---|---|
| mapping the resultant angles onto a plane, selected based on an orientation derived from sensor fusion, to obtain a trajectory | The complaint alleges Samsung's mobile devices perform character recognition without the need for angle mapping with sensor fusion data. | ¶¶70-71 | col. 21:40-44 | 
| recognizing user motion in three-dimensional space for character recognition | The complaint alleges that to the extent the accused devices perform character recognition, the operation is performed in two-dimensional space, not three-dimensional space. | ¶70 | col. 21:45-48 | 
’038 Patent Non-Infringement Contentions
| Claim Element (from Independent Claim 1) | Alleged Non-Infringing Functionality | Complaint Citation | Patent Citation | 
|---|---|---|---|
| performing motion recognition based on the trajectory on the selected one of the at least one predetermined plane ... to recognize the user's motion in 3D space and at least one character drawn by the user in the 3D space | The complaint alleges that the accused devices perform character recognition in two-dimensional space, not three-dimensional space as required by the claim. | ¶76 | col. 22:43-48 | 
- Identified Points of Contention:
  - Scope Questions: A central dispute may be the construction of "in three-dimensional space." The question for the court will be whether this phrase requires the character itself to be defined by three spatial coordinates (as if drawn in the air), or whether it can be read more broadly to cover a 2D character recognition process that is merely initiated or controlled by a 3D gesture.
  - Technical Questions: The complaint raises the factual question of how Samsung's character recognition feature operates. Discovery will likely focus on whether the underlying software algorithms for character recognition in the accused devices process trajectory data defined by two coordinates (e.g., X and Y on a screen) or three coordinates (e.g., X, Y, and Z in physical space), a distinction illustrated in the hypothetical sketch below.
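To make the disputed distinction concrete, the following hypothetical sketch (not based on any actual Samsung implementation or on evidence in the complaint) contrasts the shape of the trajectory data a recognizer would consume in the two scenarios: touchscreen handwriting arrives as (x, y) pairs, whereas inertial "air writing" would carry a third spatial coordinate.

```python
import numpy as np

# Hypothetical touchscreen handwriting input: each sample is an (x, y) screen coordinate.
touch_trajectory = np.array([[120.0, 300.0],
                             [125.0, 310.0],
                             [133.0, 322.0]])

# Hypothetical inertial "air writing" input: each sample is an (x, y, z) position in space.
inertial_trajectory = np.array([[0.00, 0.10, 0.05],
                                [0.02, 0.12, 0.09],
                                [0.05, 0.15, 0.14]])

def coords_per_sample(trajectory: np.ndarray) -> int:
    """Number of spatial coordinates carried by each trajectory sample."""
    return trajectory.shape[1]

print(coords_per_sample(touch_trajectory))     # 2: the operation the complaint attributes to the accused devices
print(coords_per_sample(inertial_trajectory))  # 3: the 3D-space recognition the claims recite
```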
 
V. Key Claim Terms for Construction
The complaint does not provide sufficient detail for analysis of most claim terms. However, based on the non-infringement allegations, one term is central.
- The Term: "recognizing user motion in three-dimensional space for character recognition" (from ’072 Patent, claim 1; ’038 Patent, claim 1)
- Context and Importance: This term is critical because Samsung's primary non-infringement defense is that its devices perform character recognition in two-dimensional space. (Compl. ¶¶70, 76). The outcome of the case may hinge on whether the accused functionality, which may involve a 3D gesture to initiate a 2D input, falls within the scope of this limitation.
- Intrinsic Evidence for Interpretation:
  - Evidence for a Broader Interpretation: The specification describes recognizing "user-defined gestures" and creating a "user-defined motion database," which could suggest that "character recognition" is just one example of a broader category of 3D motion recognition. (’038 Patent, col. 8:1-17). An argument could be made that any recognition of a user's 3D motion that results in a character input meets the claim.
  - Evidence for a Narrower Interpretation: The specification and figures repeatedly describe and illustrate the concept of "hand-writing in the 3D space," showing trajectories of letters like "d" and "m" being drawn in the air. (’038 Patent, col. 18:45-51; Fig. 33). This may support a narrower construction requiring the trajectory of the character itself to be traced and recognized in 3D space, not merely a 2D character input triggered by a 3D motion.
 
VI. Other Allegations
The complaint is for declaratory judgment of non-infringement and does not allege infringement. Therefore, this section is not applicable.
VII. Analyst’s Conclusion: Key Questions for the Case
This declaratory judgment action appears to center on a targeted, technical non-infringement theory. The key questions for the court will likely be:
- A core issue will be one of definitional scope: Does the claim term "recognizing user motion in three-dimensional space for character recognition" require the entire character-drawing and recognition process to occur using 3D spatial data, or can it be construed to cover a 2D character recognition function that is enabled or controlled by a 3D gesture?
- A key evidentiary question will be one of technical operation: As a factual matter, do Samsung's accused mobile devices process character recognition inputs using two-dimensional trajectory data (e.g., from a touchscreen or touchpad) as alleged in the complaint, or do their algorithms fundamentally rely on processing three-dimensional motion data from inertial sensors as required by the patent claims?
- A predicate issue concerning the ’846 and ’687 patents will be one of algorithmic implementation: Do the accused devices use a predictive model where axial accelerations are calculated from angular velocities and then compared to measured accelerations to determine orientation, or do they employ a different, non-infringing method of sensor data fusion?