Matsutek Enterprises Co., Ltd. v. iRobot Corp., No. 1:17-cv-12483 (D. Mass.)
I. Executive Summary and Procedural Information
- Parties & Counsel:
- Plaintiff: Matsutek Enterprises Co., Ltd. (Taiwan)
- Defendant: iRobot Corporation (Delaware)
- Plaintiff’s Counsel: K&L Gates LLP
- Case Identification: 1:17-cv-12483, D. Mass., 12/18/2017
- Venue Allegations: Venue is alleged to be proper in the District of Massachusetts because Defendant iRobot has a regular and established place of business, its corporate headquarters, within the district.
- Core Dispute: Plaintiff alleges that Defendant’s Roomba 900 Series robotic vacuum cleaners infringe a patent related to systems and methods for indoor navigation that use both inertial and visual sensors to simultaneously localize, estimate posture, and build a map.
- Technical Context: The technology at issue falls within the field of Simultaneous Localization and Mapping (SLAM), a critical capability for autonomous mobile robots operating in unknown environments like homes or offices.
- Key Procedural History: The complaint does not mention any prior litigation, licensing history, or inter partes review proceedings related to the patent-in-suit.
Case Timeline
| Date | Event |
|---|---|
| 2009-12-16 | '684 Patent Priority Date |
| 2012-11-13 | '684 Patent Issue Date |
| 2017-12-18 | Complaint Filing Date |
II. Technology and Patent(s)-in-Suit Analysis
U.S. Patent No. 8,310,684 - "System and Method for Localizing a Carrier, Estimating a Posture of the Carrier and Establishing a Map," issued November 13, 2012
The Invention Explained
- Problem Addressed: The patent identifies the difficulty of accurate indoor positioning, noting that GPS signals are often unavailable and that existing indoor systems can be costly or limited. (’684 Patent, col. 1:21-51). Specifically, systems relying solely on inertial sensors accumulate errors over time, while systems relying solely on visual sensors can fail in environments with poor lighting or few distinct features. (’684 Patent, col. 3:6-17, 3:28-39).
- The Patented Solution: The invention describes a system on a mobile "carrier" (e.g., a robot) that fuses data from two complementary sensor types: an inertial measurement device (IMU) and a vision measurement device (camera). (’684 Patent, Abstract; Fig. 1). A controller uses data from one sensor type to correct errors and drift from the other in a feedback loop, enabling the system to continuously and more accurately determine its own position and orientation while simultaneously creating a map of its surroundings. (’684 Patent, col. 2:10-18).
- Technical Importance: This sensor-fusion approach aims to provide robust, real-time localization and mapping capabilities for a mobile device without relying on pre-installed external infrastructure. (’684 Patent, col. 2:58-62).
Key Claims at a Glance
- The complaint asserts independent claims 1 (a system claim) and 10 (a method claim). (Compl. ¶13).
- Independent Claim 1 recites the key elements of the system:
- An inertial measurement device for measuring a motion state and a rotation state.
- A vision measurement device for picturing an environment feature.
- A controller that receives data from both devices to estimate the carrier's posture, location, and velocity and to establish a map.
- Critically, the controller "estimates based on a corrected measuring result of one of the inertial measurement device and the vision measurement device, then the controller controls the other one... to measure and accordingly correct" the carrier's information and the map.
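Purely as an illustration of the loop recited in claim 1 (this is not iRobot's algorithm, nor code from the '684 patent), a toy one-dimensional sketch of a controller that first estimates from the corrected inertial result and then directs the vision device to measure and correct might look as follows. Every class name, number, and the simple blending rule are assumptions made for the sketch:

```python
class Carrier:
    """Toy model of the claimed estimate-control-correct loop between an
    inertial measurement device and a vision measurement device (1-D)."""

    def __init__(self):
        self.x = 0.0          # location estimate
        self.v = 0.0          # velocity estimate
        self.theta = 0.0      # posture (heading) estimate
        self.map_points = []  # map being built from observed features

    def imu_measure(self, accel, gyro, dt):
        # inertial device measures a motion state (accel) and a rotation
        # state (gyro); dead-reckoning estimates drift over time
        self.v += accel * dt
        self.x += self.v * dt
        self.theta += gyro * dt

    def vision_correct(self, observed_x, observed_theta, gain=0.5):
        # controller directs the vision device to picture an environment
        # feature, then corrects the inertially propagated estimates and
        # adds the feature to the map
        self.x += gain * (observed_x - self.x)
        self.theta += gain * (observed_theta - self.theta)
        self.map_points.append(observed_x)


def run_loop(steps):
    c = Carrier()
    for t in range(steps):
        # estimate based on the (previously corrected) inertial result...
        c.imu_measure(accel=0.1, gyro=0.01, dt=1.0)
        # ...then control the other device to measure and accordingly correct
        feature_x = 0.05 * (t + 1)       # stand-in for a feature observation
        c.vision_correct(observed_x=feature_x, observed_theta=0.01 * (t + 1))
    return c
```

The sketch highlights the point the complaint will have to prove: not merely that both sensor types are present, but that one device's corrected output drives the controller's command to the other device to measure and correct.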
III. The Accused Instrumentality
Product Identification
- The complaint identifies "at least the Roomba 900 Series robotic vacuum cleaning devices" as the Accused Products. (Compl. ¶13).
Functionality and Market Context
- The complaint alleges the Roomba 900 Series employs "iAdapt® 2.0 Navigation with Visual Localization" to intelligently navigate and clean. (Compl. ¶14). This technology is described as performing "VSLAM (Visual Simultaneous Localization and Mapping)," which uses a camera to identify features in the environment and combines that visual data with information from "gyro and IMU data" to build a map and track the robot's position. (Compl. ¶15, ¶17). The complaint includes a screenshot from a companion mobile application showing a map of a cleaned area, suggesting a key feature is the ability to visualize the robot’s coverage. A screenshot from an iRobot owner's manual shows the Roomba 900 is designed to "map out a series of small areas" to "ensure complete coverage." (Compl. ¶14).
IV. Analysis of Infringement Allegations
'684 Patent Infringement Allegations
| Claim Element (from Independent Claim 1) | Alleged Infringing Functionality | Complaint Citation | Patent Citation |
|---|---|---|---|
| an inertial measurement device, for measuring a motion state and a rotation state of the carrier; | The Accused Products comprise an inertial measurement device, as they use "gyro and IMU data" for positioning. | ¶15 | col. 2:66-3:4 |
| a vision measurement device, disposed on a surface of the carrier for picturing at least an environment feature in an indoor environment where the carrier locates; | The Accused Products comprise an "iAdapt® Camera" on their surface to perform "Visual Localization" by looking for "distinctive features." A diagram in the complaint shows the location of this camera. | ¶16 | col. 3:18-24 |
| a controller, for controlling the inertial measurement device and the vision measurement device, receiving a measuring result from [both]... to estimate a posture information, a location information and a velocity information of the carrier and establishing a map... | The Accused Products comprise a controller that combines vision data with IMU data to "build a map of its environment as it goes" and "knows where it is." A screenshot of a generated map is provided as evidence. | ¶14, ¶17 | col. 4:1-10 |
| wherein the controller estimates based on a corrected measuring result of one of the inertial measurement device and the vision measurement device, then the controller controls the other one... to measure and accordingly correct the posture information, the location information and the velocity information of the carrier and the map... | The complaint alleges that in the Accused Products, the controller estimates based on a corrected result from one sensor type and then controls the other sensor to measure and correct the carrier's information and map, directly tracking the claim language. | ¶18 | col. 2:10-18 |
- Identified Points of Contention:
- Scope Questions: The definition of the interactive process claimed in the final "wherein" clause will be central. The dispute may turn on whether iRobot's VSLAM algorithm performs the specific sequence of "estimate-control-correct" between the two sensor systems as required by the claim.
- Technical Questions: A key evidentiary question is how iRobot's "VSLAM" technology actually functions at a software level. The complaint relies on high-level marketing and press materials. The case will likely require discovery into the source code and technical specifications to determine if the accused software's data fusion method performs the specific correction loop recited in the patent, or if it uses a different, non-infringing algorithm.
V. Key Claim Terms for Construction
- The Term: "estimates based on a corrected measuring result of one of the inertial measurement device and the vision measurement device, then the controller controls the other one of the inertial measurement device and the vision measurement device to measure and accordingly correct..."
- Context and Importance: This term describes the core inventive concept: a specific, interactive feedback loop between the two sensor systems. The interpretation of this clause—whether it requires a specific sequence of operations or covers a more general concept of mutual correction—will likely be dispositive for infringement.
- Intrinsic Evidence for Interpretation:
- Evidence for a Broader Interpretation: The patent’s abstract and the brief summary use general language to describe the invention, stating the controller uses a "corrected measuring result from one" device to control and correct via the "other one." (’684 Patent, Abstract). This could support a construction that encompasses any system where IMU and vision data are used to mutually correct each other’s errors.
- Evidence for a Narrower Interpretation: Figure 3 of the patent presents a specific flowchart with distinct, sequential steps for IMU-based correction (steps 310-330) and vision-based correction (steps 340-370). A party could argue that the claims, when read in light of this detailed embodiment, require this specific, sequential, two-pathway process, and that a different implementation (e.g., a fused correction within a single algorithm like a Kalman filter) would not be covered. (’684 Patent, col. 5:8-48).
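To make the narrower-construction argument concrete, the alternative architecture mentioned above, a single fused correction, can be sketched as a one-dimensional Kalman predict/update cycle in which the inertial increment drives prediction and the vision measurement drives the update, with neither device "controlling" the other. All symbols and values here are illustrative assumptions, not a description of the accused software:

```python
def kalman_fused_step(x, p, imu_delta, q, z_vision, r):
    """One predict/update cycle of a 1-D Kalman filter fusing IMU and
    vision data into a single estimate.

    x, p      -- prior state estimate and its variance
    imu_delta -- position increment from the inertial sensor
    q         -- process noise added by the inertial propagation
    z_vision  -- position implied by the vision measurement
    r         -- vision measurement noise variance
    """
    # predict: propagate the state with the inertial increment
    x_pred = x + imu_delta
    p_pred = p + q
    # update: blend in the vision measurement via the Kalman gain
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z_vision - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new
```

Under a narrow construction tied to Figure 3's sequential two-pathway process, a defendant could argue that this kind of single-filter fusion does not practice the "controls the other one... to measure and accordingly correct" limitation, because the correction happens inside one algorithm rather than through one device directing the other.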
VI. Other Allegations
- Indirect Infringement: The complaint alleges induced infringement, stating that iRobot provides "manuals, training, guides, videos and/or demonstrations" that instruct customers on how to use the Accused Products in an infringing manner. (Compl. ¶21). It also pleads contributory infringement, alleging the products are "specially made or adapted for use" in an infringing way and have no substantial non-infringing uses. (Compl. ¶22).
- Willful Infringement: Willfulness is alleged based on iRobot’s purported knowledge of the ’684 patent. (Compl. ¶20, ¶22). The complaint does not specify the basis for this alleged knowledge (e.g., a notice letter).
VII. Analyst’s Conclusion: Key Questions for the Case
- A core issue will be one of claim construction: how will the court define the interactive "estimate-control-correct" limitation? Will the claim be construed narrowly to require the specific, sequential process disclosed in the patent's Figure 3 embodiment, or more broadly to cover any VSLAM system where inertial and visual sensor data are used to correct one another's drift?
- A key evidentiary question will be one of technical implementation: what will discovery reveal about the actual operation of iRobot's proprietary "iAdapt 2.0 Navigation" software? The case will likely hinge on whether the accused algorithm's method for fusing sensor data can be shown to map onto the specific functional steps required by the court's construction of the asserted claims.