8:23-cv-02013
University Of South Florida Board Of Trustees v. Stryker Corp
I. Executive Summary and Procedural Information
- Parties & Counsel:
  - Plaintiff: University of South Florida Board of Trustees (Florida)
  - Defendant: Stryker Corporation (Michigan)
  - Plaintiff’s Counsel: Fee & Jeffries, P.A.
 
- Case Identification: 8:23-cv-02013, M.D. Fla., 09/07/2023
- Venue Allegations: Plaintiff alleges venue is proper because Defendant Stryker Corporation has a regular and established place of business in Tampa, Florida, and has committed acts of alleged infringement within the judicial district.
- Core Dispute: Plaintiff alleges that Defendant’s surgical navigation systems infringe two patents related to methods for providing augmented reality during minimally invasive surgery.
- Technical Context: The technology involves using computer-vision techniques to overlay pre-operative medical image data (e.g., from CT scans) onto a live video feed from an endoscope, thereby enhancing a surgeon's view of a patient's internal anatomy during a procedure.
- Key Procedural History: The patents-in-suit, U.S. Patent Nos. 9,547,940 and 9,646,423, share a common specification and priority claim. The ’423 Patent is a continuation of the application that issued as the ’940 Patent. Both patents disclose that the invention was made with U.S. government support under a National Science Foundation grant, which gives the government certain rights in the invention.
Case Timeline
| Date | Event | 
|---|---|
| 2014-09-12 | Priority Date for ’940 and ’423 Patents | 
| 2017-01-17 | ’940 Patent Issue Date | 
| 2017-05-09 | ’423 Patent Issue Date | 
| 2023-09-07 | Complaint Filing Date | 
II. Technology and Patent(s)-in-Suit Analysis
U.S. Patent No. 9,547,940 - Systems and Methods for Providing Augmented Reality in Minimally Invasive Surgery (Issued Jan. 17, 2017)
The Invention Explained
- Problem Addressed: In minimally invasive surgery (MIS), the surgeon’s view is restricted to the laparoscope’s limited field of view, and tactile feedback is nearly eliminated. This makes it difficult to locate or avoid vital anatomical structures, such as blood vessels or ureters, that are obscured or outside the camera's direct line of sight, which can lead to "serious and even life threatening injury" (’940 Patent, col. 1:52-62).
- The Patented Solution: The invention proposes a method to create an augmented reality environment for the surgeon. It involves capturing pre-operative image data (e.g., from a CT scan) to build a first 3D model of the patient's anatomy. During surgery, it captures live intra-operative image data from the endoscope and uses it to build a second 3D model. By registering these two models in real time and tracking the endoscope's position, the system can display a rendering of an organ on the live video feed, even if that organ is outside the camera's immediate view. (’940 Patent, Abstract; col. 4:50-65). An illustrative sketch of this claimed pipeline appears after this list.
- Technical Importance: The technology aims to provide reliable image-guided surgery for deformable soft tissues (e.g., in abdominal surgery) without relying on external motion tracking devices or rigid anatomical landmarks, which are often unavailable or unreliable in such procedures (’940 Patent, col. 2:16-28).
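For orientation only, the following is a minimal, hypothetical sketch (Python with numpy) of the claimed pipeline, using synthetic data and stand-in routines. The names build_surface_model, register_models, track_endoscope, and augment_frame are illustrative inventions of this analysis, not functions disclosed in the patents or present in any accused product.

```python
# Hypothetical sketch of the sequence recited in claim 1 of the '940 Patent,
# using synthetic data and stand-in routines. Offered only to orient the
# reader; it is not the patentee's disclosed system or the accused TGS System.
import numpy as np

def build_surface_model(image_data):
    """Stand-in for surface extraction from a CT volume or endoscope video;
    here it simply returns a synthetic Nx3 point cloud."""
    rng = np.random.default_rng(abs(hash(image_data)) % (2**32))
    return rng.normal(size=(500, 3))

def register_models(pre_op_model, intra_op_model):
    """Stand-in for real-time registration of the two surface models; returns
    a rigid transform (R, t). A real system might use ICP or a SLAM-based
    framework, as the specification discusses."""
    R = np.eye(3)
    t = intra_op_model.mean(axis=0) - pre_op_model.mean(axis=0)
    return R, t

def track_endoscope():
    """Stand-in for tracking the endoscope's position and orientation."""
    return {"position": np.zeros(3), "orientation": np.eye(3)}

def augment_frame(frame, organ_points, R, t):
    """Stand-in for overlaying a rendering of an organ (including portions
    outside the camera's field of view) onto the live frame; the 'rendering'
    here is just a toy 2D projection of the registered points."""
    overlay = (organ_points @ R.T + t)[:, :2]
    return frame, overlay

pre_op_model = build_surface_model("pre-operative CT scan")      # claimed steps 1-2
intra_op_model = build_surface_model("intra-operative video")    # claimed steps 3-4
R, t = register_models(pre_op_model, intra_op_model)             # claimed step 5
pose = track_endoscope()                                         # claimed step 6
frame, overlay = augment_frame(np.zeros((480, 640, 3)), pre_op_model, R, t)  # claimed step 7
```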
Key Claims at a Glance
- The complaint asserts independent claim 1 (Compl. ¶ 16).
- The essential elements of Claim 1 include:
  - capturing pre-operative image data of internal organs of a patient;
  - creating a first three-dimensional surface model based upon the captured pre-operative image data;
  - capturing intra-operative image data of the internal organs with an endoscope during a surgical procedure;
  - creating a second three-dimensional surface model based on the captured intra-operative image data;
  - registering the three-dimensional surface models in real time during the surgical procedure;
  - tracking the position and orientation of the endoscope during the surgical procedure; and
  - augmenting the intra-operative image data with a rendering of at least a portion of an internal organ that is in registration with the real time intra-operative image data but is "outside of the field of view of the endoscope."
 
- The complaint does not explicitly reserve the right to assert dependent claims.
U.S. Patent No. 9,646,423 - Systems and Methods for Providing Augmented Reality in Minimally Invasive Surgery (Issued May 9, 2017)
The Invention Explained
- Problem Addressed: The ’423 Patent addresses the same technical problem as the parent ’940 Patent: the limited field of view and lack of spatial awareness in MIS can lead to inadvertent injury to vital, non-visible anatomical structures (’423 Patent, col. 1:55-col. 2:5).
- The Patented Solution: The method described also involves integrating pre-operative and intra-operative imaging to enhance the surgeon's view. The system captures pre-operative and live intra-operative data, registers the two data sets in real time, tracks the endoscope, and augments the live surgical video with a rendering of an organ located outside the endoscope's field of view. (’423 Patent, Abstract; col. 4:55-col. 5:4).
- Technical Importance: As with the ’940 patent, the described solution provides a framework for augmented reality in MIS on soft, deformable tissues, which presents significant challenges for traditional image registration and tracking techniques (’423 Patent, col. 2:25-34).
Key Claims at a Glance
- The complaint asserts independent claim 1 (Compl. ¶ 26).
- The essential elements of Claim 1 include:
  - capturing pre-operative image data of internal organs of a patient;
  - capturing intra-operative image data of the internal organs with an endoscope during a surgical procedure;
  - registering the pre-operative image data and the inter-operative data in real time during the surgical procedure;
  - tracking the position and orientation of the endoscope during the surgical procedure; and
  - augmenting the intra-operative image data with a rendering of an organ that is in registration with the real time intra-operative image data but is "outside of the field of view of the endoscope."
 
- The complaint does not explicitly reserve the right to assert dependent claims.
III. The Accused Instrumentality
Product Identification
The primary accused instrumentality is Stryker’s TGS® System, which stands for "target guided surgery" (Compl. ¶ 11). The complaint also identifies the "Scopis Holographic Navigation Platform" as a potentially infringing product but states it "lacks sufficient information" for a definitive determination (Compl. ¶¶ 18, 28).
Functionality and Market Context
The complaint describes the TGS® System as an "ENT navigation system" used for Functional Endoscopic Sinus Surgery (FESS) (Compl. ¶ 11). It is alleged to be a "next-generation solution" that provides surgeons with "highly advanced image guidance and visualization capabilities" and "Augmented Reality (AR) technology" (Compl. ¶ 12). The complaint provides no probative visual evidence of the accused functionality.
IV. Analysis of Infringement Allegations
The complaint alleges that the accused TGS® System performs the steps of the asserted claims but references non-provided exhibits for detailed mappings (Compl. ¶¶ 17, 27). The following analysis is based on the complaint's general description of the TGS® System as an augmented reality surgical navigation platform (Compl. ¶¶ 11-12).
’940 Patent Infringement Allegations
| Claim Element (from Independent Claim 1) | Alleged Infringing Functionality | Complaint Citation | Patent Citation | 
|---|---|---|---|
| capturing pre-operative image data of internal organs of a patient; | The TGS® System allegedly uses pre-operative patient scans as the basis for its image guidance. | ¶¶ 12, 17 | col. 4:55-59 | 
| creating a first three-dimensional surface model based upon the captured pre-operative image data... | The system allegedly processes the pre-operative scans to generate a 3D anatomical model for navigation. | ¶¶ 12, 17 | col. 4:60-65 | 
| capturing intra-operative image data of the internal organs with an endoscope during a surgical procedure; | The TGS® System operates during surgery using a live video feed from an endoscope. | ¶¶ 11, 17 | col. 5:8-13 | 
| creating a second three-dimensional surface model based on the captured intra-operative image data... | The system allegedly processes the live video feed to generate a real-time 3D model of the surgical site. | ¶¶ 12, 17 | col. 5:13-17 | 
| registering the three-dimensional surface models in real time during the surgical procedure; | The system's "image guidance" and "visualization capabilities" allegedly involve aligning the pre-operative 3D model with the intra-operative 3D model in real time. | ¶¶ 12, 17 | col. 6:18-24 | 
| tracking the position and orientation of the endoscope during the surgical procedure; | The navigation functionality of the accused system allegedly involves tracking the endoscope's location and orientation. | ¶¶ 11, 17 | col. 6:36-43 | 
| augmenting the intra-operative image data... with a rendering of... an internal organ... that is... outside of the field of view of the endoscope. | The system's "Augmented Reality (AR) technology" allegedly overlays renderings of anatomical structures, including those beyond the camera's immediate view, onto the surgeon's display. | ¶¶ 12, 17 | col. 11:13-19 | 
Identified Points of Contention
- Technical Question: A central question is whether the TGS® System actually "creat[es] a second three-dimensional surface model" from the live endoscope video feed, as the claim requires. The court will require evidence that the system performs this specific step, rather than using another technique (e.g., 2D feature tracking) to align a pre-operative model with the 2D video.
- Scope Question: The final limitation requires augmenting the view with organs "outside of the field of view." A dispute may arise over whether the TGS® System provides this "periphery augmentation" or if its AR overlays are limited to structures already within the camera's frame. The complaint does not provide specific evidence on this point.
’423 Patent Infringement Allegations
| Claim Element (from Independent Claim 1) | Alleged Infringing Functionality | Complaint Citation | Patent Citation | 
|---|---|---|---|
| capturing pre-operative image data of internal organs of a patient; | The TGS® System allegedly uses pre-operative patient scans as the basis for its image guidance. | ¶¶ 12, 27 | col. 4:55-59 | 
| capturing intra-operative image data of the internal organs with an endoscope during a surgical procedure; | The TGS® System operates during surgery using a live video feed from an endoscope. | ¶¶ 11, 27 | col. 5:8-13 | 
| registering the pre-operative image data and the inter-operative data in real time during the surgical procedure; | The system's "image guidance" and "visualization capabilities" allegedly involve aligning or registering the pre-operative scan data with the live video data in real time. | ¶¶ 12, 27 | col. 6:18-24 | 
| tracking the position and orientation of the endoscope during the surgical procedure; | The navigation functionality of the accused system allegedly involves tracking the endoscope's location and orientation. | ¶¶ 11, 27 | col. 6:36-43 | 
| augmenting the intra-operative image data... with a rendering of... an internal organ... that is... outside of the field of view of the endoscope. | The system's "Augmented Reality (AR) technology" allegedly overlays renderings of anatomical structures, including those beyond the camera's immediate view, onto the surgeon's display. | ¶¶ 12, 27 | col. 8:14-23 | 
Identified Points of Contention
- Technical Question: Unlike the ’940 Patent, this claim requires registering "data" rather than two distinct "models." The key question will be what technical process of alignment and registration the TGS® System employs and whether it meets the definition of "registering" the two data sets as understood in the context of the patent.
- Scope Question: As with the ’940 Patent, a factual dispute is likely regarding whether the system displays renderings of organs that are truly "outside of the field of view of the endoscope."
V. Key Claim Terms for Construction
- The Term: "registering the three-dimensional surface models" (’940, Claim 1) vs. "registering the pre-operative image data and the inter-operative data" (’423, Claim 1)- Context and Importance: These parallel terms are the core of the claimed methods. The difference between them—registering pre-formed models versus registering the underlying data—may be critical. The viability of the infringement allegation for the ’940 Patent, which has the more specific requirement, may depend heavily on whether the accused system is found to create two distinct models before alignment.
- Intrinsic Evidence for a Broader Interpretation: The specification describes multiple ways to achieve registration, including a general process of using iterative closest point (ICP) followed by manual tuning, which could suggest the term is not limited to one specific algorithm (’940 Patent, col. 6:20-24).
- Intrinsic Evidence for a Narrower Interpretation: The specification also details a "simultaneously tracking, mapping, and registering (STMR)" framework, which a party could argue represents the actual invention and should limit the scope of "registering" (’940 Patent, col. 8:5-13). The distinction between creating models from data ('940) and registering data directly ('423) suggests the terms were intended to have different scopes.
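Because the specification's reference to iterative closest point (ICP) may frame the parties' arguments over how broadly "registering" reaches, the following is a generic, textbook-style ICP sketch (Python with numpy and scipy). It is offered only as technical background on the technique; it is not drawn from the patents' disclosure or from the accused system.

```python
# Generic iterative-closest-point (ICP) sketch for rigid point-cloud
# registration; background illustration only, not the patented or accused
# algorithm. Requires numpy and scipy.
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst
    (Kabsch algorithm)."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst.mean(0) - R @ src.mean(0)
    return R, t

def icp(source, target, iterations=20):
    """Align `source` (Nx3) to `target` (Mx3) by alternating nearest-neighbor
    correspondence with rigid refitting."""
    tree = cKDTree(target)
    current = source.copy()
    for _ in range(iterations):
        _, idx = tree.query(current)  # closest target point for each source point
        R, t = best_rigid_transform(current, target[idx])
        current = current @ R.T + t
    return current
```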
 
- The Term: "outside of the field of view of the endoscope" (’940 and ’423, Claim 1)- Context and Importance: This limitation defines the type of augmentation required and is a potential point of non-infringement. Infringement requires showing the accused system renders anatomy that is literally off-screen, not just hidden behind other tissue within the camera's view.
- Intrinsic Evidence for a Broader Interpretation: The patent explicitly describes this feature as "periphery augmentation" to provide the surgeon with "better context" and shows an example in FIG. 11, which is described as a "virtual image" with a "larger field of view than the endoscope" (’940 Patent, col. 11:13-35).
- Intrinsic Evidence for a Narrower Interpretation: A defendant could argue this functionality is tied to the specific "visual SLAM technique" disclosed for creating the expanded virtual view, potentially narrowing the claim to that specific implementation (’940 Patent, col. 11:21-29).
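To illustrate the compositing idea behind "periphery augmentation" in the abstract, the toy sketch below (Python with numpy) embeds a smaller live endoscope frame inside a larger rendered view, so that anatomy outside the camera's field of view could remain visible at the margins. It is a hypothetical illustration of the concept, not the method disclosed in FIG. 11 or implemented in the accused system.

```python
# Toy "periphery augmentation" compositing sketch: a live endoscope frame is
# centered inside a wider virtual view (e.g., rendered from a pre-operative
# model), leaving the margins for off-screen structures. Illustration only.
import numpy as np

def periphery_augment(endoscope_frame, virtual_view):
    """Center the smaller live frame inside the larger rendered view."""
    h, w = endoscope_frame.shape[:2]
    hv, wv = virtual_view.shape[:2]
    top, left = (hv - h) // 2, (wv - w) // 2
    composite = virtual_view.copy()
    composite[top:top + h, left:left + w] = endoscope_frame
    return composite

live = np.full((480, 640, 3), 200, dtype=np.uint8)   # stand-in endoscope frame
rendered = np.zeros((720, 960, 3), dtype=np.uint8)   # stand-in wider virtual render
augmented = periphery_augment(live, rendered)
```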
 
VI. Other Allegations
- Indirect Infringement: The complaint does not allege facts to support claims of induced or contributory infringement, such as knowledge or intent evidenced by user manuals or product marketing that instruct users to perform an infringing use.
- Willful Infringement: The complaint does not contain an explicit allegation of willful infringement or plead facts that would typically support such a claim, such as pre-suit knowledge of the patents.
VII. Analyst’s Conclusion: Key Questions for the Case
- A central issue will be one of evidentiary proof: What is the precise operational methodology of the Stryker TGS® System? The case will require detailed technical evidence to determine if the system’s method for aligning pre-operative and intra-operative imagery meets the specific limitations of the asserted claims, particularly the ’940 Patent’s requirement to create and register two distinct three-dimensional models.
- The infringement analysis will likely turn on a question of visualization scope: Does the accused system’s "Augmented Reality" feature render anatomical structures that are demonstrably "outside of the field of view of the endoscope," as required by both asserted claims? Plaintiff will need to provide evidence that the system performs this specific type of "periphery augmentation."
- A key legal question will be one of claim differentiation: How will the court construe the "registering" limitation in Claim 1 of the ’940 Patent (registering "models") versus the arguably broader limitation in Claim 1 of the ’423 Patent (registering "data")? The outcome could create different infringement results for the two patents, even though they are asserted against the same product.