DCT

5:24-cv-03200

University of British Columbia v. Caption Health Inc.

I. Executive Summary and Procedural Information

  • Parties & Counsel: Plaintiff University of British Columbia; Defendants Caption Health Inc. and GE Healthcare
  • Case Identification: 5:24-cv-03200, N.D. Cal., 12/20/2024
  • Venue Allegations: Venue is alleged to be proper as Defendants maintain a regular and established place of business in the district, and a substantial part of the events giving rise to the claims occurred there.
  • Core Dispute: Plaintiff alleges that Defendants’ AI-driven ultrasound guidance software infringes patents related to using neural networks to assess the quality of cardiac ultrasound images.
  • Technical Context: The technology lies at the intersection of artificial intelligence and medical imaging, aiming to improve the diagnostic quality of echocardiograms by providing real-time feedback to operators.
  • Key Procedural History: The complaint details significant pre-suit history, including a May 2017 meeting between the patent inventors and the founder of Caption Health where the technology was allegedly discussed. Plaintiff also alleges that it sent notice letters beginning in May 2022 and provided a detailed claim chart in November 2022, more than two years before the amended complaint was filed. GE Healthcare acquired Caption Health in February 2023.

Case Timeline

| Date | Event |
|---|---|
| 2016-04-21 | ’591 Patent Earliest Priority Date |
| 2017-05-01 | Inventors hold teleconference with founder of Caption Health (then Bay Labs) |
| 2018-08-31 | ’029 Patent Earliest Priority Date |
| 2020-08-25 | U.S. Patent No. 10,751,029 Issues |
| 2021-09-28 | U.S. Patent No. 11,129,591 Issues |
| 2022-05-05 | Plaintiff’s counsel sends first notice letter to Caption Health regarding ’591 Patent |
| 2022-11-11 | Plaintiff’s counsel provides claim chart for ’591 Patent to Caption Health |
| 2023-02-01 | GE Healthcare acquires Caption Health |
| 2023-10-06 | Accused "Venue Family" with Caption Guidance announced as available in U.S. |
| 2024-04-03 | Accused "Vscan Air SL" with Caption AI announced |
| 2024-12-20 | First Amended Complaint Filed |

II. Technology and Patent(s)-in-Suit Analysis

U.S. Patent No. 11,129,591 - “Echocardiographic Image Analysis” (Issued Sep. 28, 2021)

The Invention Explained

  • Problem Addressed: The patent addresses the longstanding challenge that acquiring high-quality echocardiographic (cardiac ultrasound) images requires years of specialized training, and even experienced operators struggle to capture usable images consistently. This inefficiency can delay diagnosis and treatment (’591 Patent, col. 1:29-34; Compl. ¶¶ 13, 16).
  • The Patented Solution: The invention is a system that uses neural networks to automatically assess the quality of an echocardiographic image. Critically, the system first associates an image with a specific "view category" (e.g., an apical four-chamber view) and then uses a neural network trained for that specific view to determine a quality score. The patent describes an efficient architecture where neural networks for different views share common initial layers but have separate, view-specific final layers, reducing computational load and training requirements (’591 Patent, col. 11:53-12:29, Fig. 8; Compl. ¶¶ 39, 41). An illustrative sketch of this shared-plus-view-specific layer pattern follows this list.
  • Technical Importance: This approach provides a more accurate and efficient quality analysis by using specialized neural networks tailored to the distinct anatomical features of different cardiac views, an improvement over general-purpose image analysis techniques (Compl. ¶ 40).
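
For orientation only, the following is a minimal sketch, assuming a PyTorch-style model with hypothetical view names, layer sizes, and output scaling, of how a shared-plus-view-specific quality network of the kind described above could be organized. It is not drawn from the patent's actual embodiment or from the accused software.

```python
# Illustrative sketch (not from the patent or the accused software) of a
# shared-plus-view-specific quality network: common initial layers feed
# separate final layers, one per echocardiographic view category.
import torch
import torch.nn as nn

VIEW_CATEGORIES = ["PLAX", "AP4"]  # hypothetical subset of the predetermined views


class ViewSpecificQualityNet(nn.Module):
    def __init__(self, views=VIEW_CATEGORIES):
        super().__init__()
        # Shared layers, reused for every view category.
        self.shared = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(8), nn.Flatten(),
        )
        # One small head per view, each holding its own "respective set" of parameters.
        self.heads = nn.ModuleDict({
            v: nn.Sequential(nn.Linear(16 * 8 * 8, 32), nn.ReLU(),
                             nn.Linear(32, 1), nn.Sigmoid())
            for v in views
        })

    def forward(self, image, view_category):
        features = self.shared(image)               # shared computation
        return self.heads[view_category](features)  # view-specific quality score


# Example: score a stand-in frame against the hypothetical "AP4" view.
net = ViewSpecificQualityNet()
frame = torch.rand(1, 1, 64, 64)
print(net(frame, "AP4").item())
```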

Key Claims at a Glance

  • The complaint asserts at least independent claims 1 and 15 (Compl. ¶ 67).
  • Essential elements of independent claim 1 (a system claim) include:
    • Receiving signals for a first echocardiographic image.
    • Associating the image with a first view category from a plurality of predetermined view categories.
    • Determining, based on the image and the first view category, a first quality assessment value representing a "view category specific quality assessment."
    • Producing signals representing this first quality assessment value.
    • Repeating these steps for a second image associated with a second, different view category.
    • Wherein each view category is associated with a "respective set of assessment parameters," which are themselves a set of neural network parameters defining a neural network.

U.S. Patent No. 10,751,029 - “Ultrasonic Image Analysis” (Issued Aug. 25, 2020)

The Invention Explained

  • Problem Addressed: The patent identifies the problem that for inexperienced operators, a simple quality score is often insufficient; they also need to know other "image properties," such as the view category being displayed. However, designing a system to determine both quality and another property simultaneously is computationally challenging, especially for portable devices (’029 Patent, col. 1:22-31; Compl. ¶¶ 42-43).
  • The Patented Solution: The invention provides a method and system that uses a single algorithm, such as a neural network, to efficiently and simultaneously determine both a quality assessment value and an image property (e.g., view category) from a set of ultrasound images. This is achieved by deriving a common set of "extracted feature representations" from the input images, which are then used for both the quality and property assessments, reducing processing time and resources (’029 Patent, Abstract, col. 5:34-51; Compl. ¶ 49). An illustrative sketch of this shared-feature, two-output pattern follows this list.
  • Technical Importance: This combined assessment architecture allows for faster processing on less powerful hardware, such as mobile devices, and may prevent the neural network from "overfitting" compared to training separate models for each task (’029 Patent, col. 5:40-55; Compl. ¶ 50).
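
As a rough illustration of the combined-assessment idea, here is a minimal sketch, again assuming a PyTorch-style model with hypothetical layer sizes and ten view categories. It shows a single feature extractor whose output feeds both a quality head and a view-category head; it is not the patented or accused implementation.

```python
# Illustrative sketch (assumptions only): one feature extractor produces a common
# representation that feeds both a quality-assessment head and an image-property
# (view category) head in a single forward pass.
import torch
import torch.nn as nn

NUM_VIEW_CATEGORIES = 10  # hypothetical number of predetermined views


class CombinedAssessmentNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared feature extractor: the "extracted feature representations".
        self.extractor = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
        )
        # Two heads consume the same features, so both outputs come from one pass.
        self.quality_head = nn.Sequential(nn.Linear(16 * 4 * 4, 1), nn.Sigmoid())
        self.view_head = nn.Linear(16 * 4 * 4, NUM_VIEW_CATEGORIES)

    def forward(self, images):
        features = self.extractor(images)       # derived once, used twice
        quality = self.quality_head(features)   # quality assessment value
        view_logits = self.view_head(features)  # image property (view category)
        return quality, view_logits


# Example: one stand-in frame yields both a quality score and a predicted view.
net = CombinedAssessmentNet()
clip = torch.rand(1, 1, 64, 64)
quality, view_logits = net(clip)
print(quality.item(), view_logits.argmax(dim=1).item())
```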

Key Claims at a Glance

  • The complaint asserts at least claims 4 and 23 (Compl. ¶ 86). The independent claims from which these depend are 1 (method) and 21 (system).
  • Essential elements of independent claim 21 (a system claim) include:
    • Receiving signals representing a set of ultrasound images.
    • Deriving one or more extracted feature representations from the set of images.
    • Determining, based on the derived feature representations, a quality assessment value.
    • Determining, based on the same derived feature representations, an image property (e.g., a view category).
    • Producing signals representing both the quality assessment value and the image property.
  • The complaint also reserves the right to assert other claims (Compl. ¶¶ 67, 86).

III. The Accused Instrumentality

Product Identification

The "Venue Family" and "Vscan Air SL" point-of-care ultrasound products, when incorporating the software technologies named "Caption Guidance" or "Caption AI" (collectively, the "Accused Products") (Compl. ¶¶ 52, 67).

Functionality and Market Context

The Accused Products feature AI-driven software that provides real-time, step-by-step visual guidance to help users capture diagnostic-quality cardiac ultrasound images (Compl. ¶ 52). A key feature is a "Quality Meter" displayed on the screen that provides continuous feedback on image quality, rising as the operator moves the ultrasound probe closer to an optimal position (Compl. ¶ 53). The software guides the user through a predetermined workflow of standard cardiac views and can automatically capture and save high-quality image clips (Compl. ¶¶ 58, 60). The technology is marketed as enabling even novice users to perform complex echocardiographic exams (Compl. ¶ 51). The complaint references a screenshot from GE Healthcare's marketing materials showing the user interface, which includes a list of cardiac views and a "Guidance" tool (Compl. p. 18). It also references a figure from a Caption Health patent application that diagrams a workflow including view selection, a quality meter, and real-time guidance prompts (Compl. p. 19).
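
The sketch below illustrates, in schematic form, the general shape of a real-time quality-feedback loop of the kind described above (a quality meter driving automatic capture across a predetermined sequence of views). The frame source, scoring function, and capture threshold are hypothetical placeholders, not details of the Accused Products.

```python
# Schematic sketch of a guidance workflow: score live frames against the current
# target view and auto-capture once quality clears a (hypothetical) threshold.
def guidance_loop(frames, score_quality, views, capture_threshold=0.9):
    captured = {}
    for view in views:                            # predetermined workflow of standard views
        for frame in frames(view):                # live frames while the operator scans
            quality = score_quality(frame, view)  # value driving the on-screen quality meter
            if quality >= capture_threshold:
                captured[view] = frame            # auto-capture a diagnostic-quality frame
                break
    return captured
```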

IV. Analysis of Infringement Allegations

’591 Patent Infringement Allegations

| Claim Element (from Independent Claim 1) | Alleged Infringing Functionality | Complaint Citation | Patent Citation |
|---|---|---|---|
| receive signals representing a first at least one echocardiographic image | The Accused Products acquire a series of echocardiographic images from a patient scan using a transducer. | ¶69 | col. 5:30-45 |
| associate the first at least one echocardiographic image with a first view category of a plurality of predetermined echocardiographic image view categories | The Accused Products guide operators through a sequence of ten standard diagnostic views (e.g., PLAX, Ap4) and associate the captured imagery with the selected view in the workflow. | ¶70 | col. 5:62-67 |
| determine... a first quality assessment value representing a view category specific quality assessment... | The Accused Products employ a "Quality Meter" that provides a quality score based on deep learning algorithms trained to assess images relative to the specific diagnostic view being sought. | ¶71 | col. 6:13-20 |
| each of the plurality of predetermined echocardiographic image view categories is associated with a respective set of assessment parameters... being a set of neural network parameters... | The Accused Products allegedly use neural networks trained with parameters that are specific to certain view categories to provide quality assessment values. | ¶¶74, 76 | col. 12:10-21 |

’029 Patent Infringement Allegations

| Claim Element (from Independent Claim 21) | Alleged Infringing Functionality | Complaint Citation | Patent Citation |
|---|---|---|---|
| receive signals representing a set of ultrasound images of the subject | The Accused Products acquire a series of echocardiographic images when an operator scans a patient. | ¶88 | col. 5:23-28 |
| derive one or more extracted feature representations from the set of ultrasound images | The Accused Products allegedly use neural networks (DCNNs) to extract feature representations from the ultrasound images to assess them. | ¶89 | col. 5:28-31 |
| determine, based on the derived one or more extracted feature representations, a quality assessment value... | The Accused Products use the extracted feature representations to determine a quality assessment value, which is displayed to the user as a "quality meter." | ¶91 | col. 5:34-39 |
| determine, based on the derived one or more extracted feature representations, an image property associated with the set of ultrasound images | The Accused Products allegedly use the same extracted feature representations to determine an "image property," which is the "view category" of the ultrasound images. | ¶¶92, 94 | col. 5:34-39 |
  • Identified Points of Contention:
    • ’591 Patent - Architectural Scope: The infringement analysis may focus on whether the Accused Products' AI architecture meets the claim requirement that "each" view category is associated with a "respective set of assessment parameters." This raises the question of whether a single, large neural network trained on all views can be said to contain "respective sets" of parameters, or if the claim, read in light of the specification's shared-and-separate-layer embodiment, requires more distinct or separable parameter groupings for each view.
    • ’029 Patent - Combined vs. Sequential Analysis: A potential point of contention is whether the Accused Products perform a truly combined analysis as claimed. The allegations suggest a common set of "extracted feature representations" is used for both quality and view category determination. The defense may argue that its system operates sequentially (e.g., first identifying the view, then separately assessing its quality using different criteria), which could raise questions about whether it meets the claim limitations requiring both determinations to be based on the same "derived one or more extracted feature representations."

V. Key Claim Terms for Construction

  • Term: "a respective set of assessment parameters" (’591 Patent)

    • Context and Importance: This term is central to the structure of the claimed invention. Its construction will determine whether a single, multi-task AI model infringes, or if the claim requires a more modular architecture with distinct parameter sets for each cardiac view. Practitioners may focus on this term because the patent's primary embodiment shows an explicit architecture of common "shared layers" and distinct "view-specific layers" (’591 Patent, Fig. 8).
    • Intrinsic Evidence for Interpretation:
      • Evidence for a Broader Interpretation: The plain language of the claim does not explicitly forbid the "respective sets" of parameters from residing within a single, larger neural network model.
      • Evidence for a Narrower Interpretation: The specification's detailed description of the shared-layer and separate view-specific-layer architecture as a key feature for improving efficiency could be used to argue that "respective set" implies some degree of separability beyond what a monolithic network provides (’591 Patent, col. 12:10-35).
  • Term: "deriving one or more extracted feature representations" (used for determining both quality and image property) (’029 Patent)

    • Context and Importance: This term is critical to the patent's assertion of a more efficient, combined analysis. The dispute may turn on whether the accused system uses a truly common set of underlying features for both tasks.
    • Intrinsic Evidence for Interpretation:
      • Evidence for a Broader Interpretation: The term itself is general and does not specify how the representations must be used, potentially allowing for different downstream processing for the two determination steps.
      • Evidence for a Narrower Interpretation: The specification repeatedly emphasizes the benefits of a "combined quality assessment and another image property assessment" and a "highly shared neural network" to achieve faster processing and prevent overfitting, suggesting that the use of a common feature set for both tasks is a core inventive concept (’029 Patent, col. 5:46-55).

VI. Other Allegations

  • Indirect Infringement: The complaint alleges inducement of infringement against Defendants for selling the Accused Products and providing instructions (e.g., product demos, user guidance) that allegedly cause end-users to perform the patented methods (Compl. ¶¶ 78, 97). Contributory infringement is also alleged for the ’591 patent, on the basis that the Caption Guidance software is a material component of the invention with no substantial non-infringing uses (Compl. ¶ 79).
  • Willful Infringement: Willfulness is alleged for both patents. For the ’591 patent, the claim is based on alleged actual knowledge from a notice letter sent on May 5, 2022 (Compl. ¶ 82). The complaint also alleges that Caption Health's founder was made aware of the underlying "innovations" in a May 2017 meeting with the inventors, prior to the patent's issuance (Compl. ¶ 26). For the ’029 patent, willfulness is alleged based on knowledge obtained prior to the filing of the amended complaint, though a specific date of notice is not provided (Compl. ¶ 101).

VII. Analyst’s Conclusion: Key Questions for the Case

  1. A central issue will be one of architectural scope: For the ’591 patent, can the term "a respective set of assessment parameters," described in the patent's embodiment as a shared-plus-separate layer structure, be construed to read on the specific AI architecture implemented in the Accused Products, which may be a single, multi-task neural network?
  2. A key evidentiary question will be one of operational equivalence: For the ’029 patent, does the accused software derive both image quality and view category from a common set of "extracted feature representations" in a combined manner, as the patent claims, or does it employ a sequential process that may fall outside the claim scope?
  3. The detailed pre-suit history, including allegations of a 2017 technology discussion and a 2022 notice letter with a claim chart, raises a significant question of willfulness. Should infringement of the ’591 patent be found, the timing and extent of Defendants' knowledge will be critical in determining whether damages may be enhanced.