DCT

1:25-cv-00757

Web3ai Tech LLC v. MicroStrategy Inc

Key Events
Complaint

I. Executive Summary and Procedural Information

  • Parties: Plaintiff Web3ai Tech LLC; Defendant MicroStrategy Inc
  • Case Identification: 1:25-cv-00757, E.D. Va., 05/01/2025
  • Venue Allegations: Venue is alleged to be proper in the Eastern District of Virginia because Defendant maintains its corporate headquarters and a regular and established place of business within the district.
  • Core Dispute: Plaintiff alleges that Defendant’s AI-driven business intelligence platform infringes a patent related to a user interface for dynamically interacting with machine learning models.
  • Technical Context: The technology concerns interactive interfaces for machine learning systems, a field of significant importance for making complex data analytics and business intelligence accessible to non-expert users.
  • Key Procedural History: The complaint alleges that Plaintiff provided Defendant with pre-suit notice of infringement via a letter sent on December 2, 2024, a fact that may be relevant to the allegation of willful infringement.

Case Timeline

Date Event
2013-05-29 U.S. Patent No. 9,218,574 Priority Date
2015-12-22 U.S. Patent No. 9,218,574 Issue Date
2024-09-01 Approximate launch of Accused Products ("September 2024 release")
2024-12-02 Plaintiff sends pre-suit notice letter to Defendant
2025-05-01 Complaint Filing Date

II. Technology and Patent(s)-in-Suit Analysis

U.S. Patent No. 9,218,574 - "User Interface for Machine Learning"

  • Patent Identification: U.S. Patent No. 9,218,574 (“’574 Patent”), issued December 22, 2015.

The Invention Explained

  • Problem Addressed: According to the patent's background section, machine learning predictions are often presented as static numbers that are difficult for non-experts to interpret. Furthermore, if a user wishes to change an input parameter to see a new prediction, a "substantial delay will occur" while the new prediction is calculated (’574 Patent, col. 1:11-31).
  • The Patented Solution: The invention is a system and method for a dynamic user interface that allows a user to adjust a machine learning parameter and see updated results with little or no delay. The key to this responsiveness is a "pre-compute module" that predetermines and caches "permutations of the machine learning results" across a range of potential parameter values before the user provides input, making the results immediately available for display (’574 Patent, Abstract; col. 2:6-14). A minimal illustrative sketch of this pre-computation pattern appears after this list.
  • Technical Importance: This approach aims to bridge the gap between complex backend machine learning algorithms and end-users who need fast, intuitive access to predictive insights for "what-if" analysis (Compl. ¶12).
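To make the described architecture concrete, the following is a minimal, hypothetical Python sketch of the pre-computation pattern the specification describes: results are predetermined at predefined increments across a parameter range before any user input, then served from a cache on display and update. The toy model, the parameter name, and the increment scheme are assumptions for illustration only, not the patent's embodiment or any party's code.

```python
# Illustrative sketch only: results are predetermined at predefined increments
# across a parameter range *before* any user input arrives, then served from a
# cache, in the manner the '574 Patent describes for its pre-compute module.
# The toy model and parameter names are hypothetical.

def train_model():
    # Stand-in for the output of the "predictive compiler module": a learned
    # function mapping a parameter value to a predicted result.
    return lambda discount_rate: 1_000_000 * (1.0 + 2.5 * discount_rate)


def precompute(model, minimum, maximum, increment):
    # Pre-compute module: predetermine permutations of results at predefined
    # increments between the minimum and maximum parameter values.
    steps = int(round((maximum - minimum) / increment))
    return {
        round(minimum + i * increment, 4): model(minimum + i * increment)
        for i in range(steps + 1)
    }


def display(cache, user_value):
    # Display/update modules: after user input arrives, results are looked up
    # rather than computed, so they appear with little or no delay.
    return cache[round(user_value, 4)]


model = train_model()
cache = precompute(model, minimum=0.0, maximum=0.5, increment=0.05)

print(display(cache, 0.10))   # first permutation shown for the user's value
print(display(cache, 0.15))   # dynamic update on additional user input
```

On this reading, every displayable result exists before the first user query reaches the input module; the construction question discussed in Sections IV and V is whether anything short of that satisfies the "prior to... user input" limitation.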

Key Claims at a Glance

  • The complaint asserts infringement of at least independent claim 1 and reserves the right to assert additional claims, including dependent claims 2, 3, 4, and 9 (Compl. ¶24, ¶29).
  • The essential elements of independent claim 1 include the following (a hypothetical module skeleton appears after this list):
    • A "predictive compiler module" that generates machine learning program code.
    • An "input module" to receive a user-specified parameter value.
    • A "pre-compute module" to predetermine permutations of machine learning results across a range of values "prior to the input module receiving the user input."
    • A "display module" to show a first result from the pre-computed data.
    • An "update module" to dynamically show a second result in response to additional user input.

III. The Accused Instrumentality

Product Identification

  • The "Accused Products" are identified as MicroStrategy's AI-powered analytics platform, including features branded as "Auto," "Auto Answers," "Auto Narratives," and "HyperIntelligence with Auto," which are part of the MicroStrategy ONE platform (Compl. ¶22).

Functionality and Market Context

  • The Accused Products provide a conversational interface allowing users to ask plain-English questions about their data and receive "instant" and "dynamically generated" answers (Compl. ¶20). This functionality is integrated into MicroStrategy's core business intelligence and analytics software and is designed to allow users to interactively explore data (Compl. ¶18, ¶23).
  • The complaint alleges that the platform achieves its real-time responsiveness through mechanisms such as "in-memory caches and pre-aggregated data (through its Intelligence Server, caching mechanisms, or semantic graph)" (Compl. ¶23). A generic query-time caching pattern is sketched after this list for comparison with the patented pre-computation.
  • The complaint does not provide probative visual evidence of the accused functionality.
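For contrast with the pre-computation sketch in Section II, the following is a generic, hypothetical illustration of query-time ("reactive") caching, one plausible reading of the complaint's references to in-memory caches and pre-aggregated data. It is not MicroStrategy's implementation; it is included only to show why the timing of computation relative to user input matters to the infringement theory.

```python
import functools

# Hypothetical illustration of query-time caching: nothing is computed for a
# given parameter value until a user first asks for it; repeat requests are
# then served "instantly" from the cache. This is NOT the accused platform's
# code, only a generic pattern for comparison with true pre-computation.

def model(discount_rate: float) -> float:
    # Stand-in predictive function.
    return 1_000_000 * (1.0 + 2.5 * discount_rate)


@functools.lru_cache(maxsize=None)
def answer_query(discount_rate: float) -> float:
    # The first request for a value computes and caches the result *after*
    # user input arrives; later requests hit the cache.
    return model(discount_rate)


print(answer_query(0.10))  # computed on first request, then cached
print(answer_query(0.10))  # served from cache on repeat request
```

Under this pattern, the first result for a given parameter value is computed only after the user's query arrives, which is the behavior the scope question in Section IV asks whether the claim can reach.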

IV. Analysis of Infringement Allegations

’574 Patent Infringement Allegations

The complaint's claim chart pairs each element of independent claim 1 with the accused functionality, with citations to the complaint and the patent:

  • Claim element: "a predictive compiler module configured to generate machine learning comprising program code for a plurality of learned functions." Alleged infringing functionality: MicroStrategy's "AI engine," including its "Auto" generative AI model, is alleged to be a predictive engine that produces machine learning results and insights based on user parameters (Compl. ¶24, ¶31; ’574 Patent, col. 36:31-36).
  • Claim element: "an input module configured to receive user input identifying one or more values for the one or more machine learning parameters." Alleged infringing functionality: The user interface of the Accused Products, such as a text prompt or dashboard filter, accepts user queries or parameter selections (e.g., "What's the projected sales for the next quarter?") (Compl. ¶23, ¶32; ’574 Patent, col. 36:37-40).
  • Claim element: "a pre-compute module configured to predetermine, using the generated machine learning, permutations of the machine learning results... prior to the input module receiving the user input." Alleged infringing functionality: The platform's use of caching, in-memory pre-aggregated data, and a semantic graph is alleged to be functionally equivalent to pre-computing results to ensure they can be retrieved and displayed "immediately" (Compl. ¶23, ¶33; ’574 Patent, col. 36:41-48).
  • Claim element: "a display module configured to display, from the pre-compute module, a first predetermined permutation of the one or more predicted machine learning results..." Alleged infringing functionality: The platform's interface displays an initial result, such as a forecast chart or natural language summary, corresponding to the user's initial parameter selection (Compl. ¶23, ¶34; ’574 Patent, col. 36:49-54).
  • Claim element: "an update module configured to dynamically display... a second permutation of the one or more machine learning results in response to the input module receiving additional user input..." Alleged infringing functionality: When a user refines a query or changes a filter, the interface refreshes the display to show the new results in "essentially real time" without requiring a full recalculation (Compl. ¶23, ¶35; ’574 Patent, col. 36:55-63).

Identified Points of Contention

  • Scope Questions: A central dispute may arise over the meaning of predetermining results "prior to the input module receiving the user input." The complaint alleges that the accused platform's use of caching and on-the-fly querying is "functionally equivalent" to the claimed pre-computation (Compl. ¶23). This raises the question of whether the claim requires a systematic, offline pre-calculation of a result space before a user session begins, or if it can be construed to cover near-instantaneous computation or retrieval from a cache that occurs after an initial user query is made.
  • Technical Questions: The complaint equates the claimed "predictive compiler module" with Defendant's "generative AI model" (Compl. ¶24). This raises a technical question for the court: does a generative AI model that formulates answers to user queries perform the same function as the patent's "compiler module," which is described as generating "program code for a plurality of learned functions" (’574 Patent, col. 36:32-34)?

V. Key Claim Terms for Construction

The Term: "pre-compute module configured to predetermine... permutations of the machine learning results... prior to the input module receiving the user input"

  • Context and Importance: This limitation is the patent's proposed solution to the latency problem in interactive analytics. The construction of this term, particularly the temporal requirement of "prior to... user input," will be critical to the infringement analysis. Practitioners may focus on this term because it appears to be the primary point of mismatch between the claim language and the alleged functionality of the accused product's caching architecture.
  • Intrinsic Evidence for Interpretation:
    • Evidence for a Broader Interpretation: The specification's objective is to overcome "substantial delay" and provide results "with little or no delay" (’574 Patent, col. 1:28-31; col. 2:40-41). A party could argue that any technical means that achieves this goal, such as advanced caching that feels instantaneous to a user, falls within the spirit of the invention.
    • Evidence for a Narrower Interpretation: The claim language "prior to the input module receiving the user input" and the specification's disclosure of pre-computing results at "predefined increments between minimum... and maximum values" (’574 Patent, col. 18:16-20) suggest a structured, exhaustive pre-calculation performed before a user session, not a reactive caching of a specific query result. Flowchart FIG. 10 explicitly shows determining machine learning results (1004) as a preparatory step before the user interaction flow begins.

The Term: "predictive compiler module"

  • Context and Importance: This is a specific, defined component of the claimed apparatus. Its construction is important because the complaint alleges that a modern "generative AI model" meets this limitation.
  • Intrinsic Evidence for Interpretation:
    • Evidence for a Broader Interpretation: The patent describes the module as generating "machine learning... with program code for a plurality of learned functions... to predict machine learning results" (’574 Patent, col. 19:11-21). An argument could be made that this is a functional description that could encompass a generative AI system that, in effect, synthesizes an executable plan to generate a predictive result.
    • Evidence for a Narrower Interpretation: The term "compiler" has a specific meaning in computer science. The patent's detailed description focuses on generating and combining discrete machine learning models like decision trees and SVMs (’574 Patent, FIG. 5), which a party could argue is structurally and functionally distinct from a large language model that generates narrative text.

VI. Other Allegations

Indirect Infringement

  • The complaint alleges induced infringement, stating that MicroStrategy encourages its customers' direct infringement by providing marketing materials, user guides, and tutorials (e.g., "Using Auto Answers") that instruct on the use of the accused features (Compl. ¶26).

Willful Infringement

  • The willfulness allegation is based on alleged pre-suit knowledge. The complaint asserts that MicroStrategy has had knowledge of the ’574 Patent and its alleged infringement since at least December 2, 2024, when it received a notice letter from Plaintiff, and that its continued infringement thereafter has been willful (Compl. ¶17, ¶27).

VII. Analyst’s Conclusion: Key Questions for the Case

  • A core issue will be one of temporal scope: Can the claim limitation requiring pre-computation of results "prior to... user input" be construed to cover the accused system's alleged use of real-time caching and pre-aggregation, which may occur nearly instantaneously but after an initial user query is received?
  • A second central question will be one of technological equivalence: Does the accused "generative AI model" function as a "predictive compiler module" as that term is used and described in the patent, or is there a fundamental operational difference between a modern generative AI and the patent's system of compiling ensembles of discrete machine learning functions?
  • Finally, an evidentiary question will be what proof Plaintiff can offer that the accused system actually performs the claimed "pre-computation," as the complaint's allegations appear to be based on the system's external behavior (i.e., its "instantaneous response") and its generally known architecture rather than direct evidence of its internal operations.