2:25-cv-00260
Web3AI Technologies, LLC v. MicroStrategy, Incorporated
I. Executive Summary and Procedural Information
- Parties & Counsel:
- Plaintiff: Web3AI Technologies, LLC (Utah)
- Defendant: MicroStrategy, Incorporated (Delaware)
- Plaintiff’s Counsel: Kb&A; Tate Bywater Attorneys at Law
- Case Identification: 2:25-cv-00260, E.D. Va., 05/01/2025
- Venue Allegations: Plaintiff alleges venue is proper in the Eastern District of Virginia because Defendant is headquartered there, constituting a regular and established place of business, and has committed alleged acts of infringement in the District.
- Core Dispute: Plaintiff alleges that Defendant’s AI-driven business intelligence platform infringes a patent related to user interfaces for machine learning that provide real-time, interactive results.
- Technical Context: The dispute is situated in the business intelligence and data analytics software market, specifically concerning features that allow non-expert users to interact with complex machine learning models dynamically.
- Key Procedural History: The complaint alleges that Plaintiff provided Defendant with notice of the patent-in-suit and its alleged infringement via a letter on December 2, 2024, approximately five months prior to the complaint's filing.
Case Timeline
| Date | Event |
|---|---|
| 2013-05-29 | U.S. Patent No. 9,218,574 Priority Date |
| 2015-12-22 | U.S. Patent No. 9,218,574 Issues |
| 2024-09-01 | Approx. launch of Accused Products ("September 2024 release") |
| 2024-12-02 | Plaintiff sends notice letter to Defendant |
| 2025-05-01 | Complaint Filed |
II. Technology and Patent(s)-in-Suit Analysis
U.S. Patent No. 9,218,574 - “User Interface for Machine Learning”
The patent-in-suit is U.S. Patent No. 9,218,574 (the “’574 Patent”), issued December 22, 2015.
The Invention Explained
- Problem Addressed: The patent’s background section identifies that machine learning predictions are often presented as "static numbers" that are "confusing and inaccessible" to a non-expert user (e.g., a business person) (’574 Patent, col. 1:13-19). Furthermore, changing an input parameter to get an updated prediction can cause a "substantial delay," hindering interactive analysis (’574 Patent, col. 1:25-29).
- The Patented Solution: The invention claims a system that provides an interactive user interface for machine learning results, allowing users to adjust parameters and see updated outcomes "with little or no delay" (’574 Patent, col. 1:35-37). The core technical mechanism for achieving this speed is a "pre-compute module" that is configured to "predetermine permutations of the machine learning results prior to the input module receiving the user input" (’574 Patent, Abstract). By pre-calculating potential results across a range of parameter values, the system can retrieve and display an updated result instantly rather than re-running a complex model for each user adjustment. A simplified sketch of this mechanism appears after this list.
- Technical Importance: This approach sought to make complex predictive analytics more accessible and useful for business decision-making by enabling real-time "what-if" scenario exploration without requiring data science expertise or tolerating computational lag (Compl. ¶12).
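For illustration, the pre-computation mechanism described in the "Patented Solution" bullet above can be sketched in a few lines. This is a minimal sketch under stated assumptions: a single numeric parameter, a stand-in learned function, and hypothetical names (`learned_function`, `precompute_permutations`) that are not drawn from the ’574 Patent's disclosure or from any accused product.

```python
# Minimal sketch of the pre-computation concept described above, assuming a
# single numeric parameter and a stand-in learned function. All names are
# hypothetical illustrations, not taken from the '574 Patent or any product.

def learned_function(budget_increase_pct: float) -> float:
    """Stand-in for an expensive trained model (e.g., a sales forecast)."""
    return 1_000_000.0 * (1.0 + 0.4 * budget_increase_pct / 100.0)

def precompute_permutations(minimum: float, maximum: float,
                            increment: float) -> dict:
    """Evaluate the model across the parameter range *before* any user input."""
    table = {}
    value = minimum
    while value <= maximum:
        table[round(value, 2)] = learned_function(value)
        value += increment
    return table

# Pre-computation runs ahead of time...
results = precompute_permutations(minimum=0.0, maximum=20.0, increment=0.5)

# ...so each later user adjustment becomes a lookup, not a fresh model run.
print(results[10.0])   # first permutation displayed for the user's input
print(results[12.5])   # second permutation displayed when the user adjusts
```

The point of the sketch is the ordering: the loop over parameter increments runs before any user input arrives, so each subsequent adjustment resolves to a dictionary lookup rather than a model execution.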
Key Claims at a Glance
- The complaint asserts infringement of at least independent claim 1 (Compl. ¶29).
- The essential elements of Claim 1 are:
- A "predictive compiler module" configured to generate machine learning program code for learned functions to predict results based on parameters.
- An "input module" to receive user-specified values for those parameters.
- A "pre-compute module" to predetermine, in advance of user input, permutations of the machine learning results across a range of parameter values.
- A "display module" to display a first set of results from the pre-computed data corresponding to the user's input.
- An "update module" to dynamically display a second set of results from the pre-computed data when the user provides additional input.
- The complaint also references dependent claims 2, 3, 4, and 9 as being infringed (Compl. ¶24).
III. The Accused Instrumentality
Product Identification
The Accused Products include MicroStrategy's AI-powered analytics platform (MicroStrategy ONE), specifically features branded as "Auto," "Auto Answers," "Auto Narratives," and "HyperIntelligence with Auto" that were introduced around September 2024 (Compl. ¶19, ¶22).
Functionality and Market Context
The complaint alleges these features provide a conversational interface allowing users to ask analytical questions in plain English (e.g., "What will our predicted sales be if marketing budget increases by 10%?") and receive "instant, dynamically generated answers and insights" (Compl. ¶20, ¶23). To achieve this speed, the platform allegedly utilizes "in-memory caches and pre-aggregated data" through its core analytics engine, the MicroStrategy Intelligence Server (Compl. ¶23). The complaint asserts these features are marketed to enhance data analysis for business users by providing on-demand, real-time predictive analytics (Compl. ¶18, ¶23). The complaint provides no probative visual evidence of the accused functionality.
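The complaint's description of the accused architecture can likewise be pictured generically. The sketch below assumes a toy dataset and invented names; it illustrates pre-aggregation and in-memory cache lookups in the abstract and does not purport to describe how the MicroStrategy Intelligence Server actually operates.

```python
# Generic illustration of answering a query from pre-aggregated, in-memory
# data rather than scanning raw records at question time. The dataset, names,
# and aggregation scheme are invented and do not describe the accused system.

RAW_SALES = [
    {"region": "East", "amount": 120.0},
    {"region": "East", "amount": 80.0},
    {"region": "West", "amount": 200.0},
]

# Pre-aggregation: roll raw rows up into an in-memory summary ahead of time.
PRE_AGGREGATED = {}
for row in RAW_SALES:
    PRE_AGGREGATED[row["region"]] = (
        PRE_AGGREGATED.get(row["region"], 0.0) + row["amount"]
    )

def answer(question_region: str) -> float:
    """Answer 'What are sales in <region>?' from the in-memory aggregate."""
    return PRE_AGGREGATED[question_region]

print(answer("East"))   # 200.0, returned without touching the raw rows
```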
IV. Analysis of Infringement Allegations
’574 Patent Infringement Allegations
| Claim Element (from Independent Claim 1) | Alleged Infringing Functionality | Complaint Citation | Patent Citation |
|---|---|---|---|
| a predictive compiler module configured to generate machine learning comprising program code for a plurality of learned functions... | The "predictive engine (Auto's generative AI model and underlying analytics)" which is configured to produce machine learning results (predictions and insights) based on user-specified parameters. | ¶24, ¶31 | col. 35:31-39 |
| an input module configured to receive user input identifying one or more values for the one or more machine learning parameters | The user interface, such as a text prompt or filter control, that allows a user to specify a query or select a value influencing the analysis. | ¶23, ¶32 | col. 35:40-43 |
| a pre-compute module configured to predetermine, using the generated machine learning, permutations of the machine learning results... prior to the input module receiving the user input | The platform's use of "in-memory caches and pre-aggregated data (through its Intelligence Server, caching mechanisms, or semantic graph)" to prepare analytical results for potential inputs ahead of time. | ¶23, ¶33 | col. 35:44-51 |
| a display module configured to display, from the pre-compute module, a first predetermined permutation of the one or more predicted machine learning results... | The interface that displays a numerical result, chart, or narrative in response to an initial user query or parameter selection. | ¶23, ¶34 | col. 35:52-57 |
| an update module configured to dynamically display, from the pre-compute module, a second permutation of the one or more machine learning results in response to the input module receiving additional user input... | The interface swiftly recalculating or fetching and displaying a new prediction when the user refines a question or changes a filter in real time. | ¶23, ¶35 | col. 35:58-66 |
- Identified Points of Contention:
- Scope Questions: A central dispute may concern whether the accused system's alleged use of "in-memory caches and pre-aggregated data" meets the claim limitation of a "pre-compute module configured to predetermine... permutations of the machine learning results." The defense may argue that caching past results or pre-aggregating raw data for faster querying is distinct from the patent's teaching of proactively calculating results for a range of parameter values not yet requested by a user. The sketch following this list illustrates that distinction.
- Technical Questions: The complaint's allegation maps the "predictive compiler module" to the "Auto generative AI model." This raises the technical question of whether a generative AI model, which formulates answers based on learned patterns, performs the same function as a module that generates "machine learning comprising program code" as recited in the claim.
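The alleged distinction can be made concrete with a short, hypothetical contrast: a memoization-style cache holds only the parameter values a user has already requested, while the claimed pre-computation fills a table across the parameter range before any request arrives. The class and function names below are invented for illustration and do not describe either party's actual technology.

```python
# Hypothetical contrast between (a) memoizing results of queries already run
# and (b) predetermining permutations across a parameter range before any
# query. The stand-in model and all names are invented for illustration.

def learned_function(pct: float) -> float:
    return 1_000_000.0 * (1.0 + 0.4 * pct / 100.0)   # stand-in model

class ResultCache:
    """(a) Reactive caching: stores only values that were already requested."""
    def __init__(self) -> None:
        self.hits = {}
    def get(self, pct: float) -> float:
        if pct not in self.hits:                      # unseen value: the model
            self.hits[pct] = learned_function(pct)    # must run at request time
        return self.hits[pct]

def predetermine(minimum: float, maximum: float, step: float) -> dict:
    """(b) Proactive pre-computation across the full parameter range."""
    table, value = {}, minimum
    while value <= maximum:
        table[round(value, 2)] = learned_function(value)
        value += step
    return table

cache = ResultCache()
table = predetermine(0.0, 20.0, 0.5)
print(12.5 in cache.hits)   # False: nothing is ready before a request arrives
print(12.5 in table)        # True: the permutation exists prior to user input
cache.get(12.5)             # the reactive cache computes only at request time
```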
V. Key Claim Terms for Construction
The Term: "pre-compute module configured to predetermine... permutations of the machine learning results"
Context and Importance: This term is the central inventive concept for enabling real-time updates. The outcome of the case may depend on whether MicroStrategy’s caching and pre-aggregation architecture is found to fall within the scope of this limitation. Practitioners may focus on this term because it appears to be the primary technical point of divergence between the patent's specific implementation and the accused system's alleged functionality.
Intrinsic Evidence for Interpretation:
- Evidence for a Broader Interpretation: The patent’s abstract states the module may "predetermine permutations," which could be argued to encompass any method of preparing potential results in advance to reduce latency, including sophisticated caching (’574 Patent, Abstract).
- Evidence for a Narrower Interpretation: Claim 1 requires the pre-computation to occur "at one or more increments between a minimum value and a maximum value for the one or more machine learning parameters." This language may support a narrower construction requiring a systematic, forward-looking calculation across a defined parameter space, which could be distinguished from merely caching the results of previously executed queries.
The Term: "predictive compiler module"
Context and Importance: The defendant may argue that its generative AI engine is not a "compiler" in the sense used by the patent. The construction of this term will be critical to determining if the first element of Claim 1 is met.
Intrinsic Evidence for Interpretation:
- Evidence for a Broader Interpretation: The specification does not provide a rigid definition, which may allow for a functional interpretation where any component that generates predictive outcomes based on inputs could be considered a "predictive compiler module" (Compl. ¶31).
- Evidence for a Narrower Interpretation: The claim language requires the module to be configured to "generate machine learning comprising program code for a plurality of learned functions." A defendant could argue that a generative AI responding to a natural language query is not generating "program code" in the manner described, but is instead performing pattern matching and text generation (’574 Patent, col. 35:31-34).
VI. Other Allegations
- Indirect Infringement: The complaint alleges inducement under 35 U.S.C. § 271(b), asserting that MicroStrategy encourages infringement by providing marketing materials, user guides, official tutorials, and technical support that instruct customers on using the accused features (Compl. ¶26).
- Willful Infringement: The willfulness claim is based on alleged pre-suit knowledge of infringement. The complaint pleads that Plaintiff sent Defendant a notice letter identifying the ’574 Patent and the accused products on December 2, 2024, and that Defendant’s infringement continued thereafter (Compl. ¶14, ¶27).
VII. Analyst’s Conclusion: Key Questions for the Case
- A core issue will be one of technical and definitional scope: Can Plaintiff prove that the accused platform's use of caching and pre-aggregation via its "Intelligence Server" is the same as, or equivalent to, the claimed "pre-compute module configured to predetermine... permutations of the machine learning results"? The resolution of this question through claim construction and factual findings will be central to the infringement analysis.
- A second key evidentiary question will be one of functional equivalence: Does the accused "Auto generative AI model" perform the function of a "predictive compiler module" that generates "program code for a plurality of learned functions," as required by Claim 1, or is there a fundamental mismatch in their technical operation that places the accused system outside the claim's scope?