Fractal Networks LLC v. Velocix Solutions Ltd
Case No. 2:25-cv-00103 (E.D. Tex.)
I. Executive Summary and Procedural Information
- Parties & Counsel:
- Plaintiff: Fractal Networks LLC (NM)
- Defendant: Velocix Solutions Limited (UK)
- Plaintiff’s Counsel: Rabicoff Law LLC
- Case Identification: 2:25-cv-00103, E.D. Tex., 02/01/2025
- Venue Allegations: Venue is asserted based on the defendant having an established place of business within the Eastern District of Texas.
- Core Dispute: Plaintiff alleges that Defendant’s unspecified cellular system products infringe a patent related to 5G network architecture that utilizes edge processing for low-latency computation.
- Technical Context: The lawsuit concerns the field of 5G network infrastructure, specifically the architectural design choice of distributing computational tasks between the network edge and a central cloud to improve performance for latency-sensitive applications.
- Key Procedural History: The complaint does not mention any prior litigation, inter partes review (IPR) proceedings, or licensing history related to the patent-in-suit.
Case Timeline
| Date | Event |
|---|---|
| 2019-09-02 | '399 Patent Priority Date |
| 2020-06-23 | '399 Patent Issue Date |
| 2025-02-01 | Complaint Filing Date |
II. Technology and Patent(s)-in-Suit Analysis
U.S. Patent No. 10,694,399 - "Cellular system", issued June 23, 2020 (’399 Patent)
The Invention Explained
- Problem Addressed: The patent's background section identifies a challenge in 5G network deployment: the need for a "huge number of 5G towers" to provide high-speed coverage, which can be visually intrusive ("eyesores nearly everywhere") and require a direct line of sight for small cells to function effectively (’399 Patent, col. 1:31-44).
- The Patented Solution: The invention proposes a cellular system that integrates 5G components, such as steerable antennas, into existing public infrastructure like light poles or buildings. A key component is an "edge processing module" coupled to the antennas to provide "low-latency computation" for a target device, offloading processing from more distant and higher-latency core or cloud data centers (’399 Patent, col. 1:52-59; Fig. 2A). This architecture is intended to enable dense 5G deployment in a less obtrusive manner while supporting latency-sensitive applications (a minimal sketch of this edge/cloud workload split appears after this list).
- Technical Importance: The described approach addresses the dual 5G objectives of expanding network density for ubiquitous coverage and reducing communication delays, which is critical for emerging technologies such as autonomous vehicles and the Internet of Things (IoT) (’399 Patent, col. 2:25-34).
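To make the claimed architecture concrete, the following minimal Python sketch illustrates one way an edge module could share a workload with a higher-latency core/cloud module. The class names, latency figures, and routing policy are illustrative assumptions only; they are not drawn from the patent, the complaint, or any accused product.

```python
"""Illustrative sketch of an edge/core workload split (assumed names and values)."""
from dataclasses import dataclass


@dataclass
class Task:
    name: str
    deadline_ms: float  # how quickly the target device needs a result


class EdgeProcessingModule:
    """Co-located with the steerable antennas; low round-trip latency."""
    latency_ms = 5.0

    def compute(self, task: Task) -> str:
        return f"edge handled {task.name} (~{self.latency_ms} ms)"


class CoreCloudModule:
    """Head-end or cloud data center; increased latency."""
    latency_ms = 60.0

    def compute(self, task: Task) -> str:
        return f"cloud handled {task.name} (~{self.latency_ms} ms)"


def share_workload(task: Task, edge: EdgeProcessingModule, core: CoreCloudModule) -> str:
    # Latency-sensitive tasks stay at the edge; everything else is offloaded
    # to the higher-latency core/cloud tier (illustrative policy only).
    if task.deadline_ms < core.latency_ms:
        return edge.compute(task)
    return core.compute(task)


if __name__ == "__main__":
    edge, core = EdgeProcessingModule(), CoreCloudModule()
    print(share_workload(Task("collision-avoidance", deadline_ms=20), edge, core))
    print(share_workload(Task("nightly-analytics", deadline_ms=10_000), edge, core))
```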
Key Claims at a Glance
- The complaint does not specify which claims are asserted but reserves the right to assert any of the patent’s claims (Compl. ¶11). The patent’s independent claims include:
- Independent Claim 1:
- A 5G cellular transceiver to communicate with a predetermined target.
- One or more steerable antennas coupled to the transceiver.
- A processor to control the directionality of the antennas.
- An "edge processing module" that provides "low-latency computation" and "shares workload with a core processing module" at a head-end or cloud data center, where the core/cloud module has "increased latency."
- Independent Claim 16:
- A 5G cellular transceiver and one or more steerable antennas.
- A processor to control antenna directionality.
- An "edge processing module" for low-latency computation, wherein the processor "calibrates a connection by analyzing RSSI and TSSI" and moves the antennas until "predetermined cellular parameters are reached."
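As a rough illustration of the Claim 16 limitation, the sketch below shows one way a calibration loop of this kind could operate: measure RSSI and TSSI, steer the antenna, and repeat until predetermined targets are met. The measurement stubs, thresholds, and step size are hypothetical placeholders and do not reflect the patent's embodiments or any accused product.

```python
"""Hypothetical calibration loop: analyze RSSI/TSSI, steer until targets are met."""
import random

RSSI_TARGET_DBM = -85.0   # hypothetical "predetermined cellular parameter"
TSSI_TARGET_DBM = -10.0   # hypothetical transmit-signal target
MAX_ITERATIONS = 50


def measure_rssi(azimuth_deg: float) -> float:
    # Stand-in for a real received-signal-strength measurement.
    return -110.0 + 0.4 * azimuth_deg + random.uniform(-1.0, 1.0)


def measure_tssi(azimuth_deg: float) -> float:
    # Stand-in for a real transmit-signal-strength measurement.
    return -30.0 + 0.3 * azimuth_deg + random.uniform(-0.5, 0.5)


def calibrate_connection(start_azimuth_deg: float = 0.0, step_deg: float = 2.0) -> float:
    """Steer the antenna until both targets are met or the iteration cap is hit."""
    azimuth = start_azimuth_deg
    for _ in range(MAX_ITERATIONS):
        rssi, tssi = measure_rssi(azimuth), measure_tssi(azimuth)
        if rssi >= RSSI_TARGET_DBM and tssi >= TSSI_TARGET_DBM:
            return azimuth          # predetermined parameters reached
        azimuth += step_deg         # "move" the steerable antenna
    raise RuntimeError("calibration did not converge within the iteration cap")


if __name__ == "__main__":
    print(f"calibrated azimuth: {calibrate_connection():.1f} degrees")
```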
III. The Accused Instrumentality
Product Identification
- The complaint refers to "Exemplary Defendant Products" but does not name or describe them, identifying them only by reference to an "Exhibit 2" that was not filed with the public version of the complaint (Compl. ¶11, ¶16).
Functionality and Market Context
- The complaint does not provide sufficient detail for analysis of the accused instrumentality's functionality or market context.
IV. Analysis of Infringement Allegations
The complaint incorporates infringement allegations by reference to claim charts in Exhibit 2, which is not publicly available (Compl. ¶16, ¶17). The complaint's narrative text does not provide sufficient detail to create a claim chart or otherwise analyze the specific infringement theories.
No probative visual evidence provided in complaint.
V. Key Claim Terms for Construction
The Term: "edge processing module"
- Context and Importance: This term is central to the patent's claimed point of novelty. The dispute will likely focus on whether the accused architecture includes a component that meets this definition, distinguishing it from a monolithic cloud-based or device-based processing system.
- Intrinsic Evidence for Interpretation:
- Evidence for a Broader Interpretation: The specification describes the module in broad functional terms, stating it "comprises at least a processor, a graphical processing unit (GPU), a neural network, a statistical engine, or a programmable logic device (PLD)" (’399 Patent, col. 2:1-3).
- Evidence for a Narrower Interpretation: Claim 1 defines the term structurally and relationally. The "edge processing module" is distinguished from a "core processing module" or "cloud module" by its ability to provide "low-latency computation" versus the "increased latency" of the core/cloud modules (’399 Patent, cl. 1). Figure 2J visually depicts this architectural separation between "ACCESS/EDGE" computing and "CORE" or "CLOUD/DATA CENTER" processing, which may support a narrower construction requiring a distinct, lower-latency processing tier (’399 Patent, Fig. 2J).
The Term: "low-latency computation"
- Context and Importance: This functional limitation defines the purpose of the "edge processing module." The meaning of "low-latency" will be critical for determining infringement.
- Intrinsic Evidence for Interpretation:
- Evidence for a Broader Interpretation: The patent does not provide a specific numerical value for "low-latency," suggesting it is a relative term to be understood by a person of ordinary skill in the art.
- Evidence for a Narrower Interpretation: Claim 1 provides a direct comparison, contrasting the "low-latency computation" of the edge module with the "increased latency" of the core and cloud modules (’399 Patent, cl. 1). This suggests the term should be construed relatively, meaning a latency that is demonstrably lower than that of the system's central processing resources for the same task.
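A minimal sketch of this relative reading, assuming hypothetical edge and cloud endpoints: the same task is timed on both tiers, and the edge computation qualifies as "low-latency" only if its measured latency is demonstrably lower than the core/cloud latency. The functions and simulated delays below are illustrative, not evidence about any accused system.

```python
"""Hypothetical comparison of edge vs. cloud latency for the same task."""
import time


def run_on_edge(task: str) -> str:       # stand-in for an edge processing module
    time.sleep(0.005)                    # simulated ~5 ms processing + backhaul
    return f"edge:{task}"


def run_on_cloud(task: str) -> str:      # stand-in for a core/cloud module
    time.sleep(0.050)                    # simulated ~50 ms processing + backhaul
    return f"cloud:{task}"


def measured_latency(fn, task: str) -> float:
    start = time.perf_counter()
    fn(task)
    return time.perf_counter() - start


if __name__ == "__main__":
    edge_t = measured_latency(run_on_edge, "object-detection")
    cloud_t = measured_latency(run_on_cloud, "object-detection")
    # Under the relative construction, the edge module's latency must be
    # demonstrably lower than the core/cloud latency for the same workload.
    print(f"edge={edge_t * 1e3:.1f} ms, cloud={cloud_t * 1e3:.1f} ms, "
          f"edge is lower: {edge_t < cloud_t}")
```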
VI. Other Allegations
- Indirect Infringement: The complaint alleges inducement of infringement, stating that the defendant distributes "product literature and website materials" that instruct end users on how to use the accused products in a manner that infringes the ’399 Patent (Compl. ¶14).
- Willful Infringement: The complaint alleges willfulness based on knowledge of the patent obtained from the service of the complaint itself, indicating a claim for post-suit willful infringement (Compl. ¶13).
VII. Analyst’s Conclusion: Key Questions for the Case
- An Architectural Question: A central issue will be whether the accused products embody the specific distributed computing architecture of Claim 1. The case will likely depend on evidence demonstrating a distinct "edge processing module" that performs "low-latency computation" and shares a workload with a separate, higher-latency "core processing module" located in a cloud or head-end.
- A Functional Question: For infringement of claims like Claim 16, a key evidentiary question will be one of specific functionality. Can the plaintiff demonstrate that the accused products perform the claimed calibration loop of "analyzing RSSI and TSSI" and physically moving antennas until "predetermined cellular parameters are reached"?
- An Evidentiary Question: As the complaint lacks specific details regarding the accused products, a foundational question for the litigation will be what specific hardware and software architectures the plaintiff identifies in its forthcoming infringement contentions and how those systems map to the limitations of the asserted claims.