PTAB

IPR2023-01046

Microsoft Corp. v. RealD Spark LLC

1. Case Identification

  • Proceeding: Inter Partes Review IPR2023-01046 before the Patent Trial and Appeal Board (PTAB)
  • Petitioner: Microsoft Corp.
  • Patent Owner: RealD Spark LLC
  • Challenged Patent: U.S. Patent No. 10,740,985 (the ’985 Patent)

2. Patent Overview

  • Title: Generating Reference Data for Adjusting Digital Representations of a Head Region
  • Brief Description: The ’985 Patent relates to methods for correcting eye gaze in digital images. The core claimed invention involves a machine-learning training process that uses a first machine-learning algorithm (MLA) to generate initial reference data, which is then used along with the original training data to train a second MLA.

3. Grounds for Unpatentability

Ground 1: Anticipation by Ganin - Claims 1-3, 5, 10, 12-14, and 18 are anticipated by Ganin under 35 U.S.C. §102.

  • Prior Art Relied Upon: Ganin, “DeepWarp: Photorealistic Image Resynthesis for Gaze Manipulation” (a 2016 conference publication).
  • Core Argument for this Ground:
    • Prior Art Mapping: Petitioner argued that Ganin discloses all limitations of the independent claims. Ganin teaches a system for gaze correction that predicts a "warping field" using a deep convolutional neural network. This system is implemented in two stages: a "coarse warping" module (the first MLA) and a "fine warping" module (the second MLA). Ganin's training process uses pairs of images with different gaze directions (input and output patches). The fine warping module (second MLA) receives as input the original image data as well as the output of the coarse warping module (the "coarse warping field," which constitutes the "first reference data"), thereby teaching the claimed two-MLA training architecture.
    • Key Aspects: Petitioner asserted that Ganin’s coarse and fine warping modules are two distinct MLAs that directly map to the claimed first and second MLAs, and that the "warping fields" they generate constitute the claimed "editing instructions."
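For illustration only, since neither the petition nor Ganin is quoted with code: the two-stage pipeline described above (a coarse module that predicts a warping field, and a fine module that receives both the input image and the coarse field) can be sketched with toy NumPy stand-ins. The warp, coarse_module, and fine_module functions below are hypothetical simplifications invented for this sketch, not Ganin's actual networks.

```python
import numpy as np

def warp(image, field):
    """Resample an image with a per-pixel displacement field (nearest neighbour)."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(np.round(ys + field[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs + field[..., 1]).astype(int), 0, w - 1)
    return image[src_y, src_x]

def coarse_module(image):
    """Hypothetical stand-in for the coarse warping network: a uniform shift."""
    field = np.zeros(image.shape + (2,))
    field[..., 1] = 1.0  # move every pixel one column
    return field

def fine_module(image, coarse_field):
    """Hypothetical stand-in for the fine warping network. Like the fine
    module described above, it takes BOTH the image and the coarse field
    (the 'first reference data') and emits a correction to the coarse
    estimate."""
    return -0.25 * coarse_field

image = np.arange(16.0).reshape(4, 4)
coarse = coarse_module(image)
fine = fine_module(image, coarse)
adjusted = warp(image, coarse + fine)  # final gaze-adjusted output
```

The point of the sketch is the data flow, not the arithmetic: the second stage consumes the first stage's output alongside the original input, which is the two-MLA structure Petitioner maps onto the claims.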

Ground 2: Obviousness over Ganin and Liu - Claims 16 and 19 are obvious over Ganin in view of Liu.

  • Prior Art Relied Upon: Ganin (a 2016 conference publication) and Liu, “Optical Flow and Principal Component Analysis-Based Motion Detection in Outdoor Videos” (a 2010 journal article).
  • Core Argument for this Ground:
    • Prior Art Mapping: This ground addresses dependent claims requiring that the editing instructions be provided in a "compressed representation." Petitioner asserted that Ganin teaches all elements of the claims except for this compression. Liu was introduced for its teaching of applying principal component analysis (PCA), a well-known data compression technique, to optical flow data.
    • Motivation to Combine (for §103 grounds): A person of ordinary skill in the art (POSITA) would combine Liu with Ganin to improve system efficiency. Ganin’s warping fields are a type of optical flow, and Liu explicitly teaches using PCA to compress such data to reduce its size. Ganin itself encouraged improvements like speed optimization, which would motivate a POSITA to look to analogous art like Liu for known compression techniques.
    • Expectation of Success (for §103 grounds): A POSITA would have a high expectation of success because applying PCA to motion-related data like optical flows was a standard and widely used technique in image processing to achieve data compression.
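As a purely hypothetical sketch of the technique Liu is cited for (PCA used to compress flow-like data), the NumPy example below compresses synthetic stand-in fields; the data, dimensions, and component count are invented for illustration and do not come from Liu or the petition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for warping/optical-flow fields: 100 fields, each flattened
# to a 128-dimensional vector. Real fields would come from a gaze model;
# these are synthetic low-rank data so the compression is near-lossless.
fields = rng.normal(size=(100, 5)) @ rng.normal(size=(5, 128))

# PCA via SVD of the mean-centred data.
mean = fields.mean(axis=0)
centred = fields - mean
_, _, vt = np.linalg.svd(centred, full_matrices=False)

k = 10                      # keep only the top-k principal components
basis = vt[:k]              # (k, 128) shared basis, stored once
codes = centred @ basis.T   # (100, k) coefficients: the compressed fields

# Decompression: each field is rebuilt from k numbers instead of 128.
reconstructed = codes @ basis + mean
```

Storing k coefficients per field plus one shared basis is the "compressed representation" idea the dependent claims recite; how faithful the reconstruction is depends on how much variance the kept components capture.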

Ground 3: Obviousness over Chalom and Aslan - Claims 1-3, 5-6, 8-14, and 16-19 are obvious over Chalom in view of Aslan.

  • Prior Art Relied Upon: Chalom (U.S. Patent No. 10,664,949) and Aslan (U.S. Patent Application Pub. No. 2017/0132528).
  • Core Argument for this Ground:
    • Prior Art Mapping: Chalom was cited for teaching a system that uses a single MLA (a Random Forest Classifier, or RFC) to perform eye-contact correction by generating motion vector fields. Aslan was cited for teaching a "teacher-student" training architecture, where a large, complex "teacher" model (the first MLA) is used to train a smaller, more efficient "student" model (the second MLA). The combination of Chalom's system with Aslan's architecture would result in the claimed two-MLA training process.
    • Motivation to Combine (for §103 grounds): A POSITA would be motivated to combine these references to improve the computational efficiency of Chalom's system. Aslan addresses the problem of large models (like Chalom's RFC) requiring significant resources, and it proposes the teacher-student framework as a solution for "model compression." Chalom itself suggests its model could be compressed, motivating a search for techniques like those in Aslan.
    • Expectation of Success (for §103 grounds): The combination would have been straightforward. A POSITA could use Chalom’s RFC as the "teacher" model to train a smaller, more compact "student" RFC, following the explicit instructions provided in Aslan to achieve efficiency gains while maintaining accuracy.
  • Additional Grounds: Petitioner asserted additional obviousness challenges, including combining Chalom and Aslan with Grau (U.S. Patent Application Pub. No. 2018/0158246) to teach editing instructions such as a "brightness adjustment field" (Ground 4), and with Pace (U.S. Patent No. 8,942,283) to teach wavelet transformation as an alternative compression technique (Ground 5).
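Returning to the teacher-student architecture at the heart of Ground 3: as a purely illustrative sketch (Aslan's framework as described involves random forests; both models below are invented stand-ins, not anything from Chalom or Aslan), distillation can be shown as a small "student" fit to a larger "teacher's" outputs rather than to ground-truth labels.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_teacher():
    """Hypothetical 'teacher': a large random-feature model (768 parameters).
    In the petition's framing, this role is played by Chalom's random
    forest classifier."""
    w_in = rng.normal(size=(2, 256))
    w_out = rng.normal(size=256) / 16.0
    return lambda x: np.tanh(x @ w_in) @ w_out

teacher = make_teacher()

# Distillation: the student is trained on the teacher's outputs
# ("soft labels"), not on ground-truth data.
x = rng.normal(size=(500, 2))
soft_labels = teacher(x)

# The 'student' is far smaller: a 3-parameter linear model fit by least squares.
x_aug = np.hstack([x, np.ones((500, 1))])  # append a bias column
student_w, *_ = np.linalg.lstsq(x_aug, soft_labels, rcond=None)

def student(x):
    return np.hstack([x, np.ones((len(x), 1))]) @ student_w

# Mean squared gap between student and teacher on held-out inputs.
x_test = rng.normal(size=(50, 2))
gap = float(np.mean((student(x_test) - teacher(x_test)) ** 2))
```

The efficiency argument in the petition maps onto the parameter counts here: the student replaces a 768-parameter model with a 3-parameter one, trading some fidelity (the gap) for a much smaller footprint.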

4. Arguments Regarding Discretionary Denial

  • Petitioner argued that discretionary denial would be inappropriate under both §325(d) and §314(a).
  • §325(d) (Same or Substantially the Same Art or Arguments): Petitioner contended that while Ganin was cited during prosecution, the Patent Owner led the Examiner to overlook its relevance. Furthermore, the primary obviousness combination of Chalom and Aslan was never raised or considered during prosecution.
  • §314(a) (Fintiv Factors): Petitioner asserted that denial based on the parallel district court litigation is unwarranted because it filed a Sotera stipulation, agreeing not to pursue in that litigation any invalidity ground that was raised or reasonably could have been raised in the inter partes review (IPR).

5. Relief Requested

  • Petitioner requests institution of an IPR and cancellation of claims 1-14 and 16-19 of U.S. Patent No. 10,740,985 as unpatentable.