
Virtual Solutions, LLC v. Microsoft Corp.

United States District Court, S.D. New York

February 15, 2013

VIRTUAL SOLUTIONS, LLC, Plaintiff,
v.
MICROSOFT CORPORATION, Defendant.

Memorandum Denying Reconsideration March 7, 2013.


Timothy E. Grochocinski, Esq., Innovalaw, P.C., Orland Park, IL, Anthony G. Simon, Esq., Michael P. Kella, Esq., The Simon Law Firm, P.C., Saint Louis, MO, Harold Y. MacCartney, Jr., Esq., MacCartney, MacCartney, Kerrigan & MacCartney, Nyack, NY, for Virtual Solutions, LLC.

Ruffin B. Cordell, Esq., Lauren A. Degnan, Esq., Cherylyn Esoy Mizzo, Esq., Robert Courtney, Esq., Fish & Richardson P.C., Washington, DC, Jonathan A. Marshall, Esq., Leah A. Edelman, Esq., Fish & Richardson P.C., New York, NY, for Microsoft Corp.

OPINION AND ORDER

SHIRA A. SCHEINDLIN, District Judge.

I. INTRODUCTION[1]

Virtual Solutions, LLC ("Virtual") brings this action against Microsoft Corporation ("Microsoft"). Virtual claims that Microsoft has infringed claims 1-3, 5, 7, 8-9, and 22 of U.S. Patent No. 6,507,353 ("the '353 Patent"), of which Virtual is the exclusive licensee. Microsoft now moves for summary judgment on the grounds that claims 1 and 8 of the '353 Patent are invalid for indefiniteness. Also before the Court are the parties' submissions related to the Markman hearing that took place on January 22, 2013. At that hearing, the Court also heard oral argument on the present motion. For the following reasons, the motion is granted.

II. BACKGROUND[2]

A. The Patent

The '353 Patent was filed on December 10, 1999, and was issued by the Patent and Trademark Office ("PTO") on January 14, 2003.[3] It is entitled "Influencing Virtual Actors in an Interactive Environment."[4] The patent claims "[a] method for generating a behavior vector for a virtual actor in an interactive theat[er] by interpreting stimuli from visitors ...."[5] The specification states that the object of the patent is "to provide a method for interacting with virtual actors in an interactive environment[,]" and "to simulate 'real-life' behaviors[] of the virtual actors."[6]

The specification teaches that "[c]ombining the security of a controlled environment with the possibility to interact almost naturally with animal kind has always been a dream[,]"[7] and that viewing animals in captivity does not live up to this dream, because animals in captivity alter their behaviors.[8] The specification further teaches that, prior to the '353 Patent, virtual reality had shown promise in bringing this dream to life, but that the interactivity of virtual reality at the time was limited by its use of scripted scenarios.[9]

The specification describes a preferred embodiment consisting of a dome-shaped theater into which images are projected for viewing by an audience, and notes that the projection of images could be replaced by holography or any other type of presentation.[10] In a preferred embodiment, the main modules described in the patent are to be implemented in software.[11]

The '353 Patent also describes sensors in the theater area that detect physical information about audience members and "Stimulus Generators" that analyze that information.[12] The patent states that a system could feed this sensor data into the "behavioral module" of a "virtual actor," which is "likely [t]o be [a] virtual animal[] or [a] virtual physical actor[] which ha[s] behaviors that are easier to simulate than those of humans."[13]

The " behavioral module" of the virtual actor would then calculate the reaction of the actor.[14] One component of the " behavioral module" would be the " behavioral model" of a virtual actor, a set of factors specific to a virtual actor that would mediate its response to the data provided to it by the Stimulus Generator.[15] For example, the specification describes the age of a virtual actor as a possible factor within that actor's behavioral model, and states that " [a]n older animal will more likely be passive with respect to the stimuli of the visitors." [16]

In sum, the system would collect and analyze physical information from visitors and feed that information to virtual actors, whose behavioral models would mediate a response (or non-response).[17] In this way the system would allow the virtual actors to respond to the physical information collected from viewers in real time.[18]

B. The Parties' Experts[19]

Both parties have offered expert testimony in connection with this motion. Microsoft's expert is Aaron T. Bobick, Ph.D. ("Bobick"). Bobick received bachelor of science degrees in mathematics and computer science from the Massachusetts Institute of Technology ("MIT") in 1981, and a Ph.D. in cognitive science from MIT in 1987. He has worked in academia since receiving his Ph.D., with stints at MIT and Stanford. Since 2003, he has been employed as a full professor at the Georgia Institute of Technology's College of Computing, where he was the founding chair of the School of Interactive Computing. Additionally, from 1993 to 1996, he served as the Chief Technology Officer of Cybergear, a company that he founded. In connection with Cybergear, Bobick received several patents for an "interactive exercise apparatus."

Over the course of his career, Bobick has published twenty-two articles in peer-refereed journals on topics related to machine perception and virtual reality. He has also received nine grants, as principal investigator, on the same topics. Finally, Bobick has provided expert testimony in eight prior cases, mostly in the field of computer vision.

Virtual's expert is Vyacheslav Zavadsky, Ph.D. ("Zavadsky"). Zavadsky received a Master's in Computer Science from Belarusian State University in 1994, and a Ph.D. in the same field from Belarusian State University in 1998. He is currently employed as the Principal of Zavadsky Technologies, where his duties include patent assessment, software project management, and software development. From 2003 to 2011, he worked at UBM TechInsights, where he was engaged in patent analysis and reverse engineering. During his tenure at UBM, Zavadsky reviewed hundreds of patents.

Zavadsky is the named inventor on twelve issued United States patents, at least five of which pertain to image processing and computer vision. He has ten additional patents pending.

III. LEGAL STANDARD

Indefiniteness is an issue that is amenable to summary judgment.[20] Summary judgment is appropriate "if the movant shows that there is no genuine dispute as to any material fact and the movant is entitled to judgment as a matter of law."[21] "'An issue of fact is genuine if the evidence is such that a reasonable jury could return a verdict for the nonmoving party. A fact is material if it might affect the outcome of the suit under the governing law.'"[22] "The moving party bears the burden of establishing the absence of any genuine issue of material fact."[23] "When the burden of proof at trial would fall on the nonmoving party, it ordinarily is sufficient for the movant to point to a lack of evidence ... on an essential element of the nonmovant's claim."[24] In turn, to defeat a motion for summary judgment, the non-moving party must raise a genuine issue of material fact. To do so, the non-moving party "'must do more than simply show that there is some metaphysical doubt as to the material facts,'"[25] and "'may not rely on conclusory allegations or unsubstantiated speculation.'"[26]

In deciding a motion for summary judgment, a court must "'construe the facts in the light most favorable to the non-moving party and must resolve all ambiguities and draw all reasonable inferences against the movant.'"[27] However, "'[c]redibility determinations, the weighing of the evidence, and the drawing of legitimate inferences from the facts are jury functions, not those of a judge.'"[28] "'The role of the court is not to resolve disputed issues of fact but to assess whether there are any factual issues to be tried.'"[29]

IV. APPLICABLE LAW

" A determination of indefiniteness is a legal conclusion that is drawn from the court's performance of its duty as the construer of patent claims ...." [30] I will therefore lay out the law of claim construction

Page 557

that is applicable to this case prior to stating the law applicable to indefiniteness.

" Analysis of patent infringement starts with ‘ construction’ of the claim, whereby the court establishes the scope and limits of the claim, interprets any technical or other terms whose meaning is at issue, and thereby defines the claim with greater precision than had the patentee." [31] Claim construction is a question of law.[32] Judges construe claims with the goal of " ‘ elaborating the normally terse claim language in order to understand and explain, but not to change, the scope of the claims.’ " [33]

The claim construction inquiry begins from the "objective baseline" of the ordinary and customary meaning that a claim term would have to a person of ordinary skill in the art at the time of the effective filing date of the invention.[34] An accused infringer faces a "heavy presumption" that the ordinary meaning of claim terms applies, and a claim term may not be narrowed merely by pointing out that the preferred embodiment of a patent is narrower than the ordinary meaning of its claim terms.[35]

That said, the ordinary and customary meaning of a term to a skilled artisan is only a baseline. A deeper inquiry is needed when a claim term is ambiguous or when "reliance on a term's 'ordinary meaning' does not resolve the parties' dispute."[36] As such, a judge may not leave questions of claim construction to the finder of fact by instructing a jury that an ambiguous or disputed term is to be given its ordinary meaning.[37]

" A person having ordinary skill in the art" is a legal fiction, like tort law's " reasonable person." She is a hypothetical person who knows and understands all of the relevant art within the field of invention and any related technical fields, but is utterly uncreative.[38] To determine how such a hypothetical person would have understood the meaning of a disputed claim term, courts look to publicly available sources, including both intrinsic evidence and extrinsic evidence.[39]

A. Intrinsic Evidence

Because intrinsic evidence provides "a more reliable guide to the meaning of a claim term than [] extrinsic sources[,]"[40] and because a person having ordinary skill in the art would use it to understand a patent,[41] district courts use the intrinsic record as "the primary tool to supply the context for interpretation of disputed claim terms."[42] The intrinsic record consists of information found within the public record of a patent, such as its specification and prosecution history.[43]

The most important source of intrinsic evidence is the language of the claims, because a patent's claims define the scope of the patentee's right to exclude.[44] Sometimes "the ordinary meaning of claim language as understood by a person of skill in the art" will be so apparent from the claim language itself that no further inquiry is needed.[45] Even if claim language is not self-explanatory, the context in which a claim term is used will often be highly instructive as to its meaning.[46]

The second most important source of intrinsic evidence is the patent's specification,[47] which typically includes "an abstract of the invention; a description of the invention's background; a summary of the invention; patent drawings; and a detailed description that discusses preferred embodiments of the invention."[48] A patent's specification is "always highly relevant to the claim construction analysis[,]" and therefore "claims must be read in view of the specification ...."[49]


While it is permissible (and often necessary) to "us[e] the specification to interpret the meaning of a claim[,]" it is generally impermissible to "import[] limitations from the specification into the claim."[50] This is so because the words of the claim "define the scope of the right to exclude ...."[51] For the purposes of claim construction, this rule defines the boundary between a patent's claims and its specification.[52] In light of this rule, the specification may be used to limit a claim only if:

(1) [] the claim "explicitly recite[s] a term in need of definition"; or (2) [] the specification unambiguously defines a term, i.e., if "a patent applicant has elected to be a lexicographer by providing an explicit definition in the specification for a claim term."[53]

In drawing the distinction between using the specification to interpret a claim term and improperly importing a limitation from the specification to construe a claim term, a court should bear in mind "that the purposes of the specification are to teach and enable those of skill in the art to make and use the invention and to provide a best mode for doing so."[54] In most cases, a person having ordinary skill in the art would appreciate the difference between a preferred embodiment presented as an example and one intended as a limitation on the claimed invention.[55]

B. Extrinsic Evidence

The extrinsic record "'consists of all evidence external to the patent and prosecution history, including expert and inventor testimony, dictionaries, and learned treatises.'"[56] Although "district courts [][may] rely on extrinsic evidence, ... extrinsic evidence in general [][is] less reliable than the patent and its prosecution history in determining how to read claim terms."[57] This is so for many reasons, including: (1) extrinsic evidence was not created at the time of the patent for the purpose of explaining the patent's scope and meaning; (2) external publications "may not reflect the understanding of a skilled artisan in the field of the patent"; (3) expert reports and testimony are created at the time, and for the purpose, of litigation and may suffer from bias; (4) "there is a virtually unbounded universe of potential extrinsic evidence of some marginal relevance that could be brought to bear on any claim construction question"; and (5) unlike intrinsic evidence, extrinsic evidence is not necessarily part of the public record, and undue reliance on it therefore undermines the public notice function of patents.[58]

Expert reports and expert testimony are forms of extrinsic evidence. They can be helpful to a court by:


[providing] background on the technology at issue, [explaining] how an invention works, [ensuring] that the court's understanding of the technical aspects of the patent is consistent with that of a person of skill in the art, or [establishing] that a particular term in the patent or the prior art has a particular meaning in the pertinent field.[59]

However, such testimony is unhelpful to a court if an expert "simply recites how [she] would construe [a disputed term] based on [her] own reading of the specification."[60] Further, "a court should discount any expert testimony 'that is clearly at odds with the claim construction mandated by'" the intrinsic evidence.[61]

C. Canons of Construction

In construing the meaning of claim terms, courts may employ interpretive aids known as canons of construction. Because they are merely interpretive aids, they cannot displace contrary intrinsic evidence. Indeed, "no canon of construction is absolute in its application ...."[62] This section discusses several canons that are pertinent to this dispute.

Because " claims are interpreted with an eye toward giving effect to all terms in the claim[,]" [63] courts strive to avoid constructions that render a term or phrase superfluous.[64] This is known as the rule against surplusage. For the same reason, " different claim terms are presumed to have different meanings ...." [65] This presumption is rebuttable when the intrinsic evidence weighs against it. [66] And just as distinct words are presumed to have distinct meanings, the same words are presumed to have the same meaning each time they appear. [67] This presumption is also rebuttable by the intrinsic evidence, [68] at least insofar as the same phrase may warrant a different meaning when it appears in different claims (as opposed to the same claim).[69]


D. Indefiniteness

35 U.S.C. § 112(b) ("Section 112, ¶ 2") states:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

Under this section, the inventor has a duty to "inform the public of the bounds of the protected invention, i.e., what subject matter is covered by the exclusive rights of the patent."[70] An inventor's failure to "particularly point[] out and distinctly claim[]" the subject matter of her invention provides a basis for invalidating a patent claim.[71]

Whether a claim is indefinite is a question of law.[72] Patents issued by the PTO enjoy a presumption of validity,[73] and so indefiniteness must be proved by clear and convincing evidence.[74] "A claim is definite if one skilled in the art would understand the bounds of the claim when read in light of the specification[,]"[75] and "[a] claim is ... indefinite only where it is not amenable to construction or is insolubly ambiguous."[76] If a narrowing construction can properly be adopted, then a claim is not indefinite.[77]

V. DISCUSSION

A. The Parties' Contentions Relating to Claim 1, "physical characteristic signal"

Claim 1 is the only independent claim of the '353 Patent. It includes the phrase "physical characteristic signal." Microsoft argues that this phrase is indefinite and that, as a consequence, claim 1 and its dependent claims are invalid.[78] Claim 1 is reproduced below, with the phrases relevant to Microsoft's contentions appearing in bold:

1. A method for generating a behavior vector for a virtual actor in an interactive theatre by interpreting stimuli from visitors, the method comprising:
[a] providing a plurality of sensors detecting and sensing at least one physical characteristic at a plurality of positions within a theatre area within which a number of visitors are free to move about, said sensors generating sensor signals;
[b] interpreting said sensor signals to provide at least one physical characteristic signal including position information, wherein said physical characteristic signal provides information on visitor activity and location within said theater area;
[c] providing a behavior model for at least one virtual actor;
[d] analyzing said at least one physical characteristic signal, a change over time of said physical characteristic signal and said behavior model for said at least one virtual actor to generate a behavior vector of said at least one virtual actor using said position information and said at least one physical characteristic signal, said behavior vector being generated in real-time;
[e] whereby a virtual actor reacts and interacts, in real-time, with visitors depending on the visitors' actions and evolution of said actions.[79]

Microsoft argues that claim 1 is invalid because it requires that "physical characteristic signal" both include and exclude "position information," which, Microsoft argues, is logically impossible.[80] Microsoft further argues that neither the patent prosecution history nor the written description of the '353 Patent sheds light on the relationship between "position information" and "physical characteristic signal."[81] Bobick, Microsoft's expert, declares that "one of ordinary skill in the art would view it as critical to know the relationship between these two claim elements [i.e., position information and the physical characteristic signal]."[82] Bobick further states that, because "[t]he claims[] simultaneous[ly] require[] [] two contradictory elements ... it [is] impossible to provide a meaningful interpretation to these claim terms."[83]

In its claim-construction brief, Virtual argued that the phrase "a plurality of sensors detecting and sensing at least one physical characteristic signal at a plurality of positions within a theatre area" did not require construction, as it could be given its plain and ordinary meaning.[84] However, in the same brief Virtual reserved its response to Microsoft's indefiniteness contentions until after Microsoft filed the present motion.[85]

In its opposition to this motion, Virtual argues that "[p]osition information is merely information, or data, that pertains to the location of visitors in the theater area."[86] Virtual further argues that "[a]s described in claim 1, the 'physical characteristic signal' is a signal that includes, inter alia, the position information."[87] On Virtual's construction, because "position information" is merely data, the fact that it may be used in different ways does not render claim 1 indefinite.[88]

Microsoft's response is that it does not matter whether it may make sense to use position information in different ways, because claim 1 "requires that two specific elements ('said physical characteristic signal' and 'said position information') are two inputs to one function ('generat[ing] a behavior vector') ...."[89] In other words, Microsoft clarifies that it is not arguing, as a general matter, that data cannot be used in two different ways. Instead, Microsoft argues that the '353 Patent requires that two elements be used in a seemingly contradictory way, without disclosing to the public their relationship to one another.

The problem is particularly acute in the embodiment of the '353 Patent in which the physical characteristic signal is only position information. The '353 Patent clearly encompasses this embodiment, because limitation 1[b] recites the phrase "physical characteristic signal including position information."[90] In that case, limitation 1[d] would read: "to generate a behavior vector of said at least one virtual actor using said position information and said [position information]."
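The redundancy can be made concrete with a short sketch in code (a hypothetical illustration only; every name and data layout below is an assumption of this illustration, and nothing of the kind is disclosed in the '353 Patent):

    # Hypothetical rendering of limitation 1[d] in the embodiment where the
    # physical characteristic signal carries only position information.
    # All names are illustrative; the patent discloses no such code.

    position_information = [(0.0, 1.0), (0.2, 1.1)]  # visitor locations over time

    # Limitation 1[b]: a signal "including position information" -- here,
    # nothing but position information.
    physical_characteristic_signal = {"position": position_information}

    def generate_behavior_vector(position_info, signal):
        # Limitation 1[d] requires both inputs, yet in this embodiment the
        # second carries nothing beyond the first, and the patent never
        # explains how the two are meant to differ.
        ...

    generate_behavior_vector(position_information, physical_characteristic_signal)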

It does appear to be both surplusage and a contradiction for claim 1 to state "using said position information and said at least one physical characteristic signal [which may be only position information]." However, Virtual argues that when claim 1 is read in light of the specification, the apparent contradiction dissolves. Before addressing Virtual's arguments, I will set out my interpretation of the '353 Patent's teachings about the use of "position information."

B. The Specification's Discussion of "Position Information"

" Position information" in the '353 Patent refers to positional data collected from the users of the virtual reality system disclosed in the '353 Patent.[91] In the preferred embodiment disclosed in the specification, position information is collected by sensors situated at a plurality of positions throughout the dome.[92] The raw data collected from these sensors is then fed into an " interpreter," which " analyzes the raw signals from these sensors a [sic] produces a physical characteristic signal ...." [93] For example, the interpreter would take the first derivative of the raw position information (rate of change of the user's position) in order to determine the visitor's displacement, and it would take the second derivative of this information (rate of the rate of change) to determine the visitor's acceleration.[94]

The filtered positional data, which is termed the "physical characteristic signal," is then fed into the "behavioral module" of the virtual actor.[95] The behavioral module processes this input in light of the virtual actor's characteristics (e.g., age, position within the simulation, sleepiness) to plan an action for the virtual actor.[96] (It is important to note that the position of the virtual actor, as opposed to that of the user, is an independent input to the behavioral module.[97] This positional data pertaining to virtual actors should not be confused with the position information pertaining to users that is at issue here.)

The planned action decided upon by the behavioral module, which is sometimes called a "behavior vector,"[98] is then fed into the "biophysical model action generator[,]" which calculates the physics of the planned action in light of the virtual actor's physical characteristics (e.g., its gross anatomy) as well as the physical situation within the simulation.[99] The biophysical model action generator's output is a "rendering vector," which it sends to a "rendering module," a component that determines how the planned action is to be displayed in light of the physical constraints imposed by the biophysical model action generator.[100]

The planned actions of all of the virtual actors in the simulation are summed and then sent to the "Overall Rendering Module," which determines how the actions of all of the virtual actors are to be displayed, generating an "overall rendering effect."[101] This overall rendering effect is then decomposed into haptic, sonic, and visual elements by a "haptic effect generator," a "sound effect generator," and an "image generator."[102] Finally, each of these generators sends output to a corresponding module, e.g., "the haptic module," and these modules "generate what the visitor will hear, feel and see."[103]

Data from the " biophysical model action generator" is also sent to the " virtual environment database," which stores the locations of all of the virtual actors at any given point in time.[104] The " virtual environment stimulus generator" reads from this database and, based on the information received, decides whether to generate random events ( e.g. creating a new virtual actor).[105]

The specification further states that the biophysical model action generator also reads from the virtual environment database in the process of computing the virtual actor's planned action. It is claimed that this information from the virtual environment database could, for example, enable a virtual actor to avoid a collision with another virtual actor.[106] However, it is unclear whether it is the biophysical model action generator or the behavioral module that reads from the virtual environment database. Logically, one would expect the behavioral module to read from the database, because that module plans the virtual actors' actions. The specification, though, is unclear on the point.[107]

In sum, " position information" is data about the position of the user of the system that is collected by sensors. After being filtered, this data is transmitted to the " behavioral module" of a virtual actor, which plans a reaction. This planned reaction is refined in light of the virtual actor's physical characteristics, and the prevailing conditions within the simulation, by the " biophysical model action generator." After this refinement, the planned reaction is transmitted to a rendering module where, after being combined with the actions of the other virtual actors in the system, it is displayed to the user of the system.

C. The Specification Does Not Resolve the Contradiction in Claim 1

1. It Is Not Inherently Paradoxical to Use a Data Structure in Two Different Ways in a Computer Program

With this understanding in mind, I now turn to Virtual's arguments that the specification resolves the apparent contradiction in claim 1. As an initial matter, I agree with Virtual's expert, Zavadsky, that it is not inherently contradictory for a computer system to use data, such as "position information," in two different ways.[108] For the purposes of this motion, I also agree with Zavadsky that, because the specification of the '353 Patent indicates that object-oriented programming techniques were to be used in implementing the claimed invention, there are a number of reasons that data might be used in different ways within the '353 Patent.[109]
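Zavadsky's general point can be reduced to a toy example (all names here are hypothetical; nothing below comes from the patent): a program may consume the same datum both through a composite structure and directly.

    # The same datum consumed twice, once inside a composite "signal" and once
    # on its own. This shows only that double use is not inherently
    # contradictory; it says nothing about what the '353 Patent discloses.

    position = (3.0, 4.0)                          # raw position information
    signal = {"position": position, "speed": 5.0}  # a signal *including* position

    def generate(signal, position):
        # one function, two inputs, one of which is also inside the other
        return signal["speed"] * 0.5 + position[0] + position[1]

    result = generate(signal, position)  # 2.5 + 3.0 + 4.0 = 9.5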

However, Zavadsky's observation that there might be a reason for position information to be both included within, and separately added to, the physical characteristic signal to generate a behavior vector misses the mark. Bobick's real point is that because "one of ordinary skill in the art would view it as critical to know the relationship between [position information and the physical characteristic signal][,]"[110] and because the '353 Patent fails to disclose this relationship, the inventor failed to distinctly claim what he regarded as his invention.[111]

Microsoft has unearthed an apparent logical contradiction within claim 1, and it will have met its burden of proving invalidity by clear and convincing evidence unless Virtual resolves this contradiction. To do that, Virtual must demonstrate that the claim's apparent contradiction is resolved by the specification, as read by a skilled artisan. More specifically, Virtual must show that the specification discloses a way in which position information may be used alongside a physical characteristic signal (which might contain only position information) to generate a behavior vector. Virtual makes two arguments on this score, both of which are addressed below.

2. Figures Three and Four of the '353 Patent Provide No Support for Virtual's Position

First, Virtual argues that figures three and four of the '353 Patent, and their accompanying text, demonstrate how position information and a physical characteristic signal containing position information may be used together to generate a behavior vector.[112] Figure three, "a block diagram of the details of the behavioral module[,]"[113] is a flowchart showing that positional information is fed into the behavioral module, which generates a behavior vector.

Figure four is described only as "a flow chart of the general steps involved ...."[114] It depicts a "sensor signal" being sent to an "interpreter," which in turn sends a "physical characteristic signal" to an "analyzer." The analyzer is depicted as also receiving input from a "behavior model," which is not connected to anything else on the flowchart. Finally, the analyzer's output is labeled a "behavior vector."

According to Virtual, the fact that figure four shows the analyzer receiving input from the "behavior model" as well as the physical characteristic signal demonstrates that position information is both a part of, and distinct from, the physical characteristic signal.[115] This is so because, in Virtual's view, the "behavior model" depicted in figure four is the same as the "behavioral module" depicted in figure three, and the behavioral module contains the position information.[116]

Virtual supports this view by quoting the description accompanying figure four, which states: "[a]n interpreter 66 filters and analyzes the raw signals from these sensors and produces a physical characteristic signal .... The analyzer 67 reads from the behavioral module 68 and produces a rendering vector which will be used to display the environment."[117]

There are two discrepancies between figure four and its accompanying text. First, the text refers to "behavioral module 68," while figure four depicts "behavior model" 68. Second, the text states that the output of the "analyzer" is a "rendering vector," while figure four depicts the output of the analyzer as a "behavior vector." Virtual does not identify, much less attempt to justify, these discrepancies. Instead, it implicitly asserts that the text's label for the component ("behavioral module") and the figure's label for the output of the analyzer ("behavior vector") should be credited.[118]

The tension between figure four and its text may be resolved with reference to the inputs and outputs of the figure's "analyzer." The input of the "analyzer" is a "physical characteristic signal," which implies that the "analyzer" is related (if not identical) to the "behavioral module," because the behavioral module has the same input.[119]

The text states that the output of the "analyzer" is a "rendering vector." As discussed above, a "rendering vector" is the output that the "biophysical model action generator" produces after it has received information from the "behavioral module" (which has in turn processed a "physical characteristic signal" containing position information). If the output of the analyzer is a rendering vector, then the component that the text accompanying figure four calls "behavioral module 68" must be the biophysical model action generator. But this reading makes no sense, because the biophysical model action generator receives input from the behavioral module, not vice versa.[120]

Figure four, as opposed to the text, identifies the output of the "analyzer" as a "behavior vector." This is also the output of the behavioral module, according to figure three, which reinforces my earlier conclusion that the analyzer is related to the behavioral module.

This still leaves the question of what the "behavior model" referred to in figure four is. Whatever it is, it is plainly distinct from both "position information" and the "physical characteristic signal." The summary section of the '353 Patent states that one of the functions of the claimed invention is "analyzing the at least one physical characteristic signal and the behavior model for said at least one virtual actor to generate a behavior vector."[121] Likewise, claim 13 states: "wherein said generating a behavior vector comprises adding a reaction for each physical characteristic signal, using said behavior model and said position information to an overall reaction to generate a behavior vector."[122]

The nature of the " behavior model" is revealed by Claim 19, which states: " [a] method as claimed in claim 18, wherein said behavior model comprises psychological factors, wherein said psychological factors are at least one of age factor, hunger, thirst, sleepiness, attention span and disability...." [123] In other words, the " behavior model" of a virtual actor relates to its disposition and characteristics.

The most plausible reading of figure four, then, is that it means what it says. The "behavior model" in figure four, which is fed into the "analyzer," really is the "behavior model" of the virtual actor. It does not contain a "physical characteristic signal" or position information about the user. And the "analyzer" takes input from it, and a separate input from the "physical characteristic signal," in generating a "behavior vector."

Figure four does not show, as Virtual had hoped, that position information and the physical characteristic signal (containing, or consisting wholly of, position information) are used jointly to generate a behavior vector. And even if it did, it is doubtful that a figure with so muddled an explanation would resolve the ambiguity in claim 1 for a skilled artisan. In short, Virtual's attempt to clarify ambiguous claim language with a confusing reference to two figures and their accompanying text fails.

3. The Claims of the '353 Patent Do Not Indicate How Position Information May Be Both Part of and Distinct from the "Physical Characteristic Signal"

Virtual's final argument is that certain other claims, namely claims 10, 13, and 15, indicate that position information is to be used in different ways.[124] The central flaw in this argument is that it does not respond to Microsoft's point that claim 1 requires that two elements (position information and the physical characteristic signal, which may be only position information) serve as inputs to one function (generating a behavior vector), but does not disclose the relationship between these elements. Moreover, the claims that Virtual points to in order to resolve the ambiguity in claim 1 are themselves ambiguous.

Claim 10 discloses a "new actor creation module" that creates a new virtual actor using the physical characteristic signal, the behavior model, and position information.[125] As with claim 1, it is unclear why claim 10 requires that position information be used twice to generate its output. It is possible that the position information contained in the physical characteristic signal is distinct from the explicitly referenced position information, in that the latter term might refer to the position of the virtual actors within the simulation.[126] But this is mere conjecture based on a poorly written specification. Virtual is adamant that "[p]osition information is ... data[] that pertains to the location of visitors in the theater area[,]"[127] and the weight of the specification supports this view, as does the consistency canon.[128] Furthermore, even if "position information" in claim 10 may be rewritten as "position information [about the virtual actor(s)]," nothing in the specification indicates that the same holds true for limitation 1[d]. Limitation 1[d] relates to generating a behavior vector, not a new actor.

Claim 13 relates to generating a behavior vector, but it does not illuminate the relationship between the "physical characteristic signal" and "position information" in limitation 1[d]. It states: "[a] method as claimed in claim 12, wherein said generating a behavior vector comprises adding a reaction for each physical characteristic signal, using said behavior model and said position information to an overall reaction to generate a behavior vector."[129]

The specification provides some help in understanding claim 13. As previously stated, the behavioral module generates a behavior vector and takes input from the physical characteristic signal. And the specification teaches that the "overall reaction generator" is the final component of the behavioral module,[130] the function of which is to sum the reactions of all virtual actors prior to sending a behavior vector to the biophysical model action generator.[131] Presumably, then, the "overall reaction" of claim 13 is the output of the "overall reaction generator."

Even with this understanding, claim 13 makes no sense. Like claim 10, it might relate to the position information of virtual actors, because it depends on claim 11, "wherein said virtual environment stimulus is added to said at least one physical characteristic signal."[132] But this only heightens the ambiguity of claim 13: if position information about both users and virtual actors is contained within the physical characteristic signal, it does not make sense to "add[] a reaction for each physical characteristic signal" and also, separately, "us[e] ... said position information[,]"[133] as claim 13 requires. And nothing in the specification resolves this problem. For example, the Patent gives no indication of what "position information to an overall reaction" may refer to.

In sum, claim 13 suffers from the same deficiency as claim 1: it requires that one output be generated from two apparently identical elements, without describing the relationship between those elements. Claim 15 suffers from the same deficiency and likewise does not support Virtual's argument.[134] As such, none of the claims cited by Virtual resolves claim 1's ambiguity.

D. Holding

Microsoft bears the burden of demonstrating, by clear and convincing evidence, that claim 1 is invalid.[135] In light of the foregoing discussion, I conclude that it has met that burden. Section 112, ¶ 2 requires that the inventor disclose what she regards as her invention with sufficient definiteness that "one skilled in the art would understand the bounds of the claim when read in light of the specification."[136] The inventor of the '353 Patent failed to make this disclosure.

As Microsoft points out, claim 1 requires that two apparently contradictory statements hold true: "position information" must be simultaneously part of, and used separately from, the "physical characteristic signal" to generate a behavior vector. Bobick, a person having ordinary skill in the relevant field at the relevant time, has stated that, in the context of claim 1, "one of ordinary skill in the art would view it as critical to know the relationship between [position information and the physical characteristic signal]."[137] Virtual has offered no evidence or argument to refute Microsoft's contention that the '353 Patent does not disclose this relationship.

A court may not "re-write claims to preserve their validity[,]" even if it is plain that the inventor did not mean what she wrote.[138] Virtual's expert has identified a plethora of reasons why, in a hypothetical patent, it might make sense to use position information in different ways. The '353 Patent, though, does not disclose why position information must be used twice to generate a behavior vector. Claim 1 simply states that position information and the physical characteristic signal (which might contain only position information) are both to be used to generate a behavior vector, and the balance of the patent does nothing to resolve the paradox. The public would be ill served if Virtual were allowed to retain a monopoly on a patent that fails to disclose its scope. I therefore hold that claim 1 and all of its dependent claims are invalid.

Because claim 1 is the only independent claim of the '353 Patent, I need not consider the parties' contentions relating to claim 8. The same holds true for the parties' claim construction contentions.

VI. CONCLUSION

For the reasons stated above, Microsoft's motion for summary judgment is granted. The Clerk of the Court is directed to close the motion (Doc. No. 43) and the case.

SO ORDERED.

MEMORANDUM OPINION AND ORDER

I. INTRODUCTION

Virtual Solutions, LLC ("Virtual") brought this action against Microsoft Corporation ("Microsoft"), claiming that Microsoft had infringed claims 1-3, 5, 7, 8-9, and 22 of U.S. Patent No. 6,507,353 ("the '353 Patent"), of which Virtual is the exclusive licensee. On February 15, 2013, I granted Microsoft's motion for summary judgment on the ground that claim 1 of the '353 Patent is invalid for indefiniteness.[1] Virtual now moves for reconsideration.[2]

Virtual raises three grounds in support of its motion for reconsideration: (1) that because "[t]here is no logical contradiction in claim 1 [and] ... all parties and all experts agree on what is required by claim 1[,] it cannot be indefinite[;]"[3] (2) that the Summary Judgment Opinion improperly held that the contradiction in claim 1 had to be resolved by the specification, when under controlling precedents it could be resolved by a person having ordinary skill in the art;[4] and (3) that the Summary Judgment Opinion "improperly shifted the burden of proof to Virtual[.]"[5] For the following reasons, Virtual's motion for reconsideration is denied.

II. MOTION FOR RECONSIDERATION STANDARD

Motions for reconsideration are governed by Local Rule 6.3 and are committed to the sound discretion of the district court.[6] "'[R]econsideration will generally be denied unless the moving party can point to controlling decisions or data that the court overlooked—matters, in other words, that might reasonably be expected to alter the conclusion reached by the court.'"[7] "Typical grounds for reconsideration include 'an intervening change of controlling law, the availability of new evidence, or the need to correct a clear error or prevent manifest injustice.'"[8] Yet, because "the purpose of Local Rule 6.3 is to 'ensure the finality of decisions and to prevent the practice of a losing party examining a decision and then plugging the gaps of a lost motion with additional matters,'"[9] the Rule must be "narrowly construed and strictly applied so as to avoid repetitive arguments on issues that have been considered fully by the Court."[10]

III. DISCUSSION

Virtual's motion for reconsideration fails at the outset, because Virtual has not demonstrated that the Court committed clear error or overlooked facts or controlling precedents that would have altered the outcome of the Summary Judgment Opinion. Each of the three grounds for reconsideration asserted by Virtual is discussed in turn below.

First, based on the following passage, Virtual argues that the Summary Judgment Opinion improperly shifted the burden of proof:

Microsoft has unearthed an apparent logical contradiction within claim 1, and it will have met its burden of proving invalidity by clear and convincing evidence unless Virtual resolves this contradiction. To do that, Virtual must demonstrate that the claim's apparent contradiction is resolved by the specification, as read by a skilled artisan.[11]

From this passage, and the balance of the Opinion,[12] it is apparent that the Summary Judgment Opinion does not shift the burden of proof, but rather weighs all of the evidence in determining whether Microsoft had met its burden of proving invalidity. The Federal Circuit has recently emphasized that "use of the terms 'prima facie' and 'rebuttal' in addressing an invalidity challenge does not constitute reversible error as long as the court 'consider[s] all evidence ... before reaching a determination' and does not shift the burden from the patent challenger."[13] Because the Court considered all of the evidence and did not shift the burden of proof, Virtual's first ground for reconsideration is without merit.

Second, Virtual argues that reconsideration is warranted because the Summary Judgment Opinion relied on intrinsic evidence to the exclusion of an ordinarily skilled artisan's "'knowledge of the relevant art area.'"[14] Specifically, Virtual argues that the testimony of its expert, Vyacheslav Zavadsky, Ph.D., created a material issue of fact with respect to whether claim 1 is indefinite.[15]

Virtual ignores that the Summary Judgment Opinion explicitly considered, and rejected, the reasons given by Zavadsky for denying Microsoft's motion for summary judgment. The Summary Judgment Opinion credited the testimony of Microsoft's expert, Aaron T. Bobick, Ph.D., that "'one of ordinary skill in the art would view it as critical to know the relationship between [position information and the physical characteristic signal in claim 1 of the '353 Patent][,]'" and ultimately concluded that, "because the '353 Patent fails to disclose this relationship," claim 1 is invalid.[16] The Summary Judgment Opinion further found that Zavadsky's declaration does not respond to Bobick's concern because, while it shows that data may be used twice to generate one output in the abstract, it does not show that the '353 Patent plausibly discloses the relationship between position information and a physical characteristic signal.[17] The Summary Judgment Opinion also considered and rejected Zavadsky's various explanations for claim 1's ambiguity.[18] In sum, because the Court explicitly considered Zavadsky's testimony, Virtual's second ground for reconsideration is likewise without merit.

Third, and finally, Virtual argues that reconsideration is warranted because "[a]ll parties agree that claim 1 requires generating a behavior vector using (1) position information; and (2) a physical characteristic signal including position information[,]" and, on Virtual's view, these requirements are not contradictory.[19] Because this argument is yet another "attempt[] to relitigate an issue that already has been decided[,]"[20] it does not support reconsideration.

Prior to the Summary Judgment Opinion, one of the main points of contention between the parties was whether the intrinsic evidence, as read by a skilled artisan, disclosed the relationship between position information and a physical characteristic signal including, or comprised solely of, position information. Virtual, having lost that motion, now reasserts various arguments that were considered and rejected in the Summary Judgment Opinion.[21]

Virtual also reasserts an argument that it made during oral argument, namely that the position information that is combined with the physical characteristic signal (which might be comprised solely of position information) is 'raw data,' while the physical characteristic signal is processed.[22] Notably, this argument was not advanced in Virtual's brief in opposition to Microsoft's motion for summary judgment, nor was it supported by any expert testimony. Regardless, it was considered by the Court and implicitly rejected by the Summary Judgment Opinion's holding that the '353 Patent fails to disclose the relationship between two required elements.[23] Virtual's untimely attempts to "'plug[] the gaps of [its][] lost motion with additional matters'" are unavailing.[24]

The test for indefiniteness stems from the statutory requirement that a patent's claims "particularly point[] out and distinctly claim[]" what the inventor regards as her invention.[25] "Indefiniteness requires a determination whether those skilled in the art would understand what is claimed[.]"[26] In the context of means-plus-function claiming, "it is well established that proving that a person of ordinary skill could devise some method to perform the function is not the proper inquiry as to definiteness—that inquiry goes to enablement."[27] When functional claiming is not employed, a variant of this principle applies: because courts may not "rewrite claims to preserve their validity[,]" an insolubly ambiguous claim is invalid, regardless of whether one skilled in the art would understand it to mean something other than its literal terms.[28]

Under these principles, the Summary Judgment Opinion held that claim 1 of the '353 Patent is invalid for indefiniteness because it requires that two seemingly identical elements generate one output, without disclosing their relationship to persons having ordinary skill in the art. In other words, the Summary Judgment Opinion held claim 1 indefinite because a skilled artisan would not "understand what i[t] claimed[.]"[29] Virtual has pointed to no overlooked authority, argument, fact, or error supporting reconsideration of the Summary Judgment Opinion. Its motion for reconsideration is therefore denied.

IV. CONCLUSION

For the reasons stated above, Virtual's motion for reconsideration is denied. The Clerk of the Court is directed to close the motion (Docket No. 72).

SO ORDERED.

