U.S. Pat. No. 11,439,919
INTEGRATING COMMENTARY CONTENT AND GAMEPLAY CONTENT OVER A MULTI-USER PLATFORM
Assignee: Sony Interactive Entertainment LLC
Issue Date: September 8, 2020
Illustrative Figure
Abstract
A multi-user platform provides an immersive digital (including virtual reality (VR)) environment to solicit and broadcast commentary related to gameplay. The multi-user platform receives gameplay content, which includes a plurality of media streams that show one or more views. The multi-user platform generates a graphical representation for each media stream in the VR environment, receives commentary content corresponding to one or more media streams in the VR environment, and determines that one graphical representation for one media stream is an active representation in the VR environment for a time period. The multi-user platform further synchronizes a portion of the commentary content received in the time period with the one media stream associated with the active representation to create synchronized content, and broadcasts gameplay channel(s) that include the synchronized content to one or more subscribers connected to the multi-user platform.
Description
DETAILED DESCRIPTION
Various embodiments of the disclosure are discussed in detail below. While specific implementations are discussed, it should be understood that this is done for illustration purposes only. A person skilled in the relevant art will recognize that other components and configurations may be used without departing from the spirit and scope of the disclosure.
As used herein, the term “user” refers to a user of an electronic device(s) and can include participants or players, as well as non-participants or spectators. Actions performed by a user in the context of computer software shall be considered to be actions taken by a user to provide an input to the electronic device(s) to cause the electronic device to perform the steps embodied in computer software. The terms “stream” or “media stream” are synonymous and generally refer to data or content associated with an online game or an online game session.
As discussed herein, the subject disclosure generally relates to online gameplay hosted by multi-user platforms and improving spectator experiences. In particular, the techniques disclosed herein integrate and synchronize commentary content and gameplay content and broadcast such content to subscribers of the multi-user platform.
Referring to the figures, FIG. 1 illustrates a schematic diagram 100 of an example communication network 105 (e.g., the Internet). Communication network 105 is shown for purposes of illustration and represents various types of networks, ranging from local area networks (LANs) to wide area networks (WANs). LANs typically connect the nodes over dedicated private communications links located in the same general physical location, such as a building or campus. WANs, on the other hand, typically connect geographically dispersed nodes over long-distance communications links, such as common carrier telephone lines, optical lightpaths, synchronous optical networks (SONET), synchronous digital hierarchy (SDH) links, or Powerline Communications (PLC) such as IEEE 61334, IEEE P1901.2, and others.
Communication network 105 includes a geographically distributed collection of devices or nodes 110, interconnected by communication links 120 for exchanging data such as data packets 140 and for transporting data to end nodes or client devices 130 through a multi-user platform 125. In particular, multi-user platform 125 distributes or broadcasts multi-media content (e.g., audio content, visual content, textual content, etc.) to end nodes or client devices 130. Client devices 130 include personal computing devices, online game systems, laptops, tablets, mobile devices, or other devices as is appreciated by those skilled in the art. Notably, one client device 130 represents a network game system, which includes a game console, peripheral devices, and display hardware. Operatively, a user can subscribe client device 130 to multi-user platform 125 and play, spectate, or otherwise access online media content hosted by multi-user platform 125.
Further, communication links 120 represent wired links or shared media links (e.g., wireless links, PLC links, etc.) where certain devices, such as, e.g., routers, servers, switches, sensors, computers, etc., may be in communication with other devices, based on distance, signal strength, current operational status, location, etc. Those skilled in the art will understand that any number of nodes, devices, links, etc. may be used in the communication network, and that the view shown herein is for simplicity.
Data packets 140, such as network traffic/messages, are exchanged between devices over and within communication network 105 using predefined network communication protocols such as certain known wired protocols, wireless protocols (e.g., IEEE Std. 802.15.4, WiFi, Bluetooth®, etc.), PLC protocols, or other shared-media protocols where appropriate. In this context, a protocol consists of a set of rules defining how the devices or nodes interact with each other.
FIG. 2 illustrates a block diagram of an example network device 200 that represents multi-user platform 125 (or components thereof). Device 200 includes one or more network interfaces 210, a user input interface 215, at least one processor 220, and a memory 240 interconnected by a system bus 250.
Network interface(s) 210 contain the mechanical, electrical, and signaling circuitry for communicating data over links coupled to one or more of the networks shown in schematic diagram 100. Network interfaces 210 are configured to transmit and/or receive data using a variety of different communication protocols, as will be understood by those skilled in the art.
User input interfaces 215 may be inclusive of any variety of user interface known in the art for receiving different types of user input, including at least handheld controllers, portable controllers, keyboards, keypads, touchscreens, cameras, game peripherals and accessories, etc. Some interfaces 215 may be specific to virtual reality (VR) environments. VR interface(s) 215 provide interactive graphical interfaces to solicit and receive user input corresponding to gameplay content in a VR environment. For example, VR interface 215 may include any number of menus, boxes, buttons, editor interfaces, drawing tools, playback tools, selectable elements, graphical icons, and the like. These graphical interfaces can be manipulated by a user to provide commentary for a game session, as discussed in greater detail below.
Memory 240 comprises a plurality of storage locations that are addressable by processor 220 for storing software programs and data structures associated with the embodiments described herein.
Processor 220 may comprise necessary elements or logic adapted to execute the software programs and manipulate data structures 245. An operating system 242, portions of which are typically resident in memory 240 and executed by processor 220, functionally organizes the device by, inter alia, invoking operations in support of software processes and/or services executing on the device. These software processes and/or services may comprise an illustrative “commentary” process/service 244. Note that while processes/services 244 are shown in centralized memory 240, these processes/services may be configured to operate in a distributed communication network.
It will be apparent to those skilled in the art that other processor and memory types, including various computer-readable media, may be used to store and execute program instructions pertaining to the techniques described herein. Also, while the description illustrates various processes, it is expressly contemplated that various processes may be embodied as modules configured to operate in accordance with the techniques herein (e.g., according to the functionality of a similar process). Further, while the processes have been shown separately, those skilled in the art will appreciate that processes may be routines or modules within other processes. For example, processor 220 can include one or more programmable processors, e.g., microprocessors or microcontrollers, or fixed-logic processors. In the case of a programmable processor, any associated memory, e.g., memory 240, may be any type of tangible processor-readable memory, e.g., random access, read-only, etc., that is encoded with or stores instructions that can implement program modules, e.g., a module having commentary process 244 encoded thereon. Processor 220 can also include a fixed-logic processing device, such as an application-specific integrated circuit (ASIC) or a digital signal processor that is configured with firmware comprised of instructions or logic that can cause the processor to perform the functions described herein. Thus, program modules may be encoded in one or more tangible computer-readable storage media for execution, such as with fixed logic or programmable logic, e.g., software/computer instructions executed by a processor, and any processor may be a programmable processor, programmable digital logic, e.g., a field programmable gate array, or an ASIC that comprises fixed digital logic, or a combination thereof.
In general, any process logic may be embodied in a processor or computer readable medium that is encoded with instructions for execution by the processor that, when executed by the processor, are operable to cause the processor to perform the functions described herein.
FIG. 3 illustrates a schematic diagram 300 representing gameplay content for a game session. The game session may be hosted by multi-user platform 125 (discussed above) and accessible by any number of subscribers (e.g., players, spectators, etc.). As illustrated, the gameplay content relates to a car racing tournament, which can be viewed by a number of cameras: camera 305c, camera 310c, camera 315c, camera 320c, etc. Cameras 305c, 310c, 315c, and 320c are shown for purposes of illustration and may (or may not) be shown during the game session. Importantly, cameras 305c, 310c, 315c, and 320c represent different media streams (e.g., various Points of View (POV) or viewing angles, etc.) of gameplay and can be viewed by subscribers to multi-user platform 125. The different media streams are assigned or mapped to respective display screens 305, 310, 315, and 320 in a digital or VR environment (discussed in greater detail below).
FIG. 4 illustrates a schematic diagram 400 of an exemplary VR environment, particularly showing a simulated commentary studio from the perspective of a user 405. User 405 experiences and interacts with the VR environment using a headset 410 and a controller 415. In operation, headset 410 and/or controller 415 may be wirelessly connected to additional components such as a network game system (discussed above) or, alternatively, headset 410 and controller 415 may be independently coupled to multi-user platform 125 over a network (e.g., network 105).
Headset 410 simulates the VR environment (e.g., the commentary studio) and displays or projects graphical elements to user 405, tracks eye movements, measures biometric data, and the like. Controller 415, similar to headset 410, facilitates user interaction with and within the VR environment and is operable to, for example, detect, track, or otherwise monitor movement and biometric information, communicate data signals with headset 410 and the network game console, and provide feedback (e.g., tactile, audible, etc.) to user 405. In this fashion, headset 410 and/or controller 415 can include any number of sensors, gyros, radios, processors, touch detectors, transmitters, receivers, feedback circuitry, and the like. Headset 410 and controller 415 (and any other supporting VR equipment such as the network game system) cooperate to provide an immersive and interactive VR environment to user 405.
The VR environment shown here includes interactive graphical representations of video editing tools as well as a number of display screens showing gameplay content for game session 300. As mentioned above, the display screens, including display screen 305, display screen 310, display screen 315, and display screen 320, show different media streams, each corresponding to a respective POV or viewing angle captured by camera 305c, camera 310c, camera 315c, and camera 320c, respectively.
FIG. 5 illustrates a schematic diagram 500 of the commentary studio of FIG. 4, further showing a display screen 510 selected as an active display screen. In some embodiments, user 405 moves controller 415 to indicate a selection operation. The network game system detects a change in controller orientation, direction, acceleration, or velocity, determines a corresponding path 515, and projects path 515 to headset 410. Path 515 intersects with display screen 510 and results in selection of display screen 510 (e.g., the graphical representation corresponding to display screen 510) as an active display screen or an active representation of gameplay content. Notably, in some embodiments, user 405 may provide additional inputs to select display screen 510 as the active display screen (e.g., button press, eye movement, eye hovering, etc.). As used herein, the terms “active display screen”, “active media stream”, and “active representation” may be used interchangeably to refer to a selected display screen and/or media content corresponding to the selected display screen. In this fashion, selecting display screen 510 as the active display screen sets the media stream mapped to display screen 510 as an active media stream for a time period (e.g., when user 405 provides commentary content corresponding to the active media stream).
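The path-based selection described above can be sketched as a simple ray test: project path 515 from the controller and return the nearest display screen it intersects. The Python sketch below is illustrative only; the names (`DisplayScreen`, `select_active_screen`) are hypothetical, and each screen is simplified to a bounding sphere since the patent does not specify an intersection algorithm.

```python
import math
from dataclasses import dataclass

@dataclass
class DisplayScreen:
    stream_id: str   # media stream mapped to this screen
    center: tuple    # (x, y, z) position in the VR environment
    radius: float    # bounding-sphere extent of the screen

def select_active_screen(origin, direction, screens):
    """Project a ray (path 515) from the controller at `origin` along the
    unit vector `direction`; return the nearest intersected screen, or
    None if the path hits nothing."""
    best, best_t = None, math.inf
    for s in screens:
        # vector from the ray origin to the screen's centre
        oc = [c - o for c, o in zip(s.center, origin)]
        # distance along the ray to the point closest to the centre
        t = sum(a * b for a, b in zip(oc, direction))
        if t < 0:
            continue  # screen is behind the user
        closest = [o + t * d for o, d in zip(origin, direction)]
        dist_sq = sum((c - p) ** 2 for c, p in zip(s.center, closest))
        if dist_sq <= s.radius ** 2 and t < best_t:
            best, best_t = s, t
    return best
```

A user facing a screen head-on would select it; pointing the controller away from every screen yields no active selection, matching the idea that additional inputs (button press, eye hover) may be needed to confirm.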
In other embodiments, display screen 510 may be selected as the active display screen based on milestones (e.g., transitions or events) in the game session. For example, games can include milestones such as transitions to new levels, discovering new inventory, defeating bosses, interactions between players (e.g., defeats, victories, etc.), a number of points achieved, a likelihood of an event occurring, and the like. Some of these milestones may be set by the game's design in advance, while others may be determined based on game statistics derived from iterations of the gameplay. These milestones may be used as a trigger to automatically select and set certain display screens as the active display screen when, for example, the display screen shows players in relative proximity to a milestone. A player's proximity or approach to a gameplay milestone can be determined based on character locations in the game environment (e.g., on a world map), proximity between character locations and the milestone location, players in relative proximity to each other (e.g., in a tournament-style game), and the like. In this fashion, character locations may be tracked during the course of the game session and trigger selection of a display screen as the active display screen.
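One way the milestone trigger might be implemented, sketched under assumptions (the function name, data shapes, and threshold below are illustrative, not from the patent): track the world-map location of the character each camera follows, and auto-select the screen whose character is nearest a milestone location within a threshold.

```python
import math

def auto_select_screen(camera_targets, milestones, threshold=10.0):
    """Auto-select an active display screen near a milestone.

    `camera_targets` maps a screen id to the tracked (x, y) world-map
    location of the character that screen follows; `milestones` is a
    list of (name, location) pairs.  Returns the screen id whose
    character is nearest any milestone within `threshold` world units,
    or None if no character is close enough."""
    best, best_dist = None, threshold
    for screen_id, char_loc in camera_targets.items():
        for _name, m_loc in milestones:
            d = math.dist(char_loc, m_loc)  # Euclidean distance
            if d <= best_dist:
                best, best_dist = screen_id, d
    return best
```

Calling this periodically during the game session would re-trigger selection as characters approach milestone locations, consistent with the location-tracking behavior described above.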
Still referring to FIG. 5, user 405 operatively provides commentary content such as audio commentary, textual commentary, and/or visual commentary for the gameplay in the game session. The VR environment (here, the commentary studio) provides an immersive experience to solicit commentary content and associate/map portions of the commentary content to the media stream displayed by the active display screen. The portions of the commentary content are further associated and synchronized with the media stream for the active display screen, and broadcast to subscribers of the multi-user platform. In addition to the display screens (which show various POVs/viewing angles of gameplay), the VR environment also provides a variety of editing interfaces or tools. User 405 can view various aspects of the gameplay over the plurality of display screens, select active display screens/active media streams for periods of time, edit portions of active media streams (corresponding to respective active display screens), and generally provide commentary content about the gameplay using controller 415, headset 410, or other input devices.
FIG. 6 illustrates a schematic diagram 600 of a commentary module 605. Commentary module 605 includes an active screen module 610, a commentary module 615, and an integration module 620. In operation, commentary module 605 receives commentary content during a time period, e.g., commentary content 615a, 615b, 615c, 615d, and so on. Active screen module 610 monitors and identifies portions of active media streams for the same time period to create corresponding media content, e.g., media content 610a, media content 610b, media content 610c, media content 610d, and so on. Integration module 620 maps or associates the active media content with the commentary content based on the time period to synchronize the commentary content with the active media content. That is, user 405 selects an active display screen in the VR environment and provides commentary content corresponding to the active media stream associated with the active display screen. Commentary module 605 and its sub-modules (active screen module 610, commentary module 615, and integration module 620) collectively cooperate to receive the commentary content and synchronize the commentary content with portions of active media content (for an active media stream) to create synchronized content. The synchronized content is further passed to a broadcast module 625 for subsequent broadcast transmission to one or more subscribers to the multi-user platform. The synchronized content may include a commentary channel, which can provide commentary content and corresponding media content.
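A minimal sketch of the mapping integration module 620 might perform, assuming commentary items and active-screen periods are both stamped with session timestamps (all names below are hypothetical, not from the patent): each commentary window is paired with clips of whichever stream was active, splitting the commentary across periods when the active screen changed mid-commentary.

```python
from dataclasses import dataclass

@dataclass
class Clip:
    stream_id: str   # which media stream was active
    start: float     # seconds into the game session
    end: float

def synchronize(commentary, active_periods):
    """Pair each commentary item (start, end, payload) with clips of
    whichever stream was active during that window, splitting the
    commentary across periods when the active screen changed."""
    synced = []
    for c_start, c_end, payload in commentary:
        for period in active_periods:
            # overlap of the commentary window with this active period
            lo, hi = max(c_start, period.start), min(c_end, period.end)
            if lo < hi:
                synced.append((payload, Clip(period.stream_id, lo, hi)))
    return synced
```

A commentary item spanning an active-screen change thus yields two synchronized pairs, one per stream, which a broadcast module could then emit in order.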
Commentary module 605 may represent components or modules of device 200 (and/or multi-user platform 125). For example, commentary module 605 may perform operations embodied by commentary process/services 244 to provide an immersive VR environment, intuitively present media streams (or portions thereof) as well as editing interfaces/tools, etc., map media content with commentary content, and synchronize such content for subsequent broadcast.
FIGS. 7A, 7B, 7C, and 7D illustrate schematic diagrams of the VR environment, particularly showing an editing interface 705. The editing interface provides various editing tools that allow user 405 to generally edit (e.g., mark up, sketch over, rewind, set playback options, overlay gameplay statistics, predict player or gameplay behaviors, etc.) portions of media streams displayed by an active display screen to create modified media content. For example, FIG. 7A shows a drawing tools interface 710 that allows a user to create paths, shapes, symbols, and the like, as an overlay to the portion of the media stream displayed by the active display screen. Here, the “paths” tool is selected and user 405 creates a route or path highlighting a potential move by player 3 (P3) to pass player 2 (P2). FIG. 7B shows a playback tools interface 715, which manipulates playback time of the portion of the media stream, e.g., rewind, fast forward, slow motion, stop, replay, etc.
FIG. 7C shows a gameplay stats interface 720 that allows user 405 to select and overlay various gameplay statistics over the portion of the media stream. These gameplay stats can include any number of statistics such as play-specific statistics, level statistics, match-up statistics between one or more players, and the like. FIG. 7D shows a predictive model interface 725 that allows a user to select and overlay predictive behaviors over the portion of the media stream. The predictive model may be determined based on prior iterations of gameplay for a specific player, for a particular game, and so on.
It is appreciated that the editing interface 705 shown by FIGS. 7A, 7B, 7C, and 7D provides a variety of editing options to user 405 and that the illustrated options or tools are provided for purposes of illustration, not limitation. Editing interface 705 can include (or exclude) any number of editing tools as desired. Editing interface 705 provides intuitive tools that can enhance or supplement commentary content. In this fashion, a user can manipulate, enhance, or otherwise modify portions of media streams to create modified media content. This modified content may be included as part of the commentary content, which can be broadcast to subscribers of the multi-user platform, as discussed above.
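The predictive model behind interface 725 could, for example, be as simple as a frequency model over a player's prior gameplay iterations. This is an illustrative assumption (the patent does not specify any particular model, and the function name is hypothetical):

```python
from collections import Counter

def predict_next_move(move_history):
    """Predict a player's next move as the move they made most often in
    prior iterations of gameplay; returns None when there is no history."""
    if not move_history:
        return None
    # Counter.most_common(1) yields [(move, count)] for the modal move
    return Counter(move_history).most_common(1)[0][0]
```

A commentator could overlay the predicted move on the active display screen the same way the drawing tools overlay a path.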
FIG. 8 illustrates an example simplified procedure 800 for providing commentary related to gameplay, particularly from the perspective of device 200, commentary module 605, and/or multi-user platform 125 (or components thereof). For purposes of the discussion below, reference is made to a multi-user platform.
Procedure 800 begins at step 805 and continues to step 810, where the multi-user platform receives gameplay content for a game session. For example, the multi-user platform can host and provide game content to its subscribers. The subscribers interact with the game content and generate gameplay content for a game session. The multi-user platform receives such gameplay content and can further provide a virtual reality (VR) environment to a user. The VR environment can, for example, include the above-discussed commentary studio with graphical representations of media streams related to the gameplay, as shown in step 815. For example, a media stream for a particular POV/viewing angle may be represented by a graphical representation of a display screen (e.g., display screen 305, 310, 315, 320, etc.). In addition, the VR environment can provide editing interfaces that allow a user to manipulate media content shown by the display screens. These editing interfaces can overlay various types of graphics, information, playback options, statistics, and the like.
As discussed above, the VR environment provides an immersive experience to solicit user commentary related to gameplay content: here, the user can select a particular display screen as an active display screen and provide commentary content regarding portions of a media stream displayed by the active representation/active display screen. This commentary content can include audio content, visual content, textual content, and the like. In this fashion, the commentary content can include audio commentary from the user as well as graphics, playback options, gameplay statistics, overlays, and so on. Notably, in some embodiments, the multi-user platform may automatically set certain display screens as active based on gameplay milestones (discussed above).
The multi-user platform further synchronizes, at step 820, the commentary content received in a time period with portions of a media stream that correspond to an active display screen for the time period to create synchronized content. For example, the multi-user platform determines a time period for portions of the commentary content and selects a graphical representation in the VR environment as an active representation (e.g., an active display screen) for the time period. The multi-user platform further parses a portion of a media stream associated with the active representation for the time period and associates or maps the portion of the media stream with the commentary content. Notably, in some embodiments, the commentary content includes gameplay statistics (gameplay stats 720), playback modifications (playback tools 715), graphical overlays (e.g., drawing tools 710, predictive models 725), and the like.
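Step 820 above can be sketched concretely: parse the portion of the active media stream that falls inside the commentary time period and map the commentary onto it. The frame representation and function name below are assumptions for illustration, not details from the patent.

```python
def synchronize_commentary(frames, period_start, period_end, commentary):
    """Sketch of step 820: extract the portion of the active media
    stream whose timestamps fall in [period_start, period_end) and
    associate the commentary with it.  `frames` is a list of
    (timestamp, payload) pairs for the active stream."""
    portion = [f for f in frames if period_start <= f[0] < period_end]
    return {"media": portion,
            "commentary": commentary,
            "period": (period_start, period_end)}
```

The resulting record bundles the media portion and commentary for a single time period, ready for the broadcast of step 825.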
The multi-user platform broadcasts, at step 825, the synchronized content to its subscribers. In some embodiments, the synchronized content may be broadcast in conjunction with real-time gameplay, on a delay, or at a time after the game session ends.
Procedure 800 subsequently ends at step 830, but may begin again at step 810 where the multi-user platform receives gameplay content for the game session. Collectively, the steps in procedure 800 describe a process to provide commentary content in conjunction with gameplay content. It should be noted that certain steps within procedure 800 may be optional, and further, the steps shown in FIG. 8 are merely examples for illustration. Certain other steps may be included or excluded as desired. Further, while a particular order of the steps is shown and executed from the perspective of a particular device or system, this ordering is merely illustrative, and any suitable arrangement of the steps and/or any number of systems, platforms, or devices may be utilized without departing from the scope of the embodiments herein.
The techniques described herein, therefore, provide interactive commentary processes that combine an immersive simulated VR environment with gameplay content for a game session. These interactive commentary processes define simple and intuitive techniques to enhance spectator participation as well as spectator enjoyment.
While there have been shown and described illustrative embodiments of the commentary processes for VR environments (e.g., a commentary studio), it is to be understood that various other adaptations and modifications may be made within the spirit and scope of the embodiments herein. For example, the embodiments and certain functionality have been shown and described herein with relation to certain systems, platforms, hardware, devices, and modules. However, the embodiments in their broader sense are not as limited, and may, in fact, be employed in non-VR environments as well as employed by any combination of the devices or components discussed herein.
The foregoing description has been directed to specific embodiments. It will be apparent, however, that other variations and modifications may be made to the described embodiments, with the attainment of some or all of their advantages. For instance, it is expressly contemplated that the components and/or elements described herein can be implemented as software being stored on a tangible (non-transitory) computer-readable medium, devices, and memories (e.g., disks/CDs/RAM/EEPROM/etc.) having program instructions executing on a computer, hardware, firmware, or a combination thereof. Further, methods describing the various functions and techniques described herein can be implemented using computer-executable instructions that are stored or otherwise available from computer readable media. Such instructions can comprise, for example, instructions and data which cause or otherwise configure a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. Portions of computer resources used can be accessible over a network. The computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, firmware, or source code. Examples of computer-readable media that may be used to store instructions, information used, and/or information created during methods according to described examples include magnetic or optical disks, flash memory, USB devices provided with non-volatile memory, networked storage devices, and so on. In addition, devices implementing methods according to these disclosures can comprise hardware, firmware and/or software, and can take any of a variety of form factors. Typical examples of such form factors include laptops, smart phones, small form factor personal computers, personal digital assistants, and so on. Functionality described herein also can be embodied in peripherals or add-in cards. 
Such functionality can also be implemented on a circuit board among different chips or different processes executing in a single device, by way of further example. Instructions, media for conveying such instructions, computing resources for executing them, and other structures for supporting such computing resources are means for providing the functions described in these disclosures. Accordingly, this description is to be taken only by way of example and not to otherwise limit the scope of the embodiments herein. Therefore, it is the object of the appended claims to cover all such variations and modifications as come within the true spirit and scope of the embodiments herein.
Claims
- A method for gameplay view selection for commentary, the method comprising: storing information regarding one or more milestone events defined for a game title; receiving a plurality of different media streams that correspond to a plurality of in-game views of a game session of the game title; identifying one of the in-game views of at least one of the media streams that corresponds to one or more of the milestone events, wherein the one or more of the milestone events include in-game events completed based on in-game actions by user-controlled characters; selecting the at least one identified media stream to include in an active display; receiving commentary content that corresponds to the active display during a time period; synchronizing the commentary content as received relative to the time period of the at least one identified media stream; and distributing the synchronized content via a gameplay channel accessible to one or more subscriber devices.
- The method of claim 1, wherein the one or more milestone events include transition to a new level within the game title, interaction between players within the game title, or a predetermined amount of points achieved within the game title.
- The method of claim 1, wherein the one or more milestones are determined based on one or more game statistics from gameplay of the game title.
- The method of claim 1, wherein the one or more milestones are set in advance of gameplay of the game title.
- The method of claim 1, wherein the at least one identified media stream is identified based on a character location, proximity between the character location and a milestone location, or proximity between players.
- The method of claim 1, further comprising distributing the plurality of different media streams on a plurality of respective display screens.
- The method of claim 1, wherein one or more portions of the active display are modified using an editing interface to create modified media content.
- The method of claim 1, wherein one or more portions of the active display are modified using a drawing interface to create one or more paths, shapes, or symbols, wherein the paths indicate a potential move by a player in the game title.
- A system for gameplay view selection for commentary, the system comprising: a network interface to communicate in a communication network, wherein the network interface: receives a plurality of different media streams that correspond to a plurality of in-game views of a game session of a game title, and receives commentary content that corresponds to an active display during a time period; and a processor that executes instructions stored in memory, wherein execution of the instructions by the processor: identifies that one of the in-game views of at least one of the media streams corresponds to one or more milestone events defined for the game title, wherein information regarding the one or more milestone events is stored in the memory, and wherein the one or more of the milestone events are in-game events completed based on in-game actions by user-controlled characters, selects the at least one identified media stream to include in the active display, and synchronizes the commentary content as received relative to the time period of the at least one identified media stream; wherein the network interface distributes the synchronized content via a gameplay channel accessible to one or more subscriber devices.
- The system of claim 9, wherein the one or more milestone events include transition to a new level within the game title, interaction between players within the game title, or a predetermined amount of points achieved within the game title.
- The system of claim 9, wherein the one or more milestones are determined based on one or more game statistics from gameplay of the game title.
- The system of claim 9, wherein the one or more milestones are set in advance of gameplay of the game title.
- The system of claim 9, wherein the at least one identified media stream is identified based on a character location, proximity between the character location and a milestone location, or proximity between players.
- The system of claim 9, wherein the network interface further distributes the plurality of different media streams on a plurality of respective display screens.
- The system of claim 9, wherein one or more portions of the active display are modified using an editing interface to create modified media content.
- The system of claim 9, wherein one or more portions of the active display are modified using a drawing interface to create one or more paths, shapes, or symbols, wherein the paths indicate a potential move by a player in the game title.
- A non-transitory computer-readable storage medium having software encoded thereon, the software executable by a processor to perform a method for gameplay view selection for commentary, the method comprising: storing information regarding one or more milestone events defined for a game title; receiving a plurality of different media streams that correspond to a plurality of in-game views of a game session of the game title; identifying one of the in-game views of at least one of the media streams that corresponds to one or more of the milestone events, wherein the one or more of the milestone events are in-game events completed based on in-game actions by user-controlled characters; selecting the at least one identified media stream to include in an active display; receiving commentary content that corresponds to the active display during a time period; synchronizing the commentary content as received relative to the time period of the at least one identified media stream; and distributing the synchronized content via a gameplay channel accessible to one or more subscriber devices.
- The method of claim 1, further comprising displaying the plurality of in-game views in a virtual reality (VR) environment simulating a commentary studio, wherein the commentary content is received through a user interacting with the VR environment.
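The claimed method (storing milestone definitions, selecting the media stream whose view shows a milestone event as the active display, and synchronizing commentary received during its time period) can be illustrated with a minimal sketch. This is not an implementation from the patent; all class, function, and event names here are hypothetical, chosen only to mirror the claim steps.

```python
from dataclasses import dataclass

@dataclass
class MediaStream:
    # One in-game view of the game session (hypothetical structure).
    stream_id: str
    events: list  # in-game events visible in this view

@dataclass
class Commentary:
    text: str
    timestamp: float  # seconds from session start

def select_active_stream(streams, milestone_events):
    """Identify the first stream whose view corresponds to a stored milestone event."""
    for stream in streams:
        if any(event in milestone_events for event in stream.events):
            return stream
    return None

def synchronize(commentary, time_period):
    """Keep only commentary received within the active display's time period."""
    start, end = time_period
    return [c for c in commentary if start <= c.timestamp <= end]

# Stored milestone definitions for the game title (hypothetical examples).
milestones = {"level_transition", "boss_defeated"}

streams = [
    MediaStream("cam_overview", ["idle"]),
    MediaStream("cam_player2", ["boss_defeated"]),
]

active = select_active_stream(streams, milestones)  # picks "cam_player2"
synced = synchronize(
    [Commentary("Pre-game chat", 2.0), Commentary("What a finish!", 12.0)],
    time_period=(10.0, 20.0),
)  # keeps only the commentary inside the active display's time period
```

In the claims, the synchronized content would then be distributed via a gameplay channel to subscriber devices; that transport step is omitted here.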