U.S. Pat. No. 9,751,011
SYSTEMS AND METHODS FOR A UNIFIED GAME EXPERIENCE IN A MULTIPLAYER GAME
Assignee: Electronic Arts Inc.
Issue Date: May 24, 2013
Abstract
In some embodiments, a system comprises game state information, a first user device, a second user device, and a processing server. The first and second user devices may include first and second user interface modules configured to receive first and second user selections, respectively, associated with gameplay of a multiplayer game. The processing server may include a communication module, a simulation module, and a rendering module. The communication module may be configured to receive the first and second user selections. The simulation module may be configured to generate simulation results based on the game state information, game rules, and the first and second user selections. The rendering module may be configured to render a first video based on the simulation results and render a second video based on the simulation results, the first and second videos to be displayed by the first and second user devices, respectively.
Description
DETAILED DESCRIPTION OF THE INVENTION
Typically, video game devices such as consoles (e.g., XBOX) receive user input from a controller, simulate a game result based on the input and a state of the currently active game (i.e., the current game state), render video and/or images associated with the simulated result, and display the video and/or images on a screen (e.g., a television screen). Those skilled in the art will appreciate that both simulation and rendering require resources.
In various embodiments, one or more processes including, for example, receiving user input, simulating, and/or rendering video and/or images associated with the simulated result may be performed on different digital devices. A digital device is any device with memory and a processor.
In various embodiments, a local user device may generate a user interface for a game (e.g., based on game rules and a current game state) and receive user input. The simulation phase (i.e., the simulation of a game result based on the input and based on the state of the currently active game) may be performed by a different digital device in communication with the local user device. Further, the rendering phase (i.e., the rendering of video and/or images associated with the simulated result) may be performed on yet another digital device in communication with the local user device.
By splitting the functions of receiving user input, the simulation phase, and/or the rendering phase, resources on the local user device may be conserved while options for streaming video to multiple users and/or multiple-device play may be expanded. For example, a local user may input a selection in a local user device. The selection and/or current game state may be shared with a server that may perform the simulation. Rendering based on the simulation result may be performed by the same server or a different server (e.g., a plurality of digital devices). A server that performs rendering may provide different renderings of the simulation result including multiple viewpoints, multiple perspectives, multiple resolutions, and/or at multiple dimensions (e.g., depending on hardware requirements of the user device which is to display a rendering). The rendering(s) could be provided back to the local user device and/or shared with one or more remote users with other remote user devices.
In some embodiments, game play is deterministic. A deterministic game is a game with game rules that will produce the same simulation given the same inputs (e.g., user selections and current game state). Game rules include executable instructions that control gameplay and/or the simulation. Multiple simulations of a deterministic game utilizing the same input and the same game state may provide the same simulation result. The deterministic nature of game play in these embodiments may allow for simulation and rendering on multiple devices without impacting local game device resources.
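For illustration only (not part of the described embodiments), determinism of this kind can be sketched in a few lines; the rules, field names, and football-style values below are invented for the example.

```python
import hashlib
import json

def simulate(game_state: dict, selections: dict) -> dict:
    """Hypothetical deterministic game rules: the same state and the same
    selections always produce the same result (no randomness, no hidden inputs)."""
    # Invented rule: a run play gains 7 yards unless the defense blitzes.
    gained = 7 if selections["offense"] == "run" and selections["defense"] != "blitz" else 2
    return {
        "ball_position": game_state["ball_position"] + gained,
        "down": game_state["down"] + 1,
    }

def digest(state: dict) -> str:
    # Canonical serialization so identical states hash identically.
    return hashlib.sha256(json.dumps(state, sort_keys=True).encode()).hexdigest()

state = {"ball_position": 20, "down": 1}
picks = {"offense": "run", "defense": "zone"}

# A local device and a processing server running the same rules on the
# same inputs reach byte-identical results.
assert digest(simulate(state, picks)) == digest(simulate(state, picks))
```

Because the result depends only on the inputs and the current game state, any device holding both can reproduce the simulation independently.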
For example, a local user device may receive input from a user, simulate a result based on the received input and the current game state, render the simulation result, and display the output of the rendering. The local video game device may also provide the input from the user and/or the current game state to a processing server that may also perform a simulation that generates the same simulation result as the local user device. Further, like the local user device, the processing server may render the simulation result and provide the output of the rendering and/or stream video and/or audio based on the rendering to other digital devices. Those skilled in the art will appreciate that the local video game device may perform the simulation, rendering or neither. Similarly, the processing server may perform the simulation, rendering, or neither.
In one example, a deterministic game may be turn-based. A turn-based game may receive selections from a player (in a single player game) or multiple players (in a multiplayer game). Subsequently, a simulation is performed to generate a simulation result. The simulation may be based on the game rules (e.g., that govern the simulation) as well as the user selection(s) and the current game state. The simulation result may be rendered and displayed to the player(s) and the current game state may be updated based on the simulation result. The player(s) may then receive another turn whereby additional selection(s) may be made. Choices may be dictated by the game rules and the updated current game state. Examples of turn-based games include, but are not limited to, sports games such as football, baseball, or the like.
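The turn cycle described above can be sketched as a simple loop; the callbacks and stand-in rules below are hypothetical, and in the described embodiments the simulate and render steps may run on remote digital devices rather than locally.

```python
def run_turn(game_state, gather_input, simulate, render):
    """One turn of a hypothetical turn-based game: gather selections,
    simulate under the game rules, render the result, update the state."""
    selections = gather_input(game_state)                  # UI phase on the user device
    result, new_state = simulate(game_state, selections)   # may run on a simulation server
    render(result)                                         # may run on a rendering server
    return new_state                                       # becomes the next current game state

# Minimal stand-ins for a two-player turn:
state = {"turn": 0}
state = run_turn(
    state,
    gather_input=lambda s: {"p1": "pass", "p2": "blitz"},
    simulate=lambda s, sel: (f"turn {s['turn']}: {sel}", {"turn": s["turn"] + 1}),
    render=print,
)
assert state == {"turn": 1}
```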
By splitting simulation and/or rendering from the function of receiving user input for gameplay, options for rendering multiple videos from the same simulation result become available. In one example, multiple videos of different perspectives and/or viewpoints of the same game may be rendered without impacting a player's local resources. Further, when a game is deterministic, players have the option of retrieving saved game states to replay the same game or explore possibilities (e.g., “what would have happened”) if different choices were made.
Those skilled in the art will appreciate that, utilizing at least one or more of the systems and methods described herein, a player may play the same game on multiple devices thereby allowing the player to better engage in the experience. For example, a player may play a football game on their home personal computer (PC). The player may play a first down by selecting options based on the current game state of the football game. Once committing the selection, the PC may simulate the results based on the player's selected options and the current game state. The PC may subsequently render and display the simulated result. The PC may then generate a new game play state.
In some embodiments, the player may pause play, log out of the PC, and log back into the game utilizing a different digital device (e.g., a smartphone). When the player logs out or otherwise provides permission, the PC may provide the new game play state to a processing server over a network. When the player logs back into the game utilizing a different digital device, the processing server may recognize the different digital device (e.g., via authentication during login) and receive a request to continue playing the game. The processing server may, in some embodiments, provide the new game play state to the different digital device and perform simulation and/or rendering for the different digital device. As a result, the player may begin playing the game right where they left off in game play.
In a typical video game scenario, a number of parameters (i.e., game state conditions) are saved which dictate the conditions under which the game simulation will act. Game invocation typically comprises a loop 100 (see FIG. 1).
FIG. 1 is a block diagram 100 of different user devices 104, 106, 108, 110, 112, 114, and 116 which may engage in one or more games in some embodiments. Each user device 104, 106, 108, 110, 112, 114, or 116 may be a digital device that communicates over communication network 102. A digital device is any device with a processor and memory. Although seven different user devices are depicted in FIG. 1, those skilled in the art will appreciate that there may be any number of user devices coupled to any number of networks. Various examples are displayed in FIG. 1, which is not intended to be exhaustive.
While many user devices on which to play games are different, they may share some common characteristics. The user devices may have some method of capturing user input, such as a keyboard, remote control, touchscreen, or the like. Different user devices may also have some method of displaying a two-dimensional image using a display such as a TV screen (e.g., LED, LCD, or OLED) or touchscreen. The user devices may have some form of CPU, although processing capability often varies widely in performance. Further, the user devices may have a connection to the Internet, such as an Ethernet connection, WiFi connection, or mobile phone cell data connection. In addition, using the combination of the two-dimensional image display and processing capability, the user devices may be able to receive and display video imagery, either by downloading a file or by streaming image data in or near real time from a remote streaming server. In some embodiments, a way is presented for multiple users to interact with a game and receive the same experience as every other player, regardless of their individual platform.
In various embodiments, one or more players may utilize each user device to play one or more games (e.g., a turn-based game). Each user device 104, 106, 108, 110, 112, 114, and 116 may display a user interface associated with the desired game. The user interface may be configured to receive user selections (e.g., user input) for game play. For example, there may be any number of menus that provide opportunity for player selection via buttons, radio buttons, check boxes, sliders, text fields, selectable objects, moveable objects, and/or the like.
The content of the user interface may be generated and/or selected based on the game rules as well as the current game state. Game rules and the current game state may dictate options from which the player may choose.
Once the player provides selection(s), in some embodiments, a simulation may be performed to determine the result of the player selection(s) in the context of game play (e.g., utilizing the current game state). In some embodiments, the simulation is conducted locally (e.g., a player utilizing user device 106 inputs selection(s) and the user device 106 performs the simulation) based on the game rules. In various embodiments, the simulation may be performed by another digital device. For example, the user device 106 may provide the selection(s) and/or the current game state to the processing server 118 via the communication network 102. The simulation server 120 may perform the simulation based on game rules, the player selection(s), and the current game state. In various embodiments, the simulation server 120 and/or any digital device may provide the selection(s) and/or current game state to any digital device which may perform the simulation.
Once the simulation is complete, the simulation result may be rendered. The simulation result from any of user devices 104, 106, 108, 110, 112, 114, or 116 or the processing server 118 may be rendered by any digital device. In some embodiments, the rendering server 122 renders the simulation result from the simulation server 120. The video and/or images from the rendering may be displayed on one or more of the user devices 104, 106, 108, 110, 112, 114, and/or 116.
In some embodiments, the rendering server 122 may stream and/or download video and/or images based on one or more renderings of the simulation result. For example, the rendering server 122 may broadcast the same video and/or images to one or more of the user devices. In another example, the rendering server 122 may perform multiple renderings of the same simulation result from different viewpoints and provide video and/or audio to different user devices. Renderings of the same simulation result may include video and/or audio which depict different views of the same gameplay.
In various embodiments, the rendering server 122 may provide game display information to the user devices 104, 106, 108, 110, 112, 114, and/or 116 to assist in rendering the simulation result. Game display information may include any information that may assist the receiving device to display information to a user (e.g., by providing the user device text, video, or images) or may assist the receiving device to render video and/or images. In one example, the game display information may be text (e.g., a text message) that may be displayed as a part of the game (e.g., scrolling text). In another example, the game display information may comprise vector and quaternion data for device specific rendering of results. In still other examples, the game display information may include position data (e.g., of players on a field), statistics of players, how scores are made, play by play information, or the like. The position data, statistics, information on scores, and/or play by play information may be utilized by the receiving user device to display (e.g., render video or display text) all or part of the simulation result. Game display information may comprise video and/or images rendered by the rendering server 122.
In some embodiments, game display information is any information that assists a user device to render gameplay and/or includes rendered video and/or images. For example, game display information may comprise position data of players on a field, statistics of players, how the players scored, play by play information, or the like. All or some of the game display information may be utilized by the receiving user device to render game play (e.g., render text, video, or images). For example, the user device may render video of football players completing a play based on the simulation result and play by play information of the game display information. Those skilled in the art will appreciate that the game display information may be any information, including previously rendered video and/or images, information that may be displayed by a user device, or any information that assists in the local rendering of video and/or images.
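As one hypothetical illustration of game display information carrying vector and quaternion data alongside play by play text (every field name here is invented for the sketch, not defined by the patent):

```python
import math

# Hypothetical game display information for device-side rendering:
# positions as 3-D vectors, orientations as unit quaternions (x, y, z, w),
# plus human-readable play by play text.
game_display_info = {
    "players": [
        {"id": 7,
         "position": [12.0, 0.0, 33.5],               # field coordinates (vector)
         "orientation": [0.0, 0.7071, 0.0, 0.7071]},  # rotation (quaternion)
    ],
    "play_by_play": "Quarterback scrambles right for six yards.",
}

# A receiving user device could validate the data before rendering locally.
for player in game_display_info["players"]:
    assert len(player["position"]) == 3
    assert abs(math.hypot(*player["orientation"]) - 1.0) < 1e-3  # unit quaternion
```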
User device 104 may include any laptop, notebook, ultrabook, tablet, or the like. User device 106 may include any console device such as a Microsoft Xbox or Sony PlayStation. The user device 106 may include hand-held gaming devices such as the Nintendo DS. User device 108 includes any smart media device (e.g., streaming device) that may, for example, be coupled to a television, including, for example, Apple TV, Roku, SlingBox, Boxee Box, or the like.
User device 110 may include any smart phone, cell phone, PDA, tablet, wearable digital device (e.g., smartwatch or Google Glass), media device (e.g., iPod), or the like. User device 112 includes any computer such as a personal computer, desktop computer, mainframe, or the like. User device 112 may include any digital device, for example, that plays one or more games over a browser. User device 114 includes any server or the like. User device 116 includes any television or smart television (e.g., portable or nonportable). Those skilled in the art will appreciate that the user devices may comprise any number of digital devices.
Communication network 102 may include any network or combination of networks. In some embodiments, the communication network 102 comprises the Internet. The communication network 102 may include any number of wide area networks (WANs), local area networks (LANs), or the like. All or part of the communication network 102 may be wireless or wired. For example, any number of the user devices may communicate wirelessly via the communication network 102.
Processing server 118 is any digital device that comprises a simulation server 120 and a rendering server 122. The simulation server 120 may receive user input (e.g., game selections from a user) and/or a current game state to simulate a game result (i.e., a simulation result). The rendering server 122 may receive the simulation result and render the video and/or images based on the simulation result. Video and/or images based on the rendering(s) may be provided to any number of user devices.
Although the processing server 118 is depicted as encompassing both the simulation server 120 and the rendering server 122, those skilled in the art will appreciate that the processing server 118 may, in some embodiments, include the simulation server 120 or the rendering server 122. For example, the simulation server 120 of the processing server 118 may communicate with the rendering server 122 of a different processing server 118. In various embodiments, the simulation server 120 and/or the rendering server 122 are independent servers on different digital devices.
Those skilled in the art will appreciate that a server may include any number of digital devices. For example, the processing server 118 may comprise any number of servers. Similarly, the simulation server 120 may be any number of servers. For example, the simulation server 120 may comprise any number of servers in communication over a network such as the cloud. Similarly, the rendering server 122 may be any number of servers. For example, the rendering server 122 may comprise any number of servers in communication over the network (e.g., the cloud).
FIG. 2 is a flowchart 200 of a game simulation and rendering process in some embodiments. In step 202, input from one or more users (where a user may be a local player with a direct connection with the game or a remote player connected using some form of network such as an Ethernet connection) is gathered. For example, multiple players via different user devices (e.g., user devices 104, 106, 108, 110, 112, 114, and/or 116) may log into the processing server 118 to engage in a multiplayer game. Each of the different user devices may display user interfaces to the players to receive user input (e.g., to receive user selections during the player's turn). All or some of the user inputs may be provided to the processing server 118.
The processing server 118 may maintain a current game state or, in some embodiments, the current game state may be provided to the processing server 118 from one or more other digital devices (e.g., user devices 104, 106, 108, 110, 112, 114, and/or 116). A game state includes those parameters, values, and other indicators which may be used to describe the game at a point in time. In various embodiments, the game state may include the state of many game processes, programs, and the like that may be used to control game play.
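A game state of this kind might be represented, purely for illustration, as a small serializable snapshot that a user device can hand to the processing server; the fields below are invented for the sketch.

```python
import json
from dataclasses import asdict, dataclass, field

@dataclass
class GameState:
    """Hypothetical snapshot of the parameters, values, and indicators
    that describe the game at a point in time."""
    turn: int = 0
    down: int = 1
    ball_position: int = 20
    score: dict = field(default_factory=lambda: {"home": 0, "away": 0})

# Serialize for transfer to the processing server, then restore on arrival.
state = GameState(turn=3, down=2, ball_position=45)
wire = json.dumps(asdict(state), sort_keys=True)
assert GameState(**json.loads(wire)) == state
```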
In step 204, user input as well as the current game state 212 (i.e., current game state conditions) are applied to a simulation. In some embodiments, the simulation server 120 of the processing server 118 may receive the user inputs (e.g., game selection(s)) from one or more of the user devices as well as the current game state 212. The simulation server 120 may simulate the game play result based on the user inputs, the current game state, and game rules. The simulation server 120 may generate a simulation result based on the simulation as well as a new game state (e.g., an updated current game state based on the simulation).
The current game state may be modified according to the output of that simulation (see New Game State 206). In various embodiments, the simulation server 120 maintains the new game state 206 (e.g., the modified game state) and/or provides the new (e.g., updated) game state 206 to one or more other digital devices (e.g., user devices 104, 106, 108, 110, 112, 114, and/or 116).
In step 208, video and/or images are displayed using the updated game state 206 and/or the output of the simulation to display the results to the users. In some embodiments, the rendering server 122 renders the video and/or images based on the simulation result of step 204 (e.g., the simulation result from the simulation server 120). The rendering server 122 may also render the video and/or images based on the updated game state 206 (e.g., the new game state).
In various embodiments, the rendering server 122 may render different video and/or images of the same simulation result but from different camera angles and/or viewpoints. In some embodiments, the processing server 118 and/or the rendering server 122 track the viewpoints and/or preferred angles of different players. The rendering server 122 may render the viewpoints and/or preferred angles and provide the appropriate rendered view and/or images at the different viewpoints and/or preferred angles to the correct user device(s). In some embodiments, multiple players may receive the same view (e.g., the same rendered video and/or images). For example, the rendering server 122 may broadcast the same rendered video and/or images to different user devices. The rendering server 122 may perform multiple renderings and provide each rendered video to a different digital device. Those skilled in the art will appreciate that the rendering server 122 may comprise a cluster of digital devices capable of rendering multiple videos and/or images to provide to any number of user devices without consuming resources of the user devices.
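One way to picture per-viewpoint rendering and broadcast is the following sketch; the renderer is a stand-in, and the device names and viewpoints are hypothetical.

```python
def render(simulation_result: str, viewpoint: str) -> str:
    """Stand-in for a real renderer: one output per viewpoint."""
    return f"video[{viewpoint}] of {simulation_result}"

# Tracked viewpoint preferences for three hypothetical user devices.
preferences = {"device_a": "sideline", "device_b": "end_zone", "device_c": "sideline"}
result = "play 42 complete"

# Render each distinct viewpoint once; devices sharing a viewpoint receive
# the same stream (a broadcast), without consuming device resources.
streams = {vp: render(result, vp) for vp in set(preferences.values())}
deliveries = {dev: streams[vp] for dev, vp in preferences.items()}

assert deliveries["device_a"] == deliveries["device_c"]  # shared broadcast
assert deliveries["device_a"] != deliveries["device_b"]  # different viewpoint
assert len(streams) == 2                                 # one render per viewpoint
```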
In some embodiments, the rendering server 122 provides game display information to one or more different user devices. The game display information may comprise rendered video and/or images or information to assist in the rendering of video and/or images. In some embodiments, a first user device and a second user device involved in playing the same game (e.g., as opposing sides in a football game) may receive similar game display information from the rendering server 122 to assist in local rendering. The first and second user devices may each, independently of the other, render all or some of the video based on the game display information. For example, the first user device may receive play by play information which may be utilized to render video from the first player's perspective of a game. The second user device may receive the same information, more information, or different information (e.g., vector information) to assist in the rendering of video and/or images from the second player's perspective.
In step 210, the processing server 118 may optionally copy the new game state 206. For example, the new game state 206 may become the current game state 212. Subsequently, user inputs may again be received based on the new game state and the game rules, and the simulation processed again.
FIG. 3 is a flowchart 300 depicting a selection phase 302 and a processing phase 304 for game simulation and rendering in some embodiments. For some classes of games, game invocation can be thought of as comprising two phases: a selection phase 302 and a processing phase 304. The selection phase 302 may comprise comparatively light use of processing resources and may comprise one or more user interface (UI) screens on a digital device allowing the user to set up a number of variables that will be used in the processing phase 304. The selection phase 302 may have simpler graphics than the processing phase 304 and, as a result, may require fewer resources such as a GPU. In some embodiments, the processing phase 304 takes play selections from players as inputs, runs the simulation, and renders the results, often in a high-fidelity manner. Typically, the processing phase 304 consumes more processing power than the selection phase 302, as the system renders a large amount of information as well as runs simulations.
In various embodiments, elements of FIG. 2 are decoupled such that parts may be handled using the player's local user device (e.g., user devices 104, 106, 108, 110, 112, 114, and/or 116) and parts may be handled by a remote computing device, such as a cloud computing resource (e.g., processing server 118), seamlessly to the user, which may result in improved visual fidelity as well as a unified experience for any number of different user devices.
In one example, parts of the game loop are decoupled such that the local user device still handles the selection phase, but the output from the selection phase 302 is sent to a remote computing device (e.g., processing server 118) which holds the game state, performs the processing part of the loop (e.g., performs the simulation), and then generates (e.g., renders) an output stream which is sent back to the user's local device for display. For example, in a football management game, gathering user input would include displaying options for a user to choose a play, feeding that play into a simulation engine which evaluates the user's play against the opponent's chosen play, and displaying the results of the play as a rendered video.
In various embodiments, the game captures the player's input locally, but instead of processing, the user's input is sent to the remote computing device where the current game state and the user's input are processed through the game simulation. At the completion of the simulation, the remote computing device may render the game output and transmit the output back to the user's local device where the output is displayed (e.g., as video). If the game play simulation engine is considered to be deterministic (i.e., given the same user input and starting game state, the end game state and output result will always be the same), then by saving the user input as well as the game conditions as a user state, the simulation result and/or game state can be processed and rendered at any point in time. Since the processing of the game is effectively decoupled from both the user input as well as the rendering to the output display, the user state can be rendered multiple times at different resolutions, camera angles, etc., with no loss of the original user intention.
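The user-state idea above can be sketched as follows; the save/replay helpers and the stand-in rules are hypothetical names invented for the example.

```python
import json

def save_user_state(game_state: dict, user_input: dict) -> str:
    """A 'user state' is just the inputs plus the starting game state;
    with deterministic rules it fully determines the outcome."""
    return json.dumps({"state": game_state, "input": user_input}, sort_keys=True)

def replay(user_state_blob: str, simulate) -> dict:
    blob = json.loads(user_state_blob)
    return simulate(blob["state"], blob["input"])

# Hypothetical deterministic rules for the sketch.
rules = lambda s, i: {"score": s["score"] + (3 if i["play"] == "field_goal" else 0)}

saved = save_user_state({"score": 7}, {"play": "field_goal"})

# The same user state can be processed again at any later time, and rendered
# at any resolution or camera angle, with the same simulated outcome.
assert replay(saved, rules) == replay(saved, rules) == {"score": 10}
```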
In step 306 of selection phase 302, a copy of the current game state 324 is received or generated by a local user device (e.g., user device 104). The current game state 324 may be received from the processing server 118 or any digital device. In some embodiments, the current game state 324 may be retrieved from game rules or updated by the local user device.
In step 308, the local user device displays one or more user interface(s) to inform the player of the state of the game, provide instructions, and/or request selections (e.g., input) from the player. The local user device may generate the one or more user interface(s). In some embodiments, the local user device may download and/or receive a stream to display one or more of the user interface(s).
In step 310, the local user device gathers user input (e.g., player selection(s)). The local user device may collect user input in any number of ways. In some embodiments, when the user makes selections and/or provides inputs, a game state is modified and/or new user interfaces are provided to request additional information from the player. In some examples, choices by a player may result in additional or different options that were not previously available.
In some embodiments, the local user device detects when sufficient user input is collected. In various embodiments, the local user device or a game on the local user device may receive a command from the player indicating that the simulation is to be performed based, at least in part, upon the received user inputs (e.g., without waiting or requesting additional user input). In some embodiments, the local user device may detect when sufficient user input is collected based on game rules.
In various embodiments, once sufficient user input is collected, the local user device may modify a new game state by applying all or part of the inputs to the previous game state. For example, a current game state may be modified based on the user input. In some embodiments, the results of a simulation are based on user inputs and a current game state. In various embodiments, the results of the simulation are based, at least in part, on the new game state (e.g., the application of all or some of the user inputs to the current game state).
In step314, the game and/or the player indicates readiness for a turn. In some embodiments, the player may indicate a desire to add additional selections or change previous choices. In this example, the previous game state may be restored and/or previous user interfaces may be provided to the player to make additional or different choices. If the player indicates readiness for a turn or simulation to occur, the processing phase304may process the simulation in step318.
In some embodiments, once the player and/or a game indicates a readiness for a turn in step314, a local user device may provide the user inputs and/or a current game state to the processing server118to perform the simulation and rendering. In some embodiments, the local user device may modify a new game state based at least in part on the user inputs. The local user device may subsequently provide the new game state, the user inputs, and/or the previous game state to the processing server118to perform the simulation.
The simulation of step318may be based on the game rules, user inputs (e.g., player selections), a modified new game state, and/or the previous game state. The simulation may depend on the game (e.g., the rules for the simulation) as well as the new game state, the user inputs, and/or the previous game state. The simulation server120may perform the simulation.
In step320, the rendering server122of the processing server118may render the video and/or images based on the results of the simulation in step318. In some embodiments, the rendering may also be based on the game rules, new game state, the user inputs, and/or the previous game state. The rendering server122may stream or otherwise provide the results of the rendering to the user device which may display the rendered video and/or images.
In step 322, the game state 316 is updated based on the simulation results, new game state, the user inputs, game rules, and/or the previous game state 324. The new game state may be provided to the user device. Based on the new game state, the user device may generate different user interfaces to receive different user inputs. The updated game state may also be saved by the processing server 118 to be used in the next simulation and/or rendering.
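The turn flow of steps 314 through 322 can be sketched as follows. This is an illustrative sketch only; the function names, the dictionary-based game state, and the toy scoring rules are assumptions and not the implementation described in the specification.

```python
# Hypothetical sketch of the turn flow: apply inputs to the previous game
# state, simulate based on the game rules, then update the stored game state.
def apply_inputs(game_state, user_inputs):
    """Generate a new game state by applying the collected inputs."""
    new_state = dict(game_state)
    new_state["pending_inputs"] = list(user_inputs)
    return new_state

def simulate(game_rules, new_state):
    """Produce a simulation result from the game rules and the new game state."""
    score = sum(game_rules.get(i, 0) for i in new_state["pending_inputs"])
    return {"score_delta": score}

def process_turn(game_rules, game_state, user_inputs):
    new_state = apply_inputs(game_state, user_inputs)        # step 314 readiness
    result = simulate(game_rules, new_state)                 # step 318 simulation
    # Step 322: update the stored game state based on the simulation result.
    updated = dict(new_state,
                   score=new_state.get("score", 0) + result["score_delta"],
                   pending_inputs=[])
    return result, updated

result, state = process_turn({"run": 3, "pass": 7}, {"score": 10}, ["run", "pass"])
```

In this sketch the updated state would be saved (as in step 322) for use in the next simulation and rendering pass.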
In some situations, such as consumer game consoles, it may be desirable to allow the user to interact with the game simulation in real time but save user states at significant points for later processing and rendering. In this mode, a user device such as a console (e.g., user device 106) may be capable of running the same game simulation which is run on a remote computing device (e.g., the processing server 118). In this example, a given user state for the user device may generate the same output when run on a remote computing device. This could, for example, allow a different user at a different user device to view a game in progress in near real time without affecting the connectivity or processing capability of the initial user device running the game.
In some embodiments, a further benefit is that, by virtue of the very small size of a user state (input plus game state), greater compression may be achieved than with the equivalent video output. In addition, the rendering capabilities of the rendering server 122 may improve over time.
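The size advantage of transmitting a user state rather than video can be illustrated with a rough comparison; the state layout and the frame dimensions below are illustrative assumptions, not figures from the specification.

```python
import json
import zlib

# A small user state (inputs plus game state) serializes to far fewer bytes
# than even one uncompressed video frame -- hypothetical example values.
user_state = {"inputs": ["pass_left", "blitz"],
              "game_state": {"down": 3, "yards_to_go": 7}}
state_bytes = zlib.compress(json.dumps(user_state).encode())

frame_bytes = 1280 * 720 * 3  # one raw 720p RGB frame, for comparison

assert len(state_bytes) < frame_bytes
```

Since a turn's user state is orders of magnitude smaller than the video it produces, storing or streaming states instead of frames reduces bandwidth and lets a newer renderer regenerate better-looking output later.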
In some embodiments, rather than the output stream comprising encoded video of a common format such as H.264, the output stream (e.g., the game display information) could comprise a scene graph of 3D commands tailored to the specific graphics system of the local user device. This may require the remote computing device to have some information about the GPU of the local user device. By providing the user's local device with a 3D scene graph, rather than a 2D rendered video, the user may optionally view the result from different camera angles, or choose to remove or highlight one or more elements of the output.
Some embodiments allow for multiuser play. For example, the inputs from several users are used as input for the game logic and the game simulation. The display rendering may involve the same scene for each player or different scenes for one or more players (i.e., rendered from each player's point of view). In addition, multiple views may be rendered and each player may select the view they want before receiving the video output.
In some embodiments, user states may contain everything needed to render a scene. As a result, in various embodiments, user states may also be used in a number of ways unrelated to direct user input. For instance, a virtual TV channel may run video highlights from multiple users for broadcast. In some embodiments, the user state for multiple different users may be used to render video and/or images sequentially and edit the video and/or images together to make a highlight reel. In various embodiments, users may replay scenarios starting with a user state but modifying one or more game parameters to achieve a different goal. For instance, in a football game, a user may choose to run a user state where the original set of inputs ended in a touchdown but change the defensive play to a different one to see if they can stop the touchdown.
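The football example above can be sketched as follows. The play names and the toy rule that a matching defensive call stops the play are hypothetical; the point is only that a deterministic simulation replayed from a saved user state with one modified parameter can yield a different outcome.

```python
# Hypothetical deterministic play simulation: replaying a stored user state
# with one modified parameter (the defensive play) can change the outcome.
def simulate_play(offense, defense):
    """Toy rule set: the touchdown stands unless the defense matches the offense."""
    return "stopped" if defense == offense else "touchdown"

# The saved user state: the original inputs ended in a touchdown.
saved_state = {"offense": "screen_pass", "defense": "zone"}
original = simulate_play(saved_state["offense"], saved_state["defense"])

# The user replays the same state but swaps in a different defensive play.
replay_state = dict(saved_state, defense="screen_pass")
replayed = simulate_play(replay_state["offense"], replay_state["defense"])
```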
In various embodiments, the processing server 118 (aka “SkyCam”) may include an advanced simulation engine (which performs the simulation processing discussed above) and an HD rendering engine (which performs the rendering discussed above). The advanced simulation engine may comprise the simulation server 120. The HD rendering engine may comprise the rendering server 122. The processing server 118 may be communicatively coupled to disparate devices including servers, other simulators (e.g., an on-demand fantasy simulator for a sports game), consoles, web browsers, mobile devices, set-top boxes and smart TVs as shown in FIG. 1.
In one example of operation in a video game such as a sports game where players can be members of virtual leagues, devices such as consoles can provide instant highlights, news and simulation from game play. Mobile devices can allow players to play connected careers anytime, anywhere with anyone. Season scrapbooks can be created for every game the player has ever played by storing historical game play data associated with a user.
FIG. 4 is a block diagram of a simulation server 120 in some embodiments. The simulation server 120 comprises a controller 402, an authentication module 404, a game state module 406, an input module 408, a simulation module 410, a game and device datastore 412, and a game state datastore 414.
The controller 402 may control one or more modules and/or datastore(s) of the simulation server 120. In various embodiments, the simulation server 120 may provide simulation results for different games. The controller 402 may be configured to identify a player's game, retrieve game rules (e.g., from the game and device datastore 412), determine if the player's game is a multiplayer game, receive one or more user inputs, determine when sufficient user input(s) are received, receive or maintain at least one current game state associated with the identified game, select an appropriate simulation for the game(s) being played (e.g., a simulation based on the identified game rules), execute the selected simulation based on user input(s) and/or current game state, update the game state based on simulation results, and provide simulation results and/or the updated game state for rendering.
Different processing servers 118, simulation servers 120, and rendering servers 122 may provide services for different games. In some embodiments, when a user activates a game on a user device, the user device may communicate with a processing server 118. The processing server 118 may identify the user device, player, and/or the activated game. Based on the user device, player, and/or activated game, the processing server 118 may provide simulation services (e.g., perform simulations and provide results), render video and/or images, or direct communication to one or more different processing servers 118 which may provide services (e.g., simulations and/or rendering) for the activated game, user device and/or player. The different processing server 118 may communicate directly with the user device.
In various embodiments, a central communication server initially communicates with the user device and identifies the user device, player, and/or the game. The central communication server may direct communication between the user device and one or more processing servers 118 based on the user device, player, and/or the game (e.g., different processing servers 118 may provide simulation and/or rendering services to different games or sets of different games). In some embodiments, different processing servers 118 may also provide different services to different game types. For example, a first processing server 118 may provide simulation and/or rendering for a single player version of a game while another processing server 118, either alone or in combination with the first processing server 118, may provide simulation and/or rendering for a multiplayer version of the same game.
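The central communication server's routing decision can be sketched as a lookup keyed on game and game type. The routing table, server names, and fallback behavior below are illustrative assumptions only; the specification does not prescribe a particular routing structure.

```python
# Hypothetical routing table: different processing servers provide services
# for different games and game types (e.g., single player vs. multiplayer).
ROUTES = {
    ("football", "single"): "processing-server-a",
    ("football", "multi"):  "processing-server-b",
    ("racing",   "single"): "processing-server-a",
}

def route(game, mode, default="processing-server-a"):
    """Direct a user device to the processing server for its game and mode."""
    return ROUTES.get((game, mode), default)
```

Under these assumptions, a multiplayer football session would be directed to a different server than a single-player session of the same game, matching the example in the text.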
The authentication module 404 may authenticate a user device, other simulation servers, other processing servers, and/or one or more rendering servers. In one example, when a player initiates a game on user device 106, the user device 106 establishes communication with the processing server 118. In some embodiments, the controller 402 may confirm that the processing server 118 provides services to the user device, player, and/or the activated game. The controller 402 may direct the user device and/or another processing server 118 to perform all or some services based on the user device, player, and/or the activated game.
The authentication module 404 may be configured to authenticate the user device 106, establish a game play session, and/or identify the associated hardware of the player. In some embodiments, based on the hardware of the player and the type of game (e.g., multiplayer, single player, single viewpoint, and/or multiple viewpoints), the controller 402 may direct simulations and/or renderings to be performed in different manners. For example, if the user device is a console (e.g., user device 106), the controller 402 may direct simulations and/or renderings to be performed by the user device which may have sufficient resources. Alternately, the controller 402 may direct simulations and/or renderings to be performed by the user device or the processing server 118 either alone or in combination with other processing servers 118 and/or the user device.
In various embodiments, the authentication module 404 may authenticate communication with the user device (e.g., authenticate the user device, the game, and/or the player). The authentication module 404 may confirm codes, encryption keys, or the like. In some embodiments, the authentication module 404 authenticates the software of the game to prevent unauthorized use and/or unauthorized copies.
The game state module 406 may retrieve a game state associated with the activated game. In some embodiments, the game state module 406 may retrieve a game state from a game state datastore 414 or from the user device. In some embodiments, the game state module 406 provides the game state to the user device and the user device may then provide user interfaces to a player requesting input and/or selection(s) associated with the game.
The game state module 406 may maintain a current state of game play. Based on simulation results from the simulation module 410 and/or selection(s) received from the user, the game state module 406 may update the current game state to a new game state. The updated game state may subsequently be provided, in some embodiments, to one or more other processing servers 118, rendering servers 122, or user devices. In various embodiments, game states are not provided between devices but rather the game state is individually updated on each device.
The input module 408 is configured to receive user input from any number of user devices. In various embodiments, the controller 402 retrieves input instructions regarding the game from any number of user devices and/or the game and device datastore 412. The input instructions may indicate a quantity and/or quality of expected inputs from any number of the user devices. In some embodiments, the input instructions may indicate a required number of user devices that must provide input before a simulation may be performed (e.g., input must be received from two user devices for a two-player game). The input module 408 may receive the inputs and determine if the input instructions are satisfied.
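The sufficiency check described above can be sketched as follows. The shape of the input instructions and of the received-input map are illustrative assumptions; the specification only says the instructions may name a required number of devices.

```python
# Hypothetical sufficiency check for the input module: the input instructions
# indicate how many user devices must provide input before simulation runs.
def inputs_satisfied(input_instructions, received):
    """received maps device_id -> list of selections from that device."""
    required = input_instructions["required_devices"]
    reporting = [dev for dev, selections in received.items() if selections]
    return len(reporting) >= required

instructions = {"required_devices": 2}  # e.g., a two-player game
partial = inputs_satisfied(instructions, {"dev1": ["pass"]})
full = inputs_satisfied(instructions, {"dev1": ["pass"], "dev2": ["blitz"]})
```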
In some embodiments, the game state module 406 and/or the input module 408 may update a current game state based at least in part on the user inputs.
The simulation module 410 may perform different simulations based on the game and/or type of game (e.g., based on game rules). In some embodiments, once the required quantity and/or quality of inputs are received, the simulation module 410 may perform a simulation of the game to determine a simulation result. The simulation may be based on the current game state, inputs received from the user devices, and/or simulation instructions from the game and device datastore 412 that are associated with how the simulation is to be performed for the desired game.
In various embodiments, the game state module 406 may update the current game state based at least in part on the simulation result from the simulation module 410. All or part of the new current game state may be provided to the user device(s) which may provide different user interfaces and/or requests for selection to continue gameplay. In some embodiments, instead of or in addition to the current game state, the game state module 406 may provide instructions to the user devices to obtain new user input (e.g., menu identifiers and/or requests for information).
The controller 402 may provide the simulation results and/or the current game play state to the rendering server 122, the user device, and/or any digital device. In various embodiments, one or more digital devices may render video and/or images based on the simulation results. For example, a rendering server 122 may perform multiple renderings based on different viewpoints and provide each user device different video and/or images based on the multiple renderings. In another example, a user device may perform a rendering based on the simulation result for the local player while the rendering server 122 may perform a rendering that generates the same video to provide to one or more different user devices to allow other players to view a game. Those skilled in the art will appreciate that any number of digital devices (e.g., user devices and/or rendering servers 122) may render the same or different video and/or images based on the simulation result.
The game and device datastore 412 and the game state datastore 414 may be one or more different data structures including, for example, tables, databases, or the like. The game and device datastore 412 may comprise information that may be used to identify a user device, a game, information to assist in the authentication process, and/or information to assist in the simulation process. The game state datastore 414 may comprise information of an initial game state associated with a game, updated game states, past game states, and/or current game states.
Those skilled in the art will appreciate that the simulation server 120 may comprise any number of modules. For example, some modules may perform multiple functions. Further, in various embodiments, multiple modules may work together to perform one function. In some embodiments, the simulation server 120 may perform more or less functionality than that described herein.
It will be appreciated that a “module” as referred to herein may comprise software, hardware, firmware, and/or circuitry. In one example, one or more software programs comprising instructions capable of being executed by a processor may perform one or more of the functions of the modules described herein. In another example, circuitry may perform the same or similar functions. Alternative embodiments may comprise more, less, or functionally equivalent modules and still be within the scope of present embodiments.
FIG. 5 is a block diagram of a rendering server 122 in some embodiments. The rendering server 122 comprises a device and game identification module 502, a simulation result module 504, a rendering module 506, and a communication module 508. The rendering server 122 may be a part of the processing server 118. The rendering server 122 may, in some embodiments, be a digital device that is remote from the simulation server 120.
The device and game identification module 502 may be configured to provide information regarding the game being played by one or more user devices as well as the characteristics (e.g., hardware and/or software resources) of the user devices (e.g., user device capability information). In one example, the rendering server 122 may render video for a first user device with a large, high resolution screen based on information from the device and game identification module 502. The rendering server 122 may also render video for a portable digital device with a relatively small display and limited resolution. In some embodiments, the rendering server 122 may provide game display information including information to assist a receiving user device to render gameplay (e.g., video and/or images).
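Selecting render output from user device capability information can be sketched as below. The capability fields, resolution tiers, and bitrates are illustrative assumptions; the specification says only that rendering may differ between a large high-resolution screen and a small limited-resolution display.

```python
# Hypothetical mapping from user device capability information to a render
# profile, as the rendering server might use to serve both a console attached
# to a large screen and a portable device with a small display.
def pick_render_profile(capabilities):
    width = capabilities.get("screen_width", 640)  # assume small if unreported
    if width >= 1920:
        return {"resolution": "1080p", "bitrate_kbps": 8000}
    if width >= 1280:
        return {"resolution": "720p", "bitrate_kbps": 4000}
    return {"resolution": "480p", "bitrate_kbps": 1500}

console = pick_render_profile({"screen_width": 1920})
handheld = pick_render_profile({"screen_width": 480})
```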
The device and game identification module 502 may identify players, user devices, and games based on information from the user device, information from the simulation server 120, or accounts associated with players, user devices, and/or active games.
The simulation result module 504 may receive the simulation result from the simulation server 120. The rendering module 506 may render video and/or images based on the simulation result from the simulation result module 504 and/or the simulation server 120.
In various embodiments, the device and game identification module 502 indicates different viewpoints that a single player may wish to access. The device and game identification module 502 may identify available viewpoints and/or perspectives based on the game and/or input from the player(s) (e.g., from the user devices). Based on information from the device and game identification module 502, the rendering module 506 may, based on the simulation result, separately render multiple videos and/or images to be shared with one or more user devices. In this example, it will be appreciated that different video may be separately rendered based on the same input without impacting resources of the user device(s). In another example, the rendering module 506 may render some information and provide the rendered information to a user device which may further render (e.g., finish rendering) video and/or images based, in part, on the information from the rendering module 506.
In another example, the device and game identification module 502 may receive player preferences from the user device indicating one or more preferred viewpoints and/or perspectives. The rendering module 506 may, based on the simulation result and the preferences of the player, render video from the identified preferred viewpoints and/or perspectives. The rendering module 506 may, based on the simulation results and the preferences of the player, provide game display information from the identified preferred viewpoints and/or perspectives. The receiving user device may utilize the game display information to render or finish rendering game play.
In some embodiments, the device and game identification module 502 may also track perspectives of different players in a multiplayer game. For example, each user device may provide one or more expected video perspectives. The rendering module 506 may render video and/or images or provide game display information based on the needed perspectives.
The communication module 508 may direct the correct game display information, rendered videos and/or images to different user devices. For example, in multiplayer games, the communication module 508 will track each player's desired perspective(s) and provide the correct rendered video.
Those skilled in the art will appreciate that the rendering server 122 may comprise any number of modules. For example, some modules may perform multiple functions. Further, in various embodiments, multiple modules may work together to perform one function. In some embodiments, the rendering server 122 may perform more or less functionality than that described herein.
FIG. 6 is a block diagram of a user device 600 (e.g., user devices 104, 106, 108, 110, 112, 114, and/or 116) in some embodiments. The user device 600 comprises an authentication module 602, a user interface (UI) module 604, an input module 606, a simulation module 608, a rendering module 610, a display module 612, and a communication module 614.
In various embodiments, the authentication module 602 communicates with the processing server 118. The authentication module 602 may communicate with the processing server 118 via the communication network 102 when the user device 600 accesses a game, accesses the communication network 102, or upon command by the player.
The authentication module 602 may authenticate communication with the processing server 118. The processing server 118 may confirm that the user device 600 is authorized to play the game and/or receive services. The processing server 118 may also receive information regarding the hardware of the user device 600 and identify hardware specifications (e.g., size and resolution of the display, processor, and/or available memory). In some embodiments, the user device 600 authenticates the server.
The user interface module 604 may provide one or more interfaces to the player. The interface may include menus or any functional objects which the player may use to provide input (e.g., indicate player preferences for the next play or game conditions). In various embodiments, the user interface module 604 generates menus and/or functional objects based on game rules and/or the current user state. For example, the game rules and/or the current user state may trigger one or more menus and/or fields to request information from the player (e.g., to identify the next play).
The input module 606 may receive the input from the user interface module 604. Input may include play selections, player choices, text, control executions (e.g., keyboard entry, joystick entry, or the like). In various embodiments, the input module 606 may provide input to the processing server 118.
The simulation module 608 may generate simulation results based on the user input, game rules, and/or the current game state. The game rules may indicate the simulation to execute while the current game state and the user input may provide parameters which may affect the simulation result. In some embodiments, simulation is performed by a processing server 118. In various embodiments, simulation may be performed by a remote digital device such as the simulation server 120 of the processing server 118. In some embodiments, the simulation server 120 generates simulation results for multiple devices.
The rendering module 610 may be configured to render video and/or images. The perspective(s) and/or viewpoint(s) of the rendered video may be identified from the game rules, the user device 600 (e.g., limitations of rendering resources such as limited memory or GPU, size of screen of user device 600, and/or other limitations such as resolution), user point of view (e.g., based on the player's point of view in the game), and/or player preferences (which may indicate a preference for one or more views over others). In various embodiments, rendering may be performed by a remote digital device such as the rendering server 122 of the processing server 118.
In some embodiments, the rendering module 610 may render video and/or images based, at least in part, on game display information received from the processing server. The game display information may provide instructions or information regarding rendering. In some embodiments, the game display information assists in the rendering of video and/or images such that the resource requirements of the user device 600 for rendering may be reduced.
The display module 612 is configured to display at least some of the rendered video and/or images. For example, the rendering module 610 may render the simulation results in three different viewpoints. The player, for example, may receive one viewpoint by default and may optionally choose to view the other viewpoints. Alternately, the player may not opt to view the other rendered viewpoints. The display module 612 may comprise, for example, a screen.
The communication module 614 may provide authentication requests, user inputs, game rules, and/or game states to another digital device (e.g., the processing server 118). The communication module 614 may also receive information to perform simulations (e.g., the communication module 614 may receive game state information and/or user inputs from other user devices in a multiplayer game). Further, the communication module 614 may receive rendered video from another digital device (e.g., the processing server 118) which may be displayed to the player by the display module 612.
Those skilled in the art will appreciate that the user device 600 may comprise any number of modules. For example, some modules may perform multiple functions. Further, in various embodiments, multiple modules may work together to perform one function. In some embodiments, the user device 600 may perform more or less functionality than that described herein.
FIG. 7 is a flow diagram for sharing video with multiple devices in some embodiments. In some embodiments, simulation and rendering may be performed by both a user device and at least one remote device such as the processing server. In one example, a game may be deterministic wherein given the same user selection(s) and current game state, multiple simulations provide the same result. As such, the user selection(s), game state, and/or other information may be provided to multiple devices to perform multiple simulations generating the same simulation result and/or render video and/or images. The rendered video and/or images may be based on the simulation result. The rendered video and/or images may be the same on all digital devices. In some embodiments, all or some of the rendered video and/or images may be different (e.g., from different perspectives or viewpoints) even though the video and/or images may capture the same gameplay (e.g., the same simulation result). For example, if two players are playing a football game, the player of the football team playing offense may see a play based on the offensive team's viewpoint while another player of a football team playing defense may see the same play based on the defensive team's viewpoint.
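The determinism property that this sharing scheme relies on can be checked by running the same simulation twice with identical inputs, as sketched below. The toy yardage rules are an illustrative assumption; the scheme only requires that the same selections and game state yield the same result on every device.

```python
# A toy deterministic simulation: given the same game state and selections,
# every device (local console or processing server) computes the same result.
def simulate(game_state, selections):
    yards = sum({"run": 4, "pass": 9}.get(s, 0) for s in selections)
    return {"ball_on": game_state["ball_on"] + yards}

state, picks = {"ball_on": 20}, ["run", "pass"]
result_on_device = simulate(state, picks)   # e.g., run on the local user device
result_on_server = simulate(state, picks)   # e.g., run on the processing server

assert result_on_device == result_on_server
```

Because both runs agree, the server can render (and re-render) the play for any number of viewers without the local device's output ever diverging.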
FIG. 7 comprises a first user device 702, a second user device 704, and a processing server 706. In step 708, the first user device 702 generates a user interface based on the previous (or initial) game state and/or game rules. In one example, the user interface module 604 of the first user device 702 displays menus or any other input functionality or fields to request user input (e.g., user selections). The different menus and/or input functionality or fields may be based on the game and/or current game state.
In step 710, the first user device 702 receives first device user input. In one example, the input module 606 of the first user device 702 receives user selection(s).
In step 712, the first user device 702 provides the current game state and/or the first device user input to the processing server 706. In some embodiments, the input module 606 updates the current game state with the first device user input and provides the updated current game state and/or the user input to the processing server 706.
In step 714, the first user device 702 simulates game play based on the current game state and the user input. In various embodiments, the simulation module 608 generates a simulation result based on the current game state and/or user input. The simulation module 608 may also update the current game state based on the simulation result.
In step 716, the first user device 702 renders the simulation result. In one example, the rendering module 610 of the first user device 702 may render the simulation result into multiple videos and/or images (e.g., for different perspectives, viewpoints, or the like).
In step 718, the first user device 702 displays the rendered result (e.g., displays the rendered video(s) and/or image(s)). The first user device 702 may display the result via a display module 612.
Those skilled in the art will appreciate that the first user device 702 may follow game rules thereby allowing another menu and/or other functionality to request additional input from the user to continue gameplay, display events associated with gameplay, provide awards, provide points, award currency, identify winners and/or losers, pause, time out, or provide any other functionality as required by the game.
In step 720, the processing server 706 simulates game play based on the current game state and the user input. Those skilled in the art will appreciate that the processing server 706 may perform the same simulation utilizing the same inputs as the first user device 702, and, as a result, the simulation result may be the same as that of the first user device 702. In various embodiments, the simulation module 410 of the simulation server 120 generates a simulation result based on the current game state and/or user input.
The game state module 406 of the simulation server 120 may update the current game state based on the simulation result. In some embodiments, the game state module 406 maintains the updated current game state at the processing server 706. Those skilled in the art will appreciate that the updated current game state at the processing server 706 may be the same updated current game state of the first user device 702.
In step 722, the processing server 706 renders the simulation result. Those skilled in the art will appreciate that the processing server 706 may perform the same rendering utilizing the same simulation result as the first user device 702, and, as a result, the rendered video and/or images may be the same as that of the first user device 702. In some embodiments, the rendering server 122 of the processing server 706 may render the simulation result into multiple videos and/or images (e.g., for different perspectives, viewpoints, or the like).
The processing server 706 and the first user device 702 may perform simulations and/or renderings simultaneously, near simultaneously, or at different times or on different days.
In step 724, the processing server 706 provides the rendering result and/or the updated current game state to the second user device 704. In step 726, the second user device 704 displays the rendered result (e.g., displays the rendered video(s) and/or image(s)). The second user device 704 may display the result via a display module 612.
Those skilled in the art will appreciate that the displayed video and/or images of the second user device 704 may be the same or different from the video and/or images of the first user device 702.
In various embodiments, by simulating game play and/or rendering video and/or images by multiple remotely located devices, additional players at other digital devices may share (e.g., watch) the game of the player at the first user device 702. Further, other players are not necessarily bound by the same viewpoint as the player at the first user device 702. For example, multiple renderings at different viewpoints can be performed to allow any number of players to view gameplay of the first player without impacting the quality or speed of the game (e.g., the quality or speed of the renderings for the first player). Similarly, others may watch the game at any time.
Further, in various embodiments, game information, user input, and/or game states may be maintained to allow simulations and/or renderings to be performed at any time thereby allowing the game to be recreated at any time for any number of other users or the player if they wish to view past games.
In some embodiments, because game information, user input, and/or game states may be maintained to allow simulations and/or renderings to be performed at any time, players and/or other users may go to any point in a past game or a past point of time in a current game and make different selection(s) to experience different outcomes. In this example, a player may choose to load the game information, user input, and/or game states at a determined point in game play. The player may then make different selections moving forward in game play and continue to play the game based on new selections and the subsequent changes to the simulation, rendering, and game state based on the new selections.
In some embodiments, rather than rendering a game result based on the simulation, the processing server 706 may generate game display information based on the simulation result. The game display information may include any information associated with display of a user device. For example, the game display information may comprise text to be displayed by a user device. The game display information may comprise information to be rendered by the user device, information to assist the user device to render, or may include video and/or images rendered by the processing server 706.
FIG. 8 is a flow diagram for sharing different renderings with different devices in some embodiments. In some embodiments, simulation and rendering for a multiplayer game may be performed by one or more processing servers 808 and the rendered video and/or images provided to any number of digital devices (e.g., user devices 802, 804, and 806). As similarly discussed regarding FIG. 7, a game may be deterministic wherein given the same user selection(s) and current game state (e.g., game state information), multiple simulations provide the same result. As such, the user selection(s), game state, and/or other information may be provided to a remote device to perform the simulation and produce a simulation result. The game may be turn-based. The remote device may also provide different renderings based on the same simulation result. The different video and/or images may be based on different points of view, perspectives, sides played in the game, characters, or the like. In other embodiments, the rendered video and/or images may be the same for all digital devices.
FIG. 8 comprises a first user device 802, a second user device 804, a third user device 806, and a processing server 808. In step 810, the first user device 802 generates a user interface based on the current (or initial) game state. In one example, the user interface module 602 of the first user device 802 displays menus or any other input functionality or fields to request user input (e.g., request user selections). The different menus and/or input functionality or fields may be based on the game rules and/or current game state.
In step 812, the first user device 802 receives first device user input. In one example, the input module 604 of the first user device 802 receives user selection(s). In step 814, the first user device 802 provides the current game state and/or the first device user input to the processing server 808. In some embodiments, the input module 604 updates the current game state with the first device user input and provides the updated current game state and/or the user input to the processing server 808.
Similar to steps 810 and 812, in step 816, the second user device 804 generates a user interface based on the current (or initial) game state. In one example, the user interface module 602 of the second user device 804 displays menus or any other input functionality or fields to request user input (e.g., user selections). The different menus and/or input functionality or fields may be based on the game and/or current game state. In some embodiments, the menus and/or any other input functionality may be similar to that generated in step 810 by the first user device 802.
In step 818, the second user device 804 receives second device user input. In one example, the input module 604 of the second user device 804 receives user selection(s). In step 820, the second user device 804 provides the current game state and/or the second device user input to the processing server 808. In some embodiments, the input module 604 updates the current game state with the second device user input and provides the updated current game state and/or the user input to the processing server 808.
Similarly, in step 822, the third user device 806 generates a user interface based on the current (or initial) game state. In one example, the user interface module 602 of the third user device 806 displays menus or any other input functionality or fields to request user input (e.g., user selections). The different menus and/or input functionality or fields may be based on the game and/or current game state. In some embodiments, the menus and/or any other input functionality may be similar to that generated in step 810 by the first user device 802.
In step 824, the third user device 806 receives third device user input. In one example, the input module 604 of the third user device 806 receives user selection(s).
In step 826, the third user device 806 provides the current game state and/or the third device user input to the processing server 808. In some embodiments, the input module 604 updates the current game state with the third device user input and provides the updated current game state and/or the user input to the processing server 808.
In step 828, the processing server 808 simulates game play based on the current game state and/or the user inputs from the first, second, and third user devices 802, 804, and 806. In some embodiments, the processing server 808 maintains a current game state and only receives the user inputs from the other user devices. In various embodiments, one or more of the other user devices provides a current game state to the processing server 808. In some embodiments, each user device 802, 804, and 806 modifies the current game state based on the related player input and provides at least the modified current game state to the processing server 808.
In various embodiments, the simulation module 610 of the processing server 808 generates a simulation result based on the current game state(s), modified game state(s), and/or user inputs from the user devices. The simulation module 610 may also update the game state based at least in part on the simulation result.
In step 830, the processing server 808 generates all or part of game display information based on the simulation. The game display information may comprise video and/or audio and/or may comprise information (which may include video and/or audio) to assist a local device with rendering.
In one example, the rendering module 506 of the processing server 808 may generate first, second, and third game display information for different user devices by rendering at least some of the simulation result into multiple videos and/or images for the different players (e.g., for different perspectives of each player, different viewpoints, or the like). In various embodiments, a rendering server 122 associated with the processing server 808 may receive user device display information from each of the user devices indicating a preference and/or a view point. The rendering server 122 may generate first, second, or third game display information (e.g., render video and/or audio or render information to assist in the rendering of video and/or audio) based at least in part on the preference, view point, and/or user device capability. Further, the rendering server 122 or rendering modules of the first, second, and/or third devices may render different video and/or provide game display information for each of the user devices 802, 804, and 806. The different video displayed on the user devices 802, 804, and 806 may have different resolutions, screen sizes, colors, or the like based on the hardware requirements, resources, and/or limitations of the user devices. The resulting appropriate video and/or images may be provided to the respective user device in steps 832, 834, and 836, respectively.
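Producing several independent renderings from one simulation result can be sketched as follows. This is an illustrative model only; the function name `render_for_device` and its parameters are hypothetical, and a real rendering server would produce actual video frames rather than the placeholder strings used here.

```python
def render_for_device(simulation_result: dict, viewpoint: str,
                      resolution: tuple) -> dict:
    """One simulation result, many independent renderings.

    Each device gets video tailored to its viewpoint and hardware;
    the underlying game outcome is identical for all of them.
    """
    return {
        "frames": f"{simulation_result['outcome']}@{viewpoint}",  # placeholder for video
        "resolution": resolution,
    }

result = {"outcome": "home-touchdown"}
views = {
    "device_802": render_for_device(result, "home-sideline", (1920, 1080)),
    "device_804": render_for_device(result, "away-sideline", (1280, 720)),
    "device_806": render_for_device(result, "overhead", (640, 360)),
}
```

Note that the renderings differ only in presentation (viewpoint, resolution); the simulation result they are derived from is shared.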
In various embodiments, the processing server 808 provides an updated current game state to each of the user devices. The updated current game state may be the same for each of the user devices. In some embodiments, each of the user devices may receive a different current game state depending upon the inputs provided to the processing server 808, the different game states received by the processing server 808 from the user devices, and/or game instructions.
In steps 838, 840, and 842, the first, second, and third user devices 802, 804, and 806 display the respective rendered result (e.g., display the locally and/or remotely rendered video(s) and/or image(s)) received from the processing server 808. In various embodiments, different players of the different user devices may request and/or receive rendered video and/or images that were provided to other user devices (e.g., one player may request and view a multiplayer game from another player's perspective).
Those skilled in the art will appreciate that each user device may follow game requirements, thereby allowing another menu and/or other functionality to request additional input from the user to continue gameplay, display events associated with gameplay, provide awards, provide points, award currency, identify winners and/or losers, pause, time out, or provide any other functionality as required by the game.
Those skilled in the art will appreciate that the user devices may display the video and/or images based on the game display information from the processing server 808 simultaneously, near simultaneously, or at different times or days.
Further, in various embodiments, game information (e.g., information regarding game rules), user input, and/or game states may be maintained to allow simulations and/or renderings to be performed at any time thereby allowing the game to be recreated at any time for any number of other users or the player if they wish to view past games.
Although FIG. 8 depicts three user devices 802, 804, and 806 engaging in a multiplayer game, those skilled in the art will appreciate that there may be any number of user devices engaged in any number of single player and/or multiplayer games.
FIG. 9 is a flow diagram for playing a game over different devices in some embodiments. In some embodiments, simulation and rendering may be performed by both a user device and at least one remote device such as the processing server. In one example, a game may be deterministic wherein given the same user selection(s) and current game state, multiple simulations provide the same result. As such, a player may begin playing a game on a first user device 902, pause the game, and access the same game on another user device (e.g., second user device 904). Even though the first and second user devices may be different (e.g., the first user device 902 is a console and the second user device 904 is a smartphone), the game state and the simulation result may be the same, thereby allowing a user to pick up where the user left off in a previously saved game, even though the user is on a different device.
FIG. 9 comprises a first user device 902, a second user device 904, and a processing server 906. In step 908, the first user device 902 generates a user interface based on a current (or initial) game state and game rules. In one example, the user interface module 602 of the first user device 902 displays menus or any other input functionality or fields to request user input (e.g., user selections). The different menus and/or input functionality or fields may be based on the game rules and/or current game state. In step 910, the first user device 902 receives first device user input. In one example, the input module 604 of the first user device 902 receives user selection(s).
In step 912, the first user device 902 simulates game play based on the current game state and the user input. In various embodiments, the simulation module 610 generates a simulation result based on the current game state and/or user input. The simulation module 610 may also update the current game state based on the simulation result.
In step 914, the first user device 902 renders the simulation result. In one example, the rendering module 612 of the first user device 902 may render the simulation result into multiple videos and/or images (e.g., for different perspectives, viewpoints, or the like).
In step 916, the first user device 902 displays the rendered result (e.g., displays the rendered video(s) and/or image(s)). The first user device 902 may display the result via a display module 614.
In step 918, the first user device 902 receives a game hold command. A game hold command may be a command from the player of the first user device 902 to pause the game, log out, or otherwise save game play. The command may trigger the first user device 902 to save the current game state and terminate play (e.g., close the game).
In step 920, the first user device 902 provides the current game state to the processing server 906. In some embodiments, the input module 604 updates the current game state with the first device user input and provides the updated current game state and/or the user input to the processing server 906.
In some embodiments, when the first user device 902 initiates the game, the first user device 902 may log into or otherwise be authenticated by the processing server 906. The processing server 906 may identify the first user device 902, the game, and/or the player. When the processing server 906 receives the saved game state, the processing server 906 may store the saved game state.
Although FIG. 9 depicts the current game state being provided to the processing server 906 after the game hold command, those skilled in the art will appreciate that the current game state may be provided to the processing server 906 at any time after the current game state is generated (e.g., after the previous game state was updated with the results from the simulation).
The processing server 906 may associate the updated current game state with the first user device 902, the player, an account associated with the user device 902 and/or the player, the game, and/or any other information.
In step 922, the player may access the second user device 904 (e.g., a smartphone) and may access the game (e.g., the second user device 904 logs in the player). In some embodiments, when the second user device 904 receives a game login from the player, the player may identify the game to execute as well as request that the game continue from a previously saved state.
In step 924, the second user device 904 may provide an authentication request to the processing server 906. The authentication request may identify the second user device 904 and the game to continue (e.g., identify the game generally and/or identify the saved state such as the date and time when the game was previously saved). The processing server 906 may authenticate the player, game, and/or the second user device 904.
In some embodiments, the processing server 906 receives a game continuation request that identifies the desired saved game state. Based on information contained within the game continuation request, the processing server 906 may retrieve the saved game state. In some embodiments, the processing server 906 may determine if the second user device 904 has sufficient resources to simulate the game and/or render simulation results. The processing server 906 may identify the hardware of the second user device 904 from the authentication request and/or any other information provided by the second user device 904.
In one example, if the processing server 906 determines that the second user device 904 has sufficient resources for processing and rendering, the processing server 906 may provide the saved game state to the second user device 904. In some embodiments, if the processing server 906 determines that the second user device 904 has insufficient resources for rendering, the simulation may be performed by the processing server 906 or the second user device 904. The simulation result may be provided to the rendering server associated with the processing server 906 to render the video and/or images that are provided to the second user device 904.
In various embodiments, the processing server 906 assumes that the second user device 904 has sufficient resources to simulate and render the video and/or images.
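The save/resume flow above can be sketched as a small server class. This is a minimal illustration, not the patent's implementation: the class and method names (`ProcessingServer`, `hold`, `resume`) and the keying of saved states by account are assumptions made for the example.

```python
class ProcessingServer:
    """Minimal sketch of cross-device save/resume."""

    def __init__(self):
        self._saved = {}  # account identifier -> saved game state

    def hold(self, account: str, game_state: dict) -> None:
        # Triggered when a device sends its game state after a game hold command.
        self._saved[account] = game_state

    def resume(self, account: str, device_can_render: bool) -> dict:
        state = self._saved[account]
        if device_can_render:
            # The new device has sufficient resources: hand it the saved
            # state so it can simulate and render locally.
            return {"mode": "local", "state": state}
        # Otherwise the server (or its rendering server) simulates and
        # renders, streaming finished video/images to the device.
        return {"mode": "server-rendered", "state": state}
```

A console could call `hold` on pause, and a smartphone could later call `resume` with `device_can_render=False` to continue the same game with server-side rendering.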
In step 928, once the saved game state is received from the processing server 906, the second user device 904 generates a user interface based on the game state received from the processing server 906 to continue game play. In one example, the user interface module 602 of the second user device 904 displays menus or any other input functionality or fields to request user input (e.g., user selections). The different menus and/or input functionality or fields may be based on the game and/or saved game state. In step 930, the second user device 904 receives second device user input. In one example, the input module 604 of the second user device 904 receives user selection(s).
From the perspective of the player, the player may play the game on the second user device 904 where the player left off when the game was paused by the first user device 902.
In step 932, the second user device 904 simulates game play based on the saved game state and the second user input. In various embodiments, the simulation module 610 of the second user device 904 generates a simulation result based on the current game state and/or user input. The simulation module 610 may also update the current game state based on the simulation result.
In step 934, the second user device 904 renders the simulation result. In one example, the rendering module 612 of the second user device 904 may render the simulation result into multiple videos and/or images (e.g., for different perspectives, viewpoints, or the like).
In step 936, the second user device 904 displays the rendered result (e.g., displays the rendered video(s) and/or image(s)). The second user device 904 may display the result via a display module 614.
FIG. 10 is a flow chart for determining and providing different game display information to different user devices in some embodiments. The following flow chart is described in the context of FIG. 7. Those skilled in the art will appreciate that the flow chart may comprise any number of user devices and/or processing servers. Further, the process of determining and providing different game display information may be applied in any number of ways including, but not limited to, multiplayer sessions (e.g., see FIG. 8) and single player across disparate devices (e.g., see FIG. 9).
In step 1002, a processing server 706 receives a game login from a first user device 702 (see FIG. 7). In various embodiments, when the first user device 702 executes a game, the first user device 702 may communicate with the processing server 706. In one example, the login of the first user device 702 may include an account identifier associated with the game, user, and/or first user device 702. The login of the first user device 702 may also provide a username, password, device identifier, user identifier, or any other information. In some embodiments, the processing server 706 may authenticate the login request and/or confirm the authenticity of the game software associated with the first user device 702.
Those skilled in the art will appreciate that the first user device 702 may not provide a game login or a login request to the processing server 706. Rather, the first user device 702 may provide any information (e.g., account identifier, username, password, device identifier, user identifier, or any other information) to the processing server 706 at any time without providing a game or login request.
In step 1004, the processing server 706 receives a device login from a second user device 704. In some embodiments, the second user device 704 may access the processing server 706 to view gameplay of the first user device 702 or participate in a multiplayer game. The second user device 704 login may include an account identifier associated with the game, user, and/or first user device 702. The login of the second user device 704 may also provide a username, password, device identifier, user identifier, or any other information. In some embodiments, the processing server 706 may authenticate the device login.
Those skilled in the art will appreciate that the second user device 704 may not provide a device login or a login request to the processing server 706. Rather, the second user device 704 may provide any information (e.g., account identifier, username, password, device identifier, user identifier, or any other information) to the processing server 706 at any time without providing a game or login request.
In step 1006, the processing server 706 receives first and second user device capability information. Device capability information may be any information regarding the capabilities of one or more user devices. For example, from information received from the first user device 702, the processing server 706 may identify the first user device 702 as a game console (e.g., Xbox) and/or hardware capabilities (e.g., processing capability and rendering capability). Device capability information may include, but is not limited to, screen size, resolution, color depth, rotation, scaling, processor, video card, currently available processing power, RAM, currently available RAM, VRAM, currently available VRAM, rendering capability, operating system, wireless network accessibility, wired network accessibility, cellular accessibility, or the like. Capability information may also comprise a state of the network in communication with a user device such as bandwidth, throughput, and/or quality of service (QoS).
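The capability fields enumerated above could be modeled as a simple record. The sketch below is hypothetical: it covers only a small subset of the listed fields, and the field names are illustrative choices rather than anything specified by the patent.

```python
from dataclasses import dataclass

@dataclass
class DeviceCapabilities:
    """Illustrative subset of the device capability fields described above."""
    device_type: str        # e.g., "console" or "smartphone"
    resolution: tuple       # (width, height) in pixels
    available_ram_mb: int   # currently available RAM
    rendering_capable: bool # whether the device can render locally
    bandwidth_kbps: int     # state of the network connection

# A console and a smartphone reporting very different capabilities.
console = DeviceCapabilities("console", (1920, 1080), 8192, True, 50000)
phone = DeviceCapabilities("smartphone", (640, 360), 512, False, 2000)
```

Records like these are what the processing server would compare against game demands when deciding what game display information to send each device.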
In some embodiments, the processing server 706 may store information (e.g., in a database) identifying at least some device capability information. The processing server 706 may retrieve at least some device capability information from storage based on information from the first user device 702 (e.g., from the game login or information provided by the first user device 702). In another example, the processing server 706 may retrieve at least some device capability information from storage based on information from the second user device 704 (e.g., from the user device login or information provided by the second user device 704). Those skilled in the art will appreciate that all or some device capability information may be provided to the processing server 706 from the first user device 702 and/or the second user device 704.
In step 1008, the processing server 706 receives game state information and user input from the first user device 702. In some embodiments, the first user device 702 updates the current game state with first device user input and provides the updated current game state and/or the user input to the processing server 706.
In step 1010, the processing server 706 simulates game play based on the current game state and the user input. Those skilled in the art will appreciate that the processing server 706 may perform the same simulation utilizing the same inputs as the first user device 702, and, as a result, the simulation result may be the same as that of the first user device 702. The simulation server 120 may generate a simulation result based on the current game state and/or user input. In various embodiments, the simulation server 120 may update the current game state based on the simulation result.
In step 1012, the processing server 706 determines game display information to be provided to the first and second user devices 702 and 704 based on game demands and/or user device capability information. In various embodiments, the rendering server of the processing server 706 makes the determination. The rendering server may make the determination for each device by comparing game demands to the user device capability information. If the game demands are low and the user devices are capable of meeting or exceeding the game demands, the rendering server may determine to provide the simulation result and/or updated game state information to the user devices such that the user devices may locally render and/or display game information as required.
Those skilled in the art will appreciate that the processing server 706 may provide a wide range of different game display information to a user device depending on game demands and capabilities of the user device. For example, the processing server 706 may provide text or simple information to the user device if game demands require text or simple information. If the capabilities of a user device may meet some but not all requirements at a desired level of performance, the processing server 706 may provide game display information to reduce resource requirements by the user device (e.g., by providing vector data or the like to assist the user device in rendering). If the capabilities of the user device do not meet the game demands at the desired level of performance, the processing server 706 may determine to render video and/or images to be provided to the user device.
In some embodiments, the rendering server determines game display information based on game demands in view of the simulation result. If the game demands and simulation result indicate that, for example, text or other basic information is to be displayed on one or both user devices, the rendering server may provide the text or other basic information to one or both user devices without rendering video or images. For example, the text or other basic information may be sent as plaintext or a text message.
Game demands may change at different points in the game depending on the game and/or the simulation result. For example, game demands may require simple text to scroll across a display of a user device at one point in time and subsequently require video to display movement of players or the like at another point in time. In some embodiments, the processing server 706 accounts for different game demands during gameplay when determining game display information to be provided to the different user devices.
In some embodiments, if a user device is able to meet some game demands but not all, the rendering server may determine to provide assisted rendering wherein game display information provided to the user device may allow for limited local rendering. For example, the rendering server may provide vector and/or quaternion data for device-specific rendering of the simulation results by the user device to reduce the resources required of the user device to display the game.
In another example, the rendering server may determine from the user device capability information that one or both devices have resources that exceed game demands and, as such, one or both user devices may be provided updated game state information and/or the simulation result to allow each user device to locally render game display information (e.g., video and/or audio) based on the simulation result and/or updated game state.
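The tiered decision described above (plain text, server-rendered video, assisted rendering, or fully local rendering) can be sketched as a single comparison function. The scalar `render_score` and the halving threshold below are invented for the example; a real server would compare the many capability fields and game demands enumerated earlier.

```python
def choose_display_mode(demand: dict, capability: dict) -> str:
    """Pick what the server sends, from least to most work for the device.

    'render_score' is a made-up scalar standing in for the capability
    fields (resolution, RAM, rendering power) compared in practice.
    """
    if demand["kind"] == "text":
        return "text"             # plain text or a text message; nothing to render
    if capability["render_score"] >= demand["render_score"]:
        return "local-render"     # send simulation result + updated game state
    if capability["render_score"] >= demand["render_score"] // 2:
        return "assisted-render"  # send vector/quaternion data for limited local rendering
    return "server-render"        # send fully rendered video and/or images
```

For instance, a console reporting a high `render_score` would receive the simulation result to render locally, while a low-powered smartphone viewing the same game would receive server-rendered video.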
In step 1014, the rendering server provides the determined game display information to the first and/or second user devices 702 and 704. In various embodiments, the processing server 706 provides the updated game state to the first and/or second user devices 702 and 704. The processing server 706 may also provide the simulation result to the first and/or second user devices 702 and 704.
FIG. 11 is a block diagram of an exemplary digital device 1100. The digital device 1100 comprises a processor 1102, a memory system 1104, a storage system 1106, a communication network interface 1108, an I/O interface 1110, and a display interface 1112 communicatively coupled to a bus 1114. The processor 1102 is configured to execute executable instructions (e.g., programs). In some embodiments, the processor 1102 comprises circuitry or any processor capable of processing the executable instructions.
The memory system 1104 is any memory configured to store data. Some examples of the memory system 1104 are storage devices, such as RAM or ROM. The memory system 1104 can comprise the RAM cache. In various embodiments, data is stored within the memory system 1104. The data within the memory system 1104 may be cleared or ultimately transferred to the storage system 1106.
The storage system 1106 is any storage configured to retrieve and store data. Some examples of the storage system 1106 are flash drives, hard drives, optical drives, and/or magnetic tape. In some embodiments, the digital device 1100 includes a memory system 1104 in the form of RAM and a storage system 1106 in the form of flash memory. Both the memory system 1104 and the storage system 1106 comprise computer readable media which may store instructions or programs that are executable by a computer processor including the processor 1102.
The communication network interface (com. network interface) 1108 can be coupled to a network (e.g., communication network 102) via the link 1116. The communication network interface 1108 may support communication over an Ethernet connection, a serial connection, a parallel connection, or an ATA connection, for example. The communication network interface 1108 may also support wireless communication (e.g., 802.11 a/b/g/n, WiMax). It will be apparent to those skilled in the art that the communication network interface 1108 can support many wired and wireless standards.
The optional input/output (I/O) interface 1110 is any device that receives input from the user and outputs data. The optional display interface 1112 is any device that is configured to output graphics and data to a display. In one example, the display interface 1112 is a graphics adapter. It will be appreciated that not all digital devices 1100 comprise either the I/O interface 1110 or the display interface 1112.
It will be appreciated by those skilled in the art that the hardware elements of the digital device 1100 are not limited to those depicted in FIG. 11. A digital device 1100 may comprise more or fewer hardware elements than those depicted. Further, hardware elements may share functionality and still be within various embodiments described herein. In one example, encoding and/or decoding may be performed by the processor 1102 and/or a co-processor located on a GPU (i.e., Nvidia).
FIG. 12 is an exemplary screen shot 1200 of a football game wherein players on offense are represented as round or circular while players on defense are represented as arrows. In various embodiments, a game may be displayed in many different ways. The game display for the same game may be different on different digital devices based on hardware capability (e.g., screen size, resolution, color depth, processing power, graphic rendering capability, network access, and/or bandwidth), user preferences (e.g., user selections of different ways to view the game), and game instant (e.g., point in the game which may allow for graphic representations of a summary of a drive or animation of players involved in the execution of individual plays).
In some embodiments, the processing server determines game display information to be provided to a user device. The determination may be based on hardware capability as well as game demands. The game demands may prefer that players be animated at a point in the game but allow for assisted rendering or different actions should hardware capability be at or below game demands. For example, the processing server may determine to represent players as objects (e.g., circles and arrows) if the hardware capability of the user device (e.g., processing and rendering power of a smartphone) is limited. The processing server may subsequently provide the user device with game display information consistent with the determination. If a different user device is also sharing the same game experience and the different user device has significant hardware capabilities (e.g., the different user device is a console such as a Sony PlayStation), the processing server may provide game display information which allows rendering of animated players at the different user device. As a result, the different user device may have a different visual display of the game than other user devices viewing the same game.
FIG. 13 is an exemplary screen shot 1300 of a perspective view of a football game wherein offensive drives are represented with graphics and video of plays may be accessed. In the exemplary screen shot 1300, users may interact with different elements (e.g., click on a video element) to produce different effects (e.g., view rendered video of a play). In various embodiments, a wide variety of different graphical events may be depicted based, in part, on the hardware capability of the displaying user device. A user device with limited hardware capability may limit interactive elements (e.g., not provide video elements). In some embodiments, the video elements are links to video rendered by the user device (e.g., assisted by the game display information from the processing server) or rendered by the processing server.
In some embodiments, exemplary screen shot 1300 displays options for offensive plays which may allow users to provide user inputs. The selection may be provided for simulation for the next play.
FIG. 14 is another exemplary screen shot 1400 of a view of a football game wherein offensive drives are represented with graphics and video of plays may be accessed. In various embodiments, different user devices may present the same game in different ways. For example, a user device with limited display capability may display the exemplary screen shot 1400 while another user device with more robust display capability may display the perspective view as depicted in exemplary screen shot 1300.
The above-described functions and components can be comprised of instructions that are stored on a storage medium such as a computer readable medium. The instructions can be retrieved and executed by a processor. Some examples of instructions are software, program code, and firmware. Some examples of storage medium are memory devices, tape, disks, integrated circuits, and servers. The instructions are operational when executed by the processor to direct the processor to operate in accord with embodiments of the present invention. Those skilled in the art are familiar with instructions, processor(s), and storage medium.
The present invention is described above with reference to exemplary embodiments. It will be apparent to those skilled in the art that various modifications may be made and other embodiments can be used without departing from the broader scope of the present invention. Therefore, these and other variations upon the exemplary embodiments are intended to be covered by the present invention.
Claims
- A system comprising: first game state information;a first user interface module on a first user device, the first user interface module configured to generate first user game input options associated with gameplay of a multiplayer game based on the first game state information, the first user interface module further configured to present the first user game input options to a first player associated with the first user device, the first user interface module further configured to receive a first user selection associated with one or more of the first user game input options;a second user interface module on a second user device, the second user interface module configured to generate second user game input options associated with gameplay of the multiplayer game based on the first game state information, the second user interface module further configured to present the second user game input options to a second player associated with the second user device, the second user game input options being different than the first user game input options, the second user interface module further configured to receive a second user selection associated with one or more of the second user game input options;and a processing server including: a communication module configured to receive the first and second user selections from the first and second user devices, respectively;a simulation module configured to generate simulation results based on the first game state information, game rules, and the first and second user selections;and a rendering module configured to perform multiple first renderings based on the simulation results and separately perform multiple second renderings based on the simulation results, the multiple first and multiple second renderings being rendered independent of each other, the multiple first and multiple second renderings to be displayed by the first and second user devices, respectively, wherein the first user device and the processing server 
are configured to split processing resources whereby the first user interface module generates the first user game input options, and the processing server performs the multiple first renderings subsequent to receiving the first user selection from the first user device and the second user selection from the second user device, wherein the second user device and the processing server are configured to split processing resources whereby the second user interface module generates the second user game input options, and the processing server performs the multiple second renderings subsequent to receiving the first user selection from the first user device and the second user selection from the second user device, wherein a perspective among multiple perspectives of the multiple first renderings is identified for rendering based on rendering resources associated with the first user device, or a perspective among multiple perspectives of the multiple second renderings is identified for rendering based on rendering resources associated with the second user device, and wherein each of the multiple first renderings and each of the multiple second renderings are videos.
- The system of claim 1 , wherein the perspective of the multiple first renderings is from a perspective of the first player and the perspective of the multiple second renderings is from a perspective of the second player.
- The system of claim 1 , wherein the game is turn based.
- The system of claim 1 , wherein the multiple first and multiple second renderings are similar.
- The system of claim 1 , wherein the first user interface module is further configured to generate the first user game input options in at least one menu to the first player based on the first game state information and the game rules.
- The system of claim 1 , wherein the simulation module of the processing server is further configured to generate second game state information based on the simulation results.
- The system of claim 6 , wherein the communication module of the processing server is further configured to provide updated game state information to the first and second user devices.
- The system of claim 7 , wherein the first and second user interface modules on the first and second user devices are each further configured to generate a menu to receive user input based on the second game state information and the game rules, the menu generated by the first user interface module including information different than the information included in the menu generated by the second user interface module.
- A system comprising: first game state information;and a processing server including: a communication module configured to receive first and second user selections from a first user interface module on a first user device and a second user interface module on a second user device, respectively, the first user interface module configured to generate first user game input options associated with gameplay of a multiplayer game based on the first game state information, the first user interface module further configured to present the first user game input options to a first player associated with the first user device, the first user interface module further configured to receive the first user selection associated with one or more of the first user game input options, the second user interface module configured to generate second user game input options associated with gameplay of the multiplayer game based on the first game state information, the second user interface module further configured to present the second user game input options to a second player associated with the second user device, the second user game input options being different than the first user game input options, the second user interface module further configured to receive the second user selection associated with one or more of the second user game input options;a simulation module configured to generate simulation results based on the first game state information, game rules, and the first and second user selections;and a rendering module configured to perform multiple first renderings based on the simulation results and separately perform multiple second renderings based on the simulation results, the multiple first and multiple second renderings being rendered independent of each other, the multiple first and multiple second renderings to be displayed by the first and second user devices, respectively, wherein the first user device and the processing server are configured to split 
processing resources whereby the first user interface module generates the first user game input options, and the processing server performs the multiple first renderings subsequent to receiving the first user selection from the first user device and the second user selection from the second user device, wherein the second user device and the processing server are configured to split processing resources whereby the second user interface module generates the second user game input options, and the processing server performs the multiple second renderings subsequent to receiving the first user selection from the first user device and the second user selection from the second user device, and wherein a perspective among multiple perspectives of the multiple first renderings is identified for rendering based on rendering resources associated with the first user device, or a perspective among multiple perspectives of the multiple second renderings is identified for rendering based on rendering resources associated with the second user device, and wherein each of the multiple first renderings and each of the multiple second renderings are videos.
- The system of claim 9 , wherein the perspective of the multiple first renderings is from a perspective of the first player and the perspective of the multiple second renderings is from a perspective of the second player.
- The system of claim 9 , wherein the game is turn based.
- The system of claim 9 , wherein the multiple first and multiple second renderings are similar.
- The system of claim 9 , wherein the first user interface module is further configured to generate the first user game input options in at least one menu to the first player based on the first game state information and the game rules.
- The system of claim 9 , wherein the simulation module of the processing server is further configured to generate second game state information based on the simulation results.
- The system of claim 14 , wherein the communication module of the processing server is further configured to provide the second game state information to the first and second user devices.
- The system of claim 15 , wherein the first and second user interface modules on the first and second user devices are each further configured to generate a menu to receive user input based on the second game state information and the game rules, the menu generated by the first user interface module including information different than information included in the menu generated by the second user interface module.
- A method comprising: receiving first game state information;receiving, by a processing server, first and second user selections from first and second user devices, respectively, the first user device configured to generate first user game input options associated with gameplay of a multiplayer game based on the first game state information, the first user device further configured to present the first user game input options to a first player associated with the first user device and to receive the first user selection associated with one or more of the first user game input options, the second user device configured to generate second user game input options associated with gameplay of the multiplayer game based on the first game state information, the second user device further configured to present the second user game input options to a second player associated with the second user device and to receive the second user selection associated with one or more of the second user game input options, the second user game input options being different than the first user game input options;generating, by the processing server, simulation results based on the first game state information, game rules, and the first and second user selections;performing, by the processing server, multiple first renderings based on the simulation results, the multiple first renderings to be displayed by the first user device;and performing, by the processing server, multiple second renderings based on the simulation results, the multiple second renderings being rendered separate and independent of the multiple first renderings, the multiple second renderings to be displayed by the second user device, wherein the first user device and the processing server are configured to split processing resources whereby the first user interface module generates the first user game input options, and the processing server performs the multiple first renderings subsequent to receiving the first
user selection from the first user device and the second user selection from the second user device, wherein the second user device and the processing server are configured to split processing resources whereby the second user interface module generates the second user game input options, and the processing server performs the multiple second renderings subsequent to receiving the first user selection from the first user device and the second user selection from the second user device, and wherein a perspective among multiple perspectives of the multiple first renderings is identified for rendering based on rendering resources associated with the first user device, or a perspective among multiple perspectives of the multiple second renderings is identified for rendering based on rendering resources associated with the second user device, and wherein each of the multiple first renderings and each of the multiple second renderings are videos.
- The method of claim 17 , wherein the perspective of the multiple first renderings is from a perspective of the first player and the perspective of the multiple second renderings is from a perspective of the second player.
- The method of claim 17 , wherein the game is turn based.
- The method of claim 17 , wherein the multiple first and multiple second renderings are similar.
- The method of claim 17 , wherein the first user device is configured to receive the first user selection by displaying at least one menu to the first player based on the first game state information and the game rules.
- The method of claim 17 , further comprising generating, by the processing server, second game state information based on the simulation results.
- The method of claim 22 , further comprising providing, by the processing server, the second game state information to the first and second user devices.
- The method of claim 23 , wherein the first and second user devices are each further configured to generate a menu to receive user input based on the second game state information and the game rules, the menu generated by the first user device including information different than information included in the menu generated by the second user device.
- A non-transitory computer readable medium comprising instructions, the instructions being executable by a processor for performing a method, the method comprising: receiving first game state information;receiving, by a processing server, first and second user selections from first and second user devices, respectively, the first user device configured to generate first user game input options associated with gameplay of a multiplayer game based on the first game state information, the first user device further configured to present the first user game input options to a first player associated with the first user device and to receive the first user selection associated with one or more of the first user game input options, the second user device configured to generate second user game input options associated with gameplay of the multiplayer game based on the first game state information, the second user device further configured to present the second user game input options to a second player associated with the second user device and to receive the second user selection associated with one or more of the second user game input options, the second user game input options being different than the first user game input options;generating, by the processing server, simulation results based on the first game state information, game rules, and the first and second user selections;performing, by the processing server, multiple first renderings based on the simulation results, the multiple first renderings to be displayed by the first user device;and performing, by the processing server, multiple second renderings based on the simulation results, the multiple second renderings being rendered separate and independent of the multiple first renderings, the multiple second renderings to be displayed by the second user device, wherein the first user device and the processing server are configured to split processing resources whereby the first user interface module 
generates the first user game input options, and the processing server performs the multiple first renderings subsequent to receiving the first user selection from the first user device and the second user selection from the second user device, wherein the second user device and the processing server are configured to split processing resources whereby the second user interface module generates the second user game input options, and the processing server performs the multiple second renderings subsequent to receiving the first user selection from the first user device and the second user selection from the second user device, and wherein a perspective among multiple perspectives of the multiple first renderings is identified for rendering based on rendering resources associated with the first user device, or a perspective among multiple perspectives of the multiple second renderings is identified for rendering based on rendering resources associated with the second user device, and wherein each of the multiple first renderings and each of the multiple second renderings are videos.
- The non-transitory computer readable medium of claim 25 , wherein the perspective of the multiple first renderings is from a perspective of the first player and the perspective of the multiple second renderings is from a perspective of the second player.
- The non-transitory computer readable medium of claim 25 , wherein the game is turn based.
- The non-transitory computer readable medium of claim 25 , wherein the multiple first and multiple second renderings are similar.
- The non-transitory computer readable medium of claim 25 , further comprising generating, by the processing server, second game state information based on the simulation results.
- The non-transitory computer readable medium of claim 29 , the method further comprising providing, by the processing server, the second game state information to the first and second user devices to enable the first and second user devices to each generate a menu to receive user input based on updated first game state information and the game rules, the menu to be generated by the first user device including information different than information included in the menu to be generated by the second user device.