U.S. Pat. No. 11,000,771
GAMEPLAY TELEMETRY AND VIDEO ACQUISITION SYSTEM
Assignee: Electronic Arts Inc.
Issue Date: March 30, 2017
Abstract
Systems and methods for automated acquisition of gameplay session data of a game application are disclosed. While the game application is executed in a gameplay session, embodiments of the systems and methods can acquire data associated with the game application. The data acquired during the gameplay session may be aggregated and searched across game sessions.
Description
DETAILED DESCRIPTION
Overview
Testing software applications can be done manually (by a live tester) or can be automated. The automation of software application testing can be a difficult task. It can be difficult to automate game testing, and it can be difficult to analyze the data that automated game testing generates. Automated game testing can be useful for detecting some types of operational bugs within the game, such as, for example, crashes, desyncs, game flow problems, freezing, and other types of bugs that affect the operation of the game application. However, it can be difficult for automated testing to detect qualitative issues or bugs within an application, such as, for example, audio quality, visual quality, game balance, game flow, and other types of qualitative issues that affect the software application.
Manual testing is generally used to assess qualitative issues within an application. The manual tester is required to recreate an issue, encounter, and/or situation during runtime execution of the application. However, it can be difficult for a manual tester to spend the amount of time necessary to recreate hundreds, and sometimes thousands, of different events that may need to be verified within an application. For example, a manual tester may need to verify the audio quality of dialogue within a game application. In some instances, the audio will be associated with a specific event, which may be difficult to recreate within the application. Accordingly, it may be difficult for a manual tester to recreate an event in order to verify the audio quality of the dialogue.
In order to address at least some of the difficulties with game testing, a gameplay acquisition system 130 can be used to capture and analyze video capture data associated with a game application. Game developers can obtain video and telemetry data during a gameplay session of the game application. The telemetry data can capture events that occur during runtime of the game application. The telemetry data may include, for example, a player's position, a character's physical movements, their associated timestamps, and other information of a game application. Telemetry data of a game application can be used to recreate a game state and to analyze issues during game development. The telemetry data can be presented numerically in table format, in charts, or as word descriptions.
The telemetry data can be associated with video data of the gameplay session. During the gameplay session, both telemetry data and video data are recorded. The gameplay session can have a session identifier (session ID). The telemetry data and video data of the gameplay session can both be linked to the same session ID.
The data acquired during the gameplay session may be streamed to a data analysis system and processed for further storage in a data store. The gameplay acquisition system 130 can automatically record telemetry events that occur during runtime of the gameplay session. The system can associate the telemetry data and the video data with the event based on the session ID and the timestamp of the event. Each event can be associated with a specific timestamp. After the data has been collected, the gameplay acquisition system 130 can provide a data visualization interface for a user to access the video and telemetry data. A user can search the telemetry data using the search system to identify events that occurred during gameplay sessions. The data visualization system can load the video and audio of the gameplay session at the time of a specific event. Advantageously, the search system can be used to search telemetry events recorded during different gameplay sessions. For example, in a football game, the search system can search for all touchdowns that occurred in the last two minutes of the game as the result of a run that was greater than 20 yards. The search system can also allow a user to create customizable searches for specific triggered events, such as specific telemetry data events. The system can automatically show the telemetry data and play the video data from the relevant timestamp when the user selects an event.
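The linkage described above can be sketched in a few lines of code. This is a minimal illustration, not the patent's implementation: the class and field names (`TelemetryEvent`, `GameplaySession`, the five-second lead-in) are assumptions chosen to show how a shared session ID and timestamps tie a telemetry event to a playback position in the session video.

```python
from dataclasses import dataclass, field

@dataclass
class TelemetryEvent:
    session_id: str
    event_id: str      # same type of event -> same event ID
    timestamp: float   # seconds on the session clock

@dataclass
class GameplaySession:
    session_id: str
    events: list = field(default_factory=list)
    video_start: float = 0.0  # video shares the session clock

    def record(self, event_id: str, timestamp: float) -> None:
        # Telemetry and video are linked through the shared session ID.
        self.events.append(TelemetryEvent(self.session_id, event_id, timestamp))

    def video_offset_for(self, event: TelemetryEvent, lead_in: float = 5.0) -> float:
        """Where to start playback: a defined amount of time before the event."""
        return max(0.0, event.timestamp - self.video_start - lead_in)

session = GameplaySession("session-001")
session.record("touchdown", 95.0)
event = session.events[0]
print(session.video_offset_for(event))  # 90.0
```

The `lead_in` parameter mirrors the idea, mentioned later in this disclosure, of offsetting playback to a defined amount of time before the event occurred.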
Overview of Gameplay Acquisition System 130
FIG. 1 illustrates an embodiment of a computing environment 100 for implementing a gameplay acquisition system 130. The environment 100 includes a network 108, a player computing system 102, a user computing system 103, and an interactive computing system 120. To simplify discussion and not to limit the present disclosure, FIG. 1 illustrates only one player computing system 102, one user computing system 103, and one interactive computing system 120, though multiple systems may be used.
The interactive computing system can include application host systems 122, one or more data stores 124, and a gameplay acquisition system 130. The gameplay acquisition system 130 can communicate with the data store 124 and/or with the application host systems 122 to acquire data associated with a game application. The gameplay acquisition system 130 can communicate with the user computing system 103 and the player computing system 102 through the network 108. Although only one network 108 is illustrated, multiple distinct and/or distributed networks 108 may exist.
One or more of the systems described herein may be associated with the analysis of a game application 104. For example, the player computing system 102 can be configured to run the game application 104 and acquire data for the game application 104. The user computing system 103 can be configured to show analyses of the game application 104. The interactive computing system 120 may process the data acquired by the player computing system 102 and communicate the results of the data analyses to the user computing system 103.
The player computing system 102 can be controlled by a user or a game automation system 142. The player computing system 102 can be used to acquire data of the game application 104 during gameplay sessions. The user computing system 103 can review events encountered during gameplay sessions of the game application 104. In some embodiments, a user, such as a game tester, may run a game application 104 on the player computing system 102. The data acquisition system of the player computing system 102 can capture data associated with a gameplay session of the game application 104. The captured data from gameplay sessions of the game application can be viewed and analyzed through the data visualization system 146. Users, such as game developers or customer service personnel, may use the user computing system 103 to analyze events in a player gameplay session using the data visualization system 146.
The data visualization system 146 can provide an interface for the user to search for and choose an event to review on a user computing system 103. The user computing system 103 can communicate with the interactive computing system 120 through the data visualization system 146 in order to retrieve video data associated with the event. The video data associated with the event can be retrieved and can be displayed at the specific time that the event occurs during the gameplay session. For example, a user may want to view the in-game camera angle or dialogue associated with a specific event. The user may search for the event, and the data visualization system 146 can display the video of the gameplay session at a specific time associated with the event. In some embodiments, the time shown in the video can be offset by a defined amount of time before the event occurred.
For purposes of the present disclosure, the term “player” or “player system” can refer to a game application that is being operated by a person or automated by the game automation system (also referred to as a “bot”) during a gameplay session. For example, the player computing system 102 can refer to the computing system operated by the player or bot providing gameplay session data to the interactive computing system 120 through the data acquisition system 150. The term “user” can refer to a person that is accessing the gameplay acquisition system 130 to view gameplay data associated with a player's gameplay sessions through the data visualization system 146. Though illustrated as separate systems, the player computing system 102 and the user computing system 103 can be the same computing system depending on whether the system is providing the gameplay session data or accessing the gameplay session data through the data visualization system 146.
A. Interactive Computing System
In the illustrated embodiment, the interactive computing system 120 includes application host systems 122, a data store 124, and a gameplay acquisition system 130. These systems may communicate with each other. For example, the gameplay acquisition system 130 can obtain data associated with a game application from the application host systems 122 and can store such data in the data store 124. The application host systems 122 can communicate with the data store 124 to execute and/or host a game application. In certain embodiments, the interactive computing system 120 may be associated with a network-based video service.
1. Application Host Systems
The application host systems 122 can be configured to execute a portion of the game application 104 and/or host application 106. In certain embodiments, the application host systems 122 may execute another application instead of or in addition to executing a portion of the game application 104 and/or host application 106, which may complement and/or interact with the game application 104 during execution of a gameplay session of the game application 104. Further details regarding application host systems are described below.
The interactive computing system 120 may enable multiple players or computing systems to access a portion of the game application 104 and/or host application 106. In some embodiments, the portion of the game application 104 executed by the application host systems 122 of the interactive computing system 120 may create a persistent virtual world. This persistent virtual world may enable one or more players to interact with the virtual world and with each other in a synchronous and/or asynchronous manner. In some cases, multiple instances of the persistent virtual world may be created or hosted by the interactive computing system 120. A set of players may be assigned to or may access one instance of the persistent virtual world while another set of players may be assigned to or may access another instance of the persistent virtual world. In some embodiments, the application host systems 122 may execute a hosting system for executing various aspects of a game environment. For example, in one embodiment, the game application 104 may be a competitive game, such as a first person shooter or sports game, and the application host systems 122 can provide a dedicated hosting service for hosting multiplayer game instances or facilitate the creation of game instances hosted by player computing devices. In some embodiments, the application host systems 122 can provide a lobby or other environment for players to virtually interact with one another. Such environments may include environments for conducting transactions between players, such as an auction house or other type of environment for facilitating transactions.
2. Gameplay Acquisition System
As described with respect to other systems in FIG. 1, the gameplay acquisition system 130 can communicate with other systems to acquire data associated with a game application and to analyze the data. The gameplay acquisition system 130 can include one or more systems for data acquisition and analysis. For example, the gameplay acquisition system 130 can include a data visualization system 146, a telemetry data acquisition system 134, a search system 136, a video acquisition system 138, a game automation system 142, and a data analysis system 144. These example systems are not intended to be limiting, and the gameplay acquisition system 130 may include fewer or more systems than described. For example, in some embodiments, the interactive computing system may interface with a bug reporting system. In some embodiments, the gameplay acquisition system 130 may include more systems and/or functionalities that facilitate the acquisition of game data and the analysis of a game application.
The gameplay acquisition system 130 and its various systems may be distributed across multiple computing systems. The various systems of the gameplay acquisition system 130 can communicate with each other to obtain and analyze data associated with a game application. For example, a portion of the video acquisition system 138 may be executed by the player computing system 102, while another portion of the video acquisition system 138 may be executed by the interactive computing system 120. The video acquisition system 138 of the gameplay acquisition system 130 may communicate with the video acquisition system 138 of the player computing system 102 to acquire video data. The video acquisition system 138 of the interactive computing system 120 may generate a session ID for a particular game session. The video acquisition system 138 of the player computing system 102 may be a plug-in to the game application 104 and may acquire video data of the game's execution. In some embodiments, the video acquisition system 138 may be entirely implemented by the player computing system 102. In some embodiments, the computing device may have a video acquisition system that is independent of the gameplay acquisition system 130 and the game application. For example, a game console may acquire gameplay videos and communicate the acquired video to the data visualization system 146. Some example interactions between the various systems of the gameplay acquisition system 130 are further illustrated in FIGS. 2A and 2B. Each system of the gameplay acquisition system 130 is described in more detail below.
a. Video Coordination System
The video coordination system 132 can be configured to coordinate capture of video data and telemetry data associated with gameplay sessions. The video coordination system 132 can be configured to communicate with the game automation system 142 and the video acquisition system 138 to coordinate the capture of the data streams from the gameplay session. The coordination system can send a command to the video acquisition system to initiate capture of the gameplay session.
A gameplay session may be associated with a start time timestamp and/or an end time timestamp. The start time may be the time when the game application begins recording. The end time may be the time when the recording of the game application is terminated, for example, by a player or by a crash. The gameplay session can have timestamps for each event recorded during the gameplay session. During the gameplay session, the various systems may simultaneously acquire data of the game application. For example, the telemetry data acquisition system may acquire telemetry data of the gameplay session while the video acquisition system acquires video data of the gameplay session.
The gameplay session can be associated with a session ID. The session ID can be unique to a gameplay session. The session ID may be generated by the coordination system, or it may be generated by one or more systems described herein, such as, for example, the player computing system 102, the application host systems 122, other systems of the gameplay acquisition system 130, or the like. The session ID may be based on a variety of information such as, for example, an IP address, a timestamp when the game application begins to be executed, information associated with the computing system, and/or other information. The session ID may be used to link data acquired by different systems during the gameplay session. Advantageously, in some embodiments, data acquired by various systems may be further linked together using the timestamps of the gameplay session.
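One way to derive a unique session ID from the kinds of inputs listed above (an IP address, the session start timestamp, and machine information) is to hash them together. The hashing scheme below is an assumption for illustration; the patent does not specify how the ID is constructed, only what information it may be based on.

```python
import hashlib

def make_session_id(ip_address: str, start_time: float, machine_info: str) -> str:
    # Combine the identifying inputs into a single string, then hash it so
    # the same session always maps to the same ID and different sessions
    # are overwhelmingly likely to get different IDs.
    raw = f"{ip_address}|{start_time}|{machine_info}"
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()[:16]

sid = make_session_id("10.0.0.5", 1700000000.0, "console-devkit-7")
print(sid)  # deterministic 16-character hex string
```

Because the ID is deterministic, any system that observes the same inputs can independently compute the same session ID, which is one way the data streams from different systems could be linked.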
b. Telemetry Data Acquisition System
The telemetry data acquisition system 134 can be configured to acquire telemetry data during a gameplay session. Telemetry data of a game application can be used to identify events that occur during the gameplay session. The telemetry data acquisition system 134 can be configured to record defined events that are triggered during the gameplay session. Telemetry data may include data specific to the game application such as, for example, character movements, character positions, character actions (for example, firing a gun, shooting a basketball, and the like), in-game events (for example, an enemy's death, the start of a play, a point being scored), player inputs (for example, buttons pressed), in-game camera position, character dialogue, and the like. The telemetry data may also define one or more segments within a game application. A segment can define a start time and an end time. A segment can include all events that trigger between the start time and the end time. For example, in a football game, the segments may include individual plays and quarters.
In some embodiments, the telemetry data acquisition system can record a defined event that is triggered when defined criteria have been satisfied. For example, an event may be scripted to trigger only after the movement speed of a character within the game increases above a threshold. Each event can have an event identifier (ID), where the same types of events have the same event ID. For example, in a football game, each touchdown has the same event ID. The event ID can be used to search for events in any gameplay session for the game application.
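The criteria-gated trigger described above can be sketched as follows. The threshold value, event ID string, and record schema are illustrative assumptions; the point is simply that the event is recorded only when the defined criterion (here, movement speed above a threshold) is satisfied, and that every occurrence shares the same event ID.

```python
SPEED_THRESHOLD = 8.0  # hypothetical movement-speed threshold

def maybe_emit_speed_event(speed: float, timestamp: float, events: list) -> None:
    """Record a telemetry event only when the trigger criterion is met."""
    if speed > SPEED_THRESHOLD:
        # Same type of event always carries the same event ID, so it can be
        # searched for across any gameplay session of the game application.
        events.append({"event_id": "EVT_SPEED_BURST",
                       "timestamp": timestamp,
                       "speed": speed})

events = []
for t, speed in [(0.0, 3.0), (1.0, 9.5), (2.0, 7.9)]:
    maybe_emit_speed_event(speed, t, events)
print(len(events))  # 1
```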
In some embodiments, the telemetry data acquisition system 134 may also acquire system information associated with operation of the game application. The system information may include performance data such as, for example, frame rate, CPU or memory utilization rate, the machine on which the game application is executed, and the like.
Telemetry data can be stored in a variety of database formats. The telemetry data can be transformed into statistics for analyzing and debugging the game application. The telemetry data may be transformed or organized to show various metrics or statistics associated with a game application, such as, for example, average completion time as a function of individual game level, average weekly bug rate, revenue per day, number of active players per day, and so on. These statistics can sometimes be referred to as game metrics.
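One of the game metrics named above, average completion time as a function of individual game level, can be computed from raw telemetry with a simple aggregation. The row schema (`level`, `completion_time`) is an assumption for illustration, not a format the patent prescribes.

```python
from collections import defaultdict

# Hypothetical raw telemetry rows, one per completed level attempt.
rows = [
    {"level": 1, "completion_time": 120.0},
    {"level": 1, "completion_time": 100.0},
    {"level": 2, "completion_time": 300.0},
]

def avg_completion_by_level(rows):
    """Aggregate raw telemetry into a per-level average completion time."""
    totals = defaultdict(lambda: [0.0, 0])  # level -> [sum, count]
    for row in rows:
        totals[row["level"]][0] += row["completion_time"]
        totals[row["level"]][1] += 1
    return {level: total / count for level, (total, count) in totals.items()}

print(avg_completion_by_level(rows))  # {1: 110.0, 2: 300.0}
```

The other metrics mentioned (weekly bug rate, daily active players, and so on) would follow the same pattern: group the telemetry rows by some key, then reduce each group.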
The telemetry data acquisition system 134 may associate telemetry data acquired during a gameplay session with the session ID. The telemetry data can be associated with the video data using the session ID.
c. Video Acquisition System
The video acquisition system 138 can record execution of a gameplay session of the game application 104. The video acquisition system 138 can record the video data and audio data output by the video game application as one or more data streams. For example, the video acquisition system 138 may gather video data such as, for example, moving visual images of the gameplay (for example, objects' movements), audio data (for example, the sound of bombing or shooting), system notifications, dialogues, interactions with items, messages among the players, player commentary, webcam footage, or the like. The video acquisition system 138 can record the screen of the player computing system 102 during a gameplay session. In some embodiments, the video acquisition system 138 may be configured to acquire video data associated with multiple views of a gameplay session. For example, a game application may record a plurality of video capture streams within a gameplay session even though only a single view is displayed on the player computing system at a given time.
The video acquisition system 138 can be configured to interface with different types of player computing systems. Each type of gaming console and operating system can have different types of interfaces that are used to record a gameplay session. In some embodiments, the console may already include video recording software that can record the gameplay session and provide it to the video acquisition system 138. In some embodiments, the video acquisition system 138 can generate instances, such as virtual machine instances, for each player computing system that are responsible for interfacing with the player computing system and retrieving the gameplay session. In some embodiments, video and audio data may be acquired by separate software and/or hardware.
In some embodiments, the data acquired by the video acquisition system 138 may be streamed live to the data visualization system 146 and stored in temporary storage prior to analysis and encoding by the video processing system 140. For example, the video acquisition system 138 may gather video data and transmit the video data to the interactive computing system 120. The video acquisition system 138 may record the gameplay session, store the recorded video, such as in data store 126, and transmit the recorded data to other computing systems at a later time.
As described with reference to the telemetry data acquisition system, the video data may be linked to a gameplay session based on the session ID. During the gameplay session, the video data acquired by the video acquisition system 138 may be associated with a start timestamp and/or an end timestamp. The start timestamp can be an absolute value based on the time that the gameplay session was initiated (such as 8:30:00.05 am). Each timestamp recorded during the gameplay session can also be based on the absolute time. For example, the end timestamp may be 8:31:00.43 am. In some embodiments, timestamps may be generated and/or expressed relative to the start timestamp of the video. For example, a one-minute video of the gameplay session may have timestamps between 0 seconds and 60 seconds.
In between the start timestamp and end timestamp, the video data may be divided into a plurality of frames. The timestamps may be used to refer to specific frames within the video. In some embodiments, the gameplay acquisition system 130 can associate an event with certain timestamps of the video data. The event can further be associated with other data, such as telemetry data, in the gameplay session using the session ID and timestamps. For example, the gameplay acquisition system 130 may record a shooting game application in a gameplay session. During a shooting game, an event may trigger when a player shoots a weapon. The gameplay acquisition system 130 can identify the timestamp when this event occurs. The gameplay acquisition system 130 can associate video frames and telemetry data for this event based on the timestamps.
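The absolute-to-relative timestamp conversion and the timestamp-to-frame mapping described above can be sketched as a small helper. The 30 fps frame rate is an illustrative assumption; the patent does not fix one.

```python
FPS = 30.0  # assumed frame rate for illustration

def event_to_frame(session_start: float, event_time: float, fps: float = FPS):
    """Return (seconds into the video, nearest frame index) for an event.

    Both times are absolute values on the same clock, as in the example of
    a session starting at 8:30:00 am; the relative offset indexes the video.
    """
    offset = event_time - session_start
    return offset, int(round(offset * fps))

# A session starting at 8:30:00 am, expressed as seconds-of-day:
start = 8 * 3600 + 30 * 60
shot_fired = start + 12.5  # weapon-fired event 12.5 s into the session
offset, frame = event_to_frame(start, shot_fired)
print(offset, frame)  # 12.5 375
```

Because telemetry events and video frames share the session clock, the same offset retrieves both the telemetry record and the video frame for the event.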
d. Video Processing System
The video processing system 140 can be configured to encode video and audio data associated with gameplay sessions for final storage in a data store (such as data store 124). The video processing system 140 can retrieve a completed gameplay session video from a temporary storage location and encode the video into a standard file format, such as, for example, an MP4 file. The video processing system can be configured to curate video files within the data store and delete videos that have been in the data store for a defined period of time. The video processing system 140 can associate each video with a session ID associated with the gameplay session.
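The curation behavior described above (deleting videos that have been in the store for a defined period of time) can be sketched as a retention filter. The store layout, field names, and two-day retention period are assumptions for illustration only.

```python
RETENTION_SECONDS = 2 * 24 * 3600  # hypothetical two-day retention period

def curate(store: dict, now: float, retention: float = RETENTION_SECONDS) -> dict:
    """Keep only videos whose stored_at time is within the retention window."""
    return {sid: v for sid, v in store.items()
            if now - v["stored_at"] <= retention}

now = 1_000_000.0
store = {
    "s1": {"path": "s1.mp4", "stored_at": now - 3 * 24 * 3600},  # past retention
    "s2": {"path": "s2.mp4", "stored_at": now - 3600},           # still fresh
}
store = curate(store, now)
print(sorted(store))  # ['s2']
```

Keying the store by session ID also reflects how the processed video stays linked to its gameplay session.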
e. Game Automation System
The game automation system 142 can be a system configured to automate gameplay of a gameplay session using an artificial intelligence (AI) system. The game automation system is an optional component and can be used during automated testing. The game automation system 142 can include one or more virtual game agents that are configured to play the video game application 104. A game agent can have certain characteristics. A game agent can automatically make decisions at various states of the video game application 104. The game automation system 142 may have various types of agents, with each type of agent being associated with defined characteristics. In some embodiments, the game agent's characteristics may include a level (also referred to as an AI level). For example, the AI level may include qualitative grades, such as low, medium, or high, or quantitative grades, such as, for example, a scale of 1 to 10 with 10 being the hardest (or easiest). The characteristics of the agents may also include personas, where agents may be trained to mimic the choices of a person at a particular state.
At each state within the video game application, a game agent in the game automation system 142 can take an action based on its operational characteristics. The action may be a random action chosen from a pool of available actions at that state, a predefined option, or an informed action based on the available information (such as, for example, data received in the current and/or previous gameplay sessions).
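The action-selection policy above can be sketched as follows. The class shape, the `preferred` parameter standing in for a predefined or informed choice, and the seeding are illustrative assumptions; the point is the fallback from an informed choice to a random choice from the pool of legal actions at the current state.

```python
import random

class GameAgent:
    """Minimal sketch of a virtual game agent's action selection."""

    def __init__(self, level: str = "medium", seed: int = 0):
        self.level = level          # the agent's AI level characteristic
        self.rng = random.Random(seed)

    def choose_action(self, available: list, preferred: str = None) -> str:
        # Take the predefined/informed action when it is legal at this state;
        # otherwise fall back to a random action from the available pool.
        if preferred is not None and preferred in available:
            return preferred
        return self.rng.choice(available)

agent = GameAgent(level="high")
action = agent.choose_action(["pass", "run", "kick"], preferred="run")
print(action)  # run
```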
In some game applications, multiple game agent systems may be used to simulate gameplay. For example, in a football video game, each game agent may play a team. As another example, in a single-player turn-based game, a game agent may simulate the gameplay as if it were a human. Each game automation system 142 may include game agents with similar types or characteristics. For example, one game automation system 142 may include learning agents, while another game automation system 142 may include random agents. A game automation system 142 may also include game agents with mixed types or characteristics. For example, the game automation system 142 may include static agents for farming and learning agents for attacking or defending.
The game automation system 142 can execute within the video game application 104 by interfacing with the API of the video game application 104. For example, a game agent of the game automation system 142 can perform an action in the video game by calling the API associated with the action. Additionally or alternatively, the game automation system 142 can interact with the video game application 104 by simulating an action of a peripheral device. For example, the game automation system 142 (and/or the game agents of the game automation system 142) can simulate a mouse click, a mouse movement, a keyboard stroke, or other movements of a game controller. The graphics of the video game application 104 may not be rendered when the game automation system 142 interacts with the video game application 104 via the API. When the game automation system 142 executes within the video game application 104 by simulating an action of a peripheral device, the video game application 104 may be configured to render graphics. For example, a game agent may be configured to only click on a certain spot of the screen.
The game automation system 142 can include a game automation schedule that can be used to coordinate the automation of gameplay sessions on various computing systems 102. The automation schedule can be used to control the parameters used for each automated gameplay session, such as, for example, the type of agent, the duration of the gameplay session, the level within the game application, the number of virtual agents within the game, and any other characteristics associated with an automated gameplay session. The automated gameplay session may coordinate and provide instructions to initiate hundreds of different systems using different architectures (for example, an XBOX console or a Sony PlayStation console) and various types of games. The automation schedule can be configured by users to acquire specific types of gameplay data associated with a game application. For example, a game application may need to validate and verify various aspects of the game during quality assurance testing. The quality assurance schedule may include validation of audio recordings at specific points within the game. The automation schedule may be configured to execute the game application 104 so that the specific gameplay sessions that include the required audio recordings are recorded.
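The per-session parameters the schedule controls, listed above, map naturally onto a record type. The field names and sample values below are illustrative assumptions, not a format from the patent.

```python
from dataclasses import dataclass

@dataclass
class ScheduleEntry:
    """One entry in a hypothetical game automation schedule."""
    agent_type: str        # e.g. "random", "learning", "static"
    duration_minutes: int  # length of the automated gameplay session
    level: str             # level/area within the game application to exercise
    num_agents: int        # number of virtual agents in the session
    platform: str          # e.g. "xbox", "playstation", "pc"

schedule = [
    ScheduleEntry("learning", 30, "stadium_night", 22, "xbox"),
    ScheduleEntry("random", 15, "tutorial", 1, "pc"),
]
print(len(schedule))  # 2
```

A coordinator could iterate over such entries to dispatch sessions to the appropriate console or PC, recording each one under its own session ID.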
As further described with reference to the telemetry data acquisition system 134, as a game automation system 142 plays the video game application 104, the events generated by the game agent's actions, as well as the results of those actions, can be recorded by the telemetry data acquisition system 134. Although only one game automation system 142 is shown in the illustrated example, the computing system 102 can simulate a gameplay session by running multiple game agent systems in series or in parallel, by running multiple instances of the video game or multiple agents within a video game.
f. Data Analysis System
The data analysis system 144 can analyze data associated with a game application and generate game metric information. The data analysis system 144 may perform data analyses after a gameplay session. For example, the data analysis system 144 can obtain telemetry data from the data store 124. The data analysis system 144 can generate reports based on the acquired telemetry data.
The data analysis system 144 can analyze data across multiple gameplay sessions. The data analysis system 144 may retrieve data from the data store 124 associated with multiple session IDs. In some implementations, other identifiers may be used to obtain data. The data analysis system 144 may use event IDs and/or timestamps in a recorded video to retrieve data.
The data analysis system 144 can communicate with the data visualization system 146 and present the game metrics data in a variety of formats such as, for example, graphs (for example, pie charts, heat maps, line graphs), tables, word descriptions, or the like. In some embodiments, the data analysis system 144 may contain multiple modules, where each module is for a certain type of analysis. For example, one module may be used to generate graphs for game metrics data. These modules may be implemented as plugins to the game application 104 and/or to the data visualization system 146. The user can customize the data analysis system 144, for example, by adding or removing one or more modules.
g. Search System
The search system 136 can communicate with various systems of the gameplay acquisition system 130 and/or the interactive computing system 120. For example, the search system 136 may communicate with the data visualization system 146 to receive one or more search criteria. The search system can communicate with the data store 124 and look up information based on the search criteria. The search system 136 can transmit information obtained from the data store 124 to the data visualization system 146 for the user to view.
The search system 136 can look for one or more events in a gameplay session. The search system 136 can also look for information, such as specific types of events, across multiple gameplay sessions. For example, the search system 136 can search for information associated with every touchdown during the last two minutes of a football game. The search system 136 can also search for all recorded videos in the past two days.
h. Data Visualization System
The data visualization system 146 can generate a user interface for a user to view data analyses associated with the game application 104. The user interface may include game telemetry data, a recorded video of a game session, a filtering tool, a search tool, or the like. The filtering tool and/or the search tool may be configured to receive user input and filter data based on the user input. The user interface may be rendered through a web interface (such as a webpage) and/or in an application locally installed on a computing device. Example embodiments of a user interface of the data visualization system are illustrated in FIGS. 5 and 6.
The data visualization system 146 may generate user interface data using one or more user interface templates. With reference to FIG. 2, a user interface template can have multiple modules, such as, for example, an event information module 250, a visual game information module 270, a video module 280, an interface control module 260, or the like. The data visualization system 146 may populate the modules using information received from other systems of the gameplay acquisition system 130.
3. Data Store
The interactive computing system 120 can include a data store 124. The data store 124 can be configured to store data acquired by other systems, such as, for example, telemetry data, video data, user data, or the like. The data store may be distributed across multiple computing devices (see, for example, computing device 700 in FIG. 7). In some embodiments, the data store 124 may be a network-based storage system where data may be stored in different locations.
B. Player Computing System
The player computing system 102 may include hardware and software components for establishing communications over a communication network 108. For example, the player computing system 102 may be equipped with networking equipment and network software applications (for example, a web browser) that facilitate communications via one or more networks (for example, the Internet or an intranet). The player computing system 102 may have varied local computing resources such as central processing units and architectures, memory, mass storage, graphics processing units, communication network availability and bandwidth, and so forth. Further, the player computing system 102 may include any type of computing system. For example, the player computing system 102 may include any type of computing device(s), such as desktops, laptops, game application platforms, virtual reality systems, augmented reality systems, television set-top boxes, televisions (for example, Internet TVs), network-enabled kiosks, car-console devices, computerized appliances, wearable devices (for example, smart watches and glasses with computing functionality), and wireless mobile devices (for example, smart phones, PDAs, tablets, or the like), to name a few. In some embodiments, the player computing system 102 may include one or more of the embodiments described below with respect to FIG. 9.
1. Game Application and Host Application System
The player computing system 102 is capable of executing a game application 104, which may be stored and/or executed locally and/or in a distributed environment. A locally executed game application 104 generally does not rely on or utilize an external computing system (for example, the interactive computing system 120) to execute the game application. In some instances, a locally executable game can communicate with an external server to retrieve information associated with the game, such as game patches, game authentication, cloud saves, or other features. In distributed game applications, the player computing system 102 may execute a portion of a game, and the interactive computing system 120, or an application host system 122 of the interactive computing system 120, may execute another portion of the game. For instance, the game may be a massively multiplayer online role-playing game (MMORPG) that includes a client portion executed by the player computing system 102 and a server portion executed by one or more application host systems 122. For the present discussion, the game application 104 can be a locally executable game, a distributed application, or an application that includes a portion that executes on the player computing system 102 and a portion that executes on at least one of the application host systems 122.
2. Data Acquisition System
Data acquisition system 150 may be used to acquire data associated with a player and/or game application 104. Data acquisition system 150 can interface with the systems of the interactive computing system, such as the video acquisition system 138 and the telemetry data acquisition system 134, in order to transfer data during gameplay sessions. The data acquisition system 150 can acquire telemetry data of a game application (using telemetry data acquisition system 134), video data of the gameplay (using video data acquisition system 138), and/or other types of data.
The various systems of the data acquisition system 150 may be implemented in hardware, software, or a combination of the two. For example, the systems can be software plug-ins to the game application 104, host application system 106, and/or application host systems 122. One or more of the systems can also be a standalone application that can communicate with the game application 104, host application system 106, and/or application host systems 122.
C. User Computing System
The user computing system 103 can be implemented by a computing device as described with reference to FIG. 7. The user computing system 103 can comprise the data visualization system 146. The user computing system 103 can communicate with the player computing system 102 and/or the interactive computing system 120 through a network 108. In some embodiments, the user computing system 103 may be a part of the player computing system 102 or the interactive computing system 120.
The data visualization system 146 of the user computing system 103 can include a user interface. As described with reference to FIG. 2, the user interface can display one or more game metrics. The game metrics may be shown in various formats such as tables, word descriptions, graphs, or the like. The user interface can display a game environment map showing, for example, a player's location or movement direction at a certain timestamp. The user interface can show a video acquired by the video acquisition system 138 during a gameplay session.
The data visualization system 146 may be implemented in a variety of ways such as, for example, a website, a mobile page, a plug-in to an application (such as, for example, a debug application, a game application, or a host application), and so on. The data visualization system 146 will be described in more detail below.
D. Other Considerations
Although the various systems are described separately, it should be noted that one or more of these systems may be combined together. For example, the user computing system 103 may be combined with the player computing system 102. In another example, the search system 136 may be a part of the data visualization system 146. Additionally, one or more of the systems may be executed by the same computing device (see, for example, computing device 700 in FIG. 7). For example, the user computing system 103 may be executed on the same computing device as the player computing system 102.
On the other hand, one or more systems may be executed by multiple computing devices. For example, a portion of the data visualization system 146 may be implemented by a player's personal computer while another portion may be implemented by a server.
Example Embodiments of Gameplay Session Capture Process
FIGS. 2A and 2B illustrate embodiments of interactions between the various components of the gameplay acquisition system 130 and the player computing system when acquiring data streams of a gameplay session. FIG. 2A illustrates an embodiment with the game automation system 142 that would be used during automated testing of the game application. FIG. 2B illustrates an embodiment without the game automation system 142 that would be used during manual testing of the game application.
With specific reference to FIG. 2A, at (1), during automated testing, the game automation system 142 can communicate with the computing system 102 in order to initiate a gameplay session of the game application 104. The game automation system 142 can initiate the gameplay session in accordance with a defined automation schedule, which can control the parameters used for the automated gameplay session, such as, for example, the type of agent, the duration of the gameplay session, the level within the game application, the number of virtual agents within the game, and any other characteristics associated with an automated gameplay session. Though the illustrated embodiment includes only a single computing system, the game automation system can control a plurality of computing systems. The game automation system 142 can also communicate with the coordination system 132 to indicate that the gameplay session is being initiated.
At (2), the coordination system 132 can provide instructions to the video acquisition system to initiate capture of the video stream and to the telemetry data acquisition system 134 to initiate capture of the telemetry data stream. In some embodiments, the coordination system 132 can associate and/or instantiate one or more virtual machines in order to capture the data from the gameplay session. For example, a virtual machine may be associated with the video acquisition system 138 and a virtual machine may be associated with the telemetry data acquisition system 134. In some embodiments, the coordination system 132 can communicate directly with the computing system to initiate capture of the gameplay session. In such embodiments, the computing system 102 may have a video acquisition system that is integrated into the computing system.
At (3), the video acquisition system 138 and the telemetry data acquisition system 134 can acquire video data streams and telemetry data streams, respectively. The video acquisition system 138 can record the video data and audio data output by the video game application as one or more data streams. For example, the video acquisition system 138 may gather video data such as, for example, moving visual images of the gameplay (for example, an object's movements), audio data (for example, the sound of bombing or shooting), system notifications, dialogues, interactions with items, messages among the players, player commentary, web cam footage, or the like. The telemetry data acquisition system 134 can be configured to record defined events that are triggered during the gameplay session. Telemetry data may include data specific to the game application such as, for example, a character's movements, a character's positions, character actions (for example, firing a gun, shooting a basketball, and the like), in-game events (for example, an enemy's death, the start of a play, a point being scored), player inputs (for example, buttons pressed), in-game camera position, character dialogue, and the like. The telemetry data may also define one or more segments within a game application. The telemetry data can be associated with the specific gameplay session by a session ID.
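Tagging each telemetry event with the session ID, so that it can later be joined with the video stream captured for the same session, might be sketched as follows. The class, method, and field names here are illustrative assumptions; the patent does not specify a data format or API.

```python
class TelemetryRecorder:
    """Minimal sketch of a telemetry data stream tagged with a session ID.

    All names here are illustrative; the patent does not specify an API.
    """

    def __init__(self, session_id):
        self.session_id = session_id
        self.events = []

    def record_event(self, event_id, timestamp, **data):
        # Each recorded event carries the session ID so it can later be
        # associated with the video recorded for the same gameplay session.
        self.events.append({
            "session_id": self.session_id,
            "event_id": event_id,
            "timestamp": timestamp,
            "data": data,
        })


recorder = TelemetryRecorder(session_id="session-42")
recorder.record_event("TOUCHDOWN", timestamp=647.0, player="P1", yards=25)
recorder.record_event("PLAY_START", timestamp=630.0)
```

Because every event record carries the session ID, downstream systems such as the data analysis system can retrieve all events for one session, or search for the same event ID across many sessions.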
At (4), the video processing system 140 can encode video and audio data associated with gameplay sessions for final storage. The video processing system 140 can retrieve the completed gameplay session video from a temporary storage location and encode the video into a standard file format, such as, for example, an MP4 file.
At (5), the processed and encoded video file of the gameplay session is transferred to a data store, such as data store 124, for final storage. At (6), the telemetry data associated with the gameplay session is stored in a data store, such as data store 124, for storage. The telemetry data and the video data can be stored in different data stores. The telemetry data and the video data can be accessed by the other systems, such as the data visualization system 146.
With specific reference to FIG. 2B, at (1), during manual testing, the coordination system 132 can communicate directly with the computing system in order to coordinate acquisition of the video and telemetry data streams during the gameplay session. In some embodiments, the computing system can have a coordination system module or interface that can be accessed by a manual tester in order to initiate a gameplay session. The other interactions are substantially the same as those described above with reference to FIG. 2A.
Example Embodiments of Gameplay Telemetry Data
FIGS. 3A, 3B, and 3C illustrate embodiments of telemetry data and video data associated with a gameplay session. FIG. 3A provides an example embodiment of data for an event 302 recorded during the gameplay session. A timestamp 304 for the event 302 provides the time when the event was recorded during the gameplay session. In this embodiment, the timestamp is an absolute time. The sequence of events 310 is illustrated in sequential order. The event 302 corresponds to the video data 320. The playback time 308 of the identified event 302 within the video can be determined based on the video start timestamp 306 and the event timestamp 304. In this instance, the playback time 308 of the event 302 is 10 minutes and 47 seconds after the start of the video.
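The playback time calculation described above can be expressed directly: subtracting the video start timestamp from the event timestamp yields the offset of the event within the recorded file. The function name and the timestamp values below are illustrative, not taken from the patent.

```python
def playback_offset(event_timestamp, video_start_timestamp):
    """Offset (in seconds) of an event within the recorded video.

    Both timestamps are absolute times in seconds; names are illustrative.
    """
    return event_timestamp - video_start_timestamp


# An event recorded 647 seconds after the video started plays back
# at 10 minutes and 47 seconds into the file, matching the example above.
offset = playback_offset(event_timestamp=1647.0, video_start_timestamp=1000.0)
minutes, seconds = divmod(int(offset), 60)
```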
The event data can be specific to the game application, such as, for example, a character's movements, a character's positions, character actions (for example, firing a gun, shooting a basketball, and the like), in-game events (for example, an enemy's death, the start of a play, a point being scored), player inputs (for example, buttons pressed), in-game camera position, character dialogue, and the like.
In some embodiments, the telemetry data acquisition system can record a defined event that is triggered to occur when defined criteria have been satisfied. For example, an event may be scripted to trigger only after the movement speed of a character within the game increases above a threshold. Each event can have an event identifier (ID), where the same types of events have the same event ID. For example, in a football game, each touchdown can have the same event ID. The event ID can be used to search for events across gameplay sessions for a game application. The event data can be any type of data associated with a triggered event within the game application.
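A criteria-triggered event of the kind described above, one that fires only when a character's movement speed rises above a threshold, might be sketched as follows. The function, event ID, and sample values are all hypothetical.

```python
def check_speed_event(samples, threshold, event_id="SPEED_EXCEEDED"):
    """Emit a defined event the first time character speed crosses a threshold.

    `samples` is a list of (timestamp, speed) pairs. Illustrative only; the
    patent describes scripted trigger criteria in general terms.
    """
    for t, speed in samples:
        if speed > threshold:
            # The event carries its event ID so that the same type of event
            # can be searched for across gameplay sessions.
            return {"event_id": event_id, "timestamp": t, "speed": speed}
    return None


samples = [(0.0, 3.1), (0.5, 4.8), (1.0, 7.2), (1.5, 6.9)]
event = check_speed_event(samples, threshold=6.0)
```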
FIG. 3B provides an illustrative example of data for a segment 312 captured during a gameplay session. A segment 312 may also be referred to as a segment event. A gameplay session may be divided into one or more segments. A game application can include different types of segments. For example, in a football game, the segments may be a play or a quarter. In some embodiments, each event can be a segment. A segment can define a start time. In some embodiments, the segment may also include an end time. The segment can be used to define searchable groupings of data within the game application. In some embodiments, a segment may include all events that trigger between the start time and the end time of the segment. The events may include a reference to an associated segment. A segment start timestamp can be used to identify a start time for replay of the video file. The telemetry data acquisition system and/or the search system can identify the closest segment that occurs prior to the triggering of an identified event. The start timestamp of the segment can be used as a start time for viewing the video. For example, in a football game, the start time of the play in which an identified event occurs can be used as the start time for viewing a playback of the gameplay session.
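Finding the closest segment that begins before an identified event, and using its start timestamp as the replay start time, can be sketched as below. The segment data layout (ID and start-timestamp pairs) is an assumption for illustration.

```python
def replay_start(segments, event_timestamp):
    """Return the closest segment that begins at or before the event.

    `segments` is a list of (segment_id, start_timestamp) pairs; the names
    are illustrative, not taken from the patent.
    """
    candidates = [s for s in segments if s[1] <= event_timestamp]
    if not candidates:
        return None
    # The latest-starting segment not after the event is the one the
    # event falls inside; its start timestamp becomes the replay start.
    return max(candidates, key=lambda s: s[1])


segments = [("play-1", 0.0), ("play-2", 35.0), ("play-3", 80.0)]
# A touchdown event at t=70 falls inside play-2, so playback starts at 35.0.
seg = replay_start(segments, event_timestamp=70.0)
```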
FIG. 3C provides an illustration of a portion of the events that occurred during a gameplay session. Segments can be used to identify a replay time of a video from a gameplay session. In the illustrated example, each column identifies search criteria being used to search for the occurrence of specific events within a gameplay session. In the illustrated example, the game is a football game. The search system can use the event data to perform complex searches in order to identify occurrences of events, characteristics associated with the events, and other event data associated with the gameplay session. In this example, the first column 322 identifies all the touchdown events that occurred during the gameplay session, of which there were three. The search system can also identify segments associated with each event. The second column 324 identifies plays that resulted in 20 or more yards being gained by the player, of which there were also three. Two of the three plays resulted in touchdowns. The third column 326 identifies segments of the game that were played within the final two minutes of a quarter. The fourth column 328 identifies segments of the game that were played in snowy conditions. The fifth column 330 provides the segments that satisfy all of the identified search criteria. The search criteria identified only a single segment. The matching segment can be output by the search system within an interface within the data visualization system.
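The column-by-column narrowing in FIG. 3C amounts to intersecting several per-segment criteria. A minimal sketch, with hypothetical field names standing in for the recorded event data:

```python
def search_segments(segments, *criteria):
    """Return segments satisfying every search criterion (set intersection).

    Mirrors the column-by-column narrowing in FIG. 3C; field names are
    hypothetical.
    """
    return [s for s in segments if all(c(s) for c in criteria)]


segments = [
    {"id": 1, "touchdown": True,  "yards": 25, "final_two_min": True,  "snow": True},
    {"id": 2, "touchdown": True,  "yards": 30, "final_two_min": False, "snow": True},
    {"id": 3, "touchdown": True,  "yards": 5,  "final_two_min": True,  "snow": False},
    {"id": 4, "touchdown": False, "yards": 22, "final_two_min": True,  "snow": True},
]

# Three touchdowns, three 20+ yard plays, two overlapping -- but only one
# segment survives all four criteria, as in the figure's narrative.
matches = search_segments(
    segments,
    lambda s: s["touchdown"],
    lambda s: s["yards"] >= 20,
    lambda s: s["final_two_min"],
    lambda s: s["snow"],
)
```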
Example Embodiment of a Data Visualization System
FIG. 4 illustrates an embodiment of the data visualization system and various other systems. As shown in the illustrated embodiment, the data visualization system 146 can communicate with various systems such as, for example, a search system 136, a data analysis system 144, one or more databases (for example, database 124), and/or other systems. The data visualization system 146 can generate user interface data and transmit the user interface data to be displayed by a computing device (for example, the computing device described with reference to FIG. 9). Each of the systems may be part of a game application (such as, for example, a plug-in of the game application), a host application, a webpage, or a standalone application.
In the illustrated embodiment, the data visualization system 146 includes, for example, an event information module 250, a visual game information module 270, a video module 280, an interface control module 260, and so on. These modules can be used to display information received from the systems described herein.
Although the examples in FIG. 4 are described with reference to various systems and modules, these examples are not limiting. The system 200 may include more or fewer systems and/or modules. The data visualization system 146 may include another module that is used to display data generated by the data analysis system. In some implementations, one or more modules may be a part of another module.
The data visualization system 146 can communicate with the data analysis system 144 and display game metrics in a variety of formats such as, for example, graphs (for example, pie charts, heat maps, line graphs), tables, word descriptions, and so on. The data visualization system 146 may display such data analyses using one or more modules such as the debug information module 240, the event information module 250, or the like.
In some embodiments, the data visualization system 146 may include one or more plug-ins for rendering game metric analyses in certain formats. For example, the data visualization system 146 may have a plug-in that allows the data visualization system 146 to display game data using a heat map.
The data visualization system 146 can include a user interface control module 260, which allows the user to search and filter data. The user interface control module 260 can include a search tool allowing a user to input search queries and receive data based on the search queries. For example, the user may provide search terms in a search query. The interface control module 260 can communicate an event ID to the search system 136. The data visualization system 146 can receive from the search system 136 data such as, for example, telemetry data and video data associated with the event ID. The data visualization system 146 can then display the video data in the video module 280 and the telemetry data in the debug information module 240.
The interface control module 260 can also include various options for generating search queries. The search system can be configured to include defined search terms that are associated with events within a game application. The defined search terms can be plain-English versions of events within the game application. The user can input the plain-English terms in lieu of a specific event ID. A filtering tool can allow a user to choose how the user would like the search results returned. For example, the user can select a type of segment that can be used to parse the events. The filtering tool can allow the user to select between one or more sets of data the user is interested in, such as data associated with different types of testing (such as manual or automated testing). The filtering tool may be applied to various types of data such as, for example, game metrics data, video data, telemetry data, or the like. The user interface control module 260 may display one or more options from which the user can choose for filtering. For example, the options may include timestamps, events, session IDs, segment types, and so on. The user interface control module 260 can also allow the user to directly input filtering criteria using the exact event IDs. In some embodiments, the search system can provide a system for generating scripted search queries. For example, a quality assurance program may have a defined battery of test results that need to be analyzed. The scripted search queries can be used to generate the searches associated with each of the requirements for the specific quality assurance tests, rather than having the user generate the entire search query each time.
In some embodiments, the user may input a query that causes the data visualization system 146 to receive data associated with multiple gameplay sessions. For example, the user may request a list of videos associated with a specific gameplay event (such as a specific boss fight) in the past two days. The data visualization system 146 may receive the session IDs and/or data associated with the list of videos requested by the user. The search system may identify a segment associated with each event that satisfies the search query.
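The plain-English search terms described above can be thought of as a lookup from user-facing terms to internal event IDs, with exact event IDs passed through unchanged. The mapping and names below are purely hypothetical.

```python
# Hypothetical mapping from plain-English search terms to internal event IDs;
# the patent describes such defined terms without specifying a format.
SEARCH_TERMS = {
    "touchdown": "EVT_TOUCHDOWN",
    "boss fight": "EVT_BOSS_FIGHT",
    "shot made": "SHOT_MADE",
}


def resolve_query(term):
    """Translate a plain-English term to an event ID; pass unknown IDs through."""
    return SEARCH_TERMS.get(term.lower(), term)


event_id = resolve_query("Boss Fight")
```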
The data visualization system 146 can display the video data acquired by the video acquisition system 138. The data visualization system 146 can display the video data in the video module 280. As described with reference to FIG. 1, the video may include one or more timestamps associated with certain events. The user may choose to watch the video beginning at a timestamp associated with one of the events. The data visualization system 146 may display raw telemetry data acquired by the telemetry data acquisition system 134. In some embodiments, the data visualization system 146 may display telemetry data processed by other systems such as, for example, the data analysis system 144.
Example Embodiments of User Interfaces
FIGS. 5, 6, and 7 are examples of user interfaces implementing outputs of the data visualization system. The user interface may be a webpage or an application configured to be output on a computing device, such as the user computing system 103. Each of the user interfaces shown includes one or more user interface controls that can be selected by a user, for example, using a browser or other application software. The user interface controls shown are merely illustrative examples and can be varied in other embodiments. For instance, buttons, dropdown boxes, select boxes, text boxes, check boxes, slider controls, and other user interface controls shown may be substituted with other types of user interface controls that provide the same or similar functionality. Further, user interface controls may be combined or divided into other sets of user interface controls such that similar functionality or the same functionality may be provided with very different looking user interfaces. Moreover, each of the user interface controls may be selected by a user using one or more input options, such as a mouse, touch screen input, or keyboard input, among other user interface input options.
With specific reference to FIG. 5, the user interface 500 provides a plurality of search filters 502. The search filters can be populated with one or more search terms 504. The search terms can be automatically identified based on predefined search terms, such as plain-English search terms or script-based search queries. The user can combine search terms as desired. In some embodiments, the search filters can allow a user to input any term or event ID. The system may include one or more filters 506 that provide functionality for filtering the type of data searched, such as, for example, type of segment, test configuration, test type, types of games, types of events, types of computing device (for example, Xbox vs. PC), in combination, or the like.
The interface can include a list of search results 508. Each search result can identify additional information associated with the search result. In some embodiments, the system can be configured to provide a popup 510 with additional details associated with a result when an input device selects or hovers over one of the search results. When a search result is selected, the video associated with the result can populate a video player portion 516 of the interface. The video player can include playback controls 512 for viewing the video, such as, for example, play, pause, playback speed, and the like.
The event module 514 can include additional information associated with the gameplay session of the selected search result. For example, as shown in FIG. 5, the event module 514 may include the date, time, type, and description of the event. The user can obtain more details about an event by clicking on the event in the event module.
The event module can include a listing of video segments from the gameplay session. The timeline 372 can allow the user to view the gameplay session at different points in time. Advantageously, when the user clicks on a video segment, the data visualization system may begin to play the recorded video at the starting point of the video segment. At the same time, the data visualization system may also show changes associated with event data from the newly selected video segment. In some embodiments, the user may click on a different video segment from the same gameplay session and the data visualization system can start playing the newly selected video segment. The user may also click on other video segments of the gameplay session.
Example Embodiments of Gameplay Data Analysis
FIGS. 6A and 6B illustrate example embodiments of user interfaces illustrating an analysis of gameplay metrics of aggregated telemetric data. The user interfaces display aggregated telemetry data information associated with a plurality of telemetric events. The data visualization system 146 may display the user interfaces when a user requests additional data metrics regarding telemetric events. Analysis of the gameplay data is further described below with reference to FIGS. 7, 8, and 9.
With reference to FIG. 6A, the user interface 600 provides an example of an output of an analysis of defined gameplay statistics 602 of aggregated telemetric data. The illustrated example provides an analysis of the gameplay statistic associated with the final score of Cleveland playing Orlando. The user can select and/or define any gameplay statistic associated with the collected telemetric data. The displayed output for a gameplay statistic can be based on recorded metrics (for example, final score, number of points, number of passes, and the like), derived metrics (for example, average score, average time between scores, median number of points scored, and the like), and/or other types of metrics that are associated with the gameplay sessions.
In the illustrated embodiment, the user can select one or more gameplay statistics 602 using one or more user inputs, such as drop-down boxes or text entry boxes. An output representing one or more data sets and/or derived game metrics 604 can be generated (for example, average, max value, min value, standard deviation, and the like). The user may customize how to display the output within the user interface.
The user interface 600 can generate a visual representation 606 of the data metrics. One or more of the generated datasets 608 can be displayed. In some embodiments, the user can provide target ranges, as illustrated by the shaded portion 610. The data visualization system 146 can display game metrics in a variety of formats such as, for example, graphs (for example, pie charts, heat maps, line graphs), tables, word descriptions, and so on.
With reference to FIG. 6B, the user interface 600 provides another example of an output of an analysis of gameplay metrics 602 of aggregated telemetric data. In this example, derived data metrics 604 associated with plays in a game are being displayed. The user can select any game metric associated with the collected telemetric data. In the illustrated embodiment, the data is displayed on a timeline 612 at defined time intervals. The data analysis provides aggregated data points associated with each interval. For example, the data for each dataset 608 can be aggregated over a week. The aggregated data can then be parsed and divided into subsets of data representing each day of the week and displayed on a timeline.
The data analysis system 144 can analyze data of any defined time period (for example, minutes, hours, days, weeks, and the like) and/or any defined segment (for example, per play, per quarter, per game level, and the like). The data analysis system can provide for the analysis of any number of data sets. In the illustrated example, eleven data sets are provided for comparison. The user may customize which metrics are analyzed, the type of analysis, and how the results are displayed within the user interface. The user can select to display the game metrics over any defined time period. For example, the user could display data before and after a game application patch in order to visualize how the game metrics have changed based on the patch.
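The interval-based aggregation described above, for example bucketing a week of data into per-day subsets, can be sketched as follows. The fixed-width bucketing is an illustrative stand-in for whatever calendar logic an implementation would actually use.

```python
from collections import defaultdict
from statistics import mean


def aggregate_by_interval(points, interval):
    """Average the data points that fall into each fixed-width time interval.

    `points` is a list of (timestamp, value) pairs; names are illustrative.
    """
    buckets = defaultdict(list)
    for t, value in points:
        # Integer division assigns each timestamp to its interval bucket.
        buckets[int(t // interval)].append(value)
    return {bucket: mean(values) for bucket, values in sorted(buckets.items())}


# Scores recorded over two "days" (86400-second intervals).
points = [(3600, 90), (7200, 110), (90000, 130), (100000, 150)]
daily = aggregate_by_interval(points, interval=86400)
```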
FIG. 7provides additional detail associated with analysis of aggregated telemetric by the data analysis system144. The data analysis system144can be configured to create gameplay statistics702associated with gameplay events recorded during gameplay sessions. The generation of a gameplay statistic702is described with further reference toFIG. 8.
With additional reference toFIG. 8, an example embodiment of generating a gameplay statistic702is illustrated. The gameplay statistic702can be based on one or more gameplay events that trigger within the game application. A gameplay event or a grouping of gameplay events can be associated with a specific a gameplay event filter704. The gameplay event filter704can be used to identify the target data set associated with a gameplay event or data sets associated with a grouping of gameplay events, for example, a shot made in a basketball game, an attack executed, speaking with an in-game character, or any other event. The gameplay event filters704include one or more target parameters706associated with the gameplay events. The target parameters706can include additional information associated with the gameplay event. For example, in a basketball game, the target parameters may include the point value associated with a basket (for example, one point, two points, or three points), the distance from the basket where the shot was taken, or another parameter value associated with the event. The gameplay statistic can return the output based on the identified target parameter. In some embodiments, if no target parameter is identified, the data analysis can return a count for each time the event occurred.
In the illustrated example, the gameplay statistic 702 being defined is “Score,” the name of which can be defined by the user. A gameplay event 704 is identified using the specific gameplay event identifier used within the game application. The gameplay event identifier is used by the data analysis system 144 to access the dataset associated with the gameplay event. In this embodiment, the “SHOT_MADE” gameplay event is associated with a shot being made within a basketball game. The column 716 identifies all the events that triggered during the defined segment. A target parameter 706 is identified using the specific target parameter identifier used within the game application. The target parameter is used by the data analysis system 144 to access the parameter within the data set. In this embodiment, the “VALU” target parameter indicates the point value associated with the basket.
With reference now to FIG. 7, the data analysis process is further described. The interface can provide for the selection of one of the defined gameplay statistics 702 and the type of analysis 710 to perform on the selected gameplay statistic. The type of analysis can also be referred to as a derived gameplay metric 710. The derived gameplay metrics may include various calculations, such as, for example, average, maximum value, minimum value, sum, count, standard deviation, and the like. The gameplay segment 712 identifies the type of gameplay segment used for calculating the derived gameplay metric 710. A segment filter 714 can filter the segments to a subset of the total data set of segments. For example, the filter may be used to identify only segments where a first team (for example, Cleveland) played a second team (for example, Orlando). A color identifier 716 can identify an output color associated with the data set.
Columns 718, 720, and 722 provide an illustrative example of processing data associated with a gameplay statistic 702. In the illustrated example, the gameplay statistic 702 is “Score” and the segment is a game, which has beginning and end points illustrated by the arrows. The event column 718 identifies each event that occurred within the segment. The identified events correspond to the gameplay event identifier 704 of the gameplay statistic 702. The value column 720 identifies the value associated with each event. The identified value corresponds to the value associated with the target parameter 706 of the gameplay statistic 702. The value will depend on the target parameter and the gameplay event. In some embodiments, the output may not be a numeric output. The average column 722 calculates a value for the segment based on the identified metric 710. Each identified segment can have a corresponding calculated metric value. In some instances, segments where none of the identified events occur are removed from the calculation. An overall value 724 may be computed for all segments.
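The per-segment calculation described above can be sketched as follows. The segment contents, the choice of "average" as the derived metric, and the function name are invented for illustration; the skipping of segments with no matching events follows the text.

```python
# Illustrative per-segment computation of a derived gameplay metric
# (an "average" of the "Score" statistic); values are invented.
def derived_metric(segments):
    """segments: list of lists of matched event values (one list per
    segment).  Returns per-segment averages plus an overall average,
    skipping segments in which no identified events occurred."""
    per_segment = []
    for values in segments:
        if not values:          # no identified events: removed from calculation
            continue
        per_segment.append(sum(values) / len(values))
    overall = sum(per_segment) / len(per_segment) if per_segment else None
    return per_segment, overall

# Three games; the second game contained no "SHOT_MADE" events.
segs = [[2, 3, 2], [], [3, 3]]
print(derived_metric(segs))
```

Each remaining segment yields one calculated metric value, and the overall value aggregates across segments, as with value 724 above.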
FIG. 9 illustrates an example embodiment of performing calculations using the aggregated data from gameplay events. The MPS 730 can be used to perform post-collection calculations on the event data using a defined equation 732. The defined equation 732 can be used to calculate an output. The formula can use mathematical operators and one or more gameplay event variables 734. The gameplay event variables 734 can be gameplay statistics (such as gameplay statistics 702), gameplay events (such as gameplay events 704), derived gameplay metrics, and/or other types of numerical variables that can be used in the equation 732. The equation can be calculated and output in the same manner as other gameplay statistics are calculated. The equation can include more or fewer variables than in the illustrated example.
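A minimal sketch of evaluating a defined equation over gameplay event variables follows. The shooting-percentage formula and variable names are assumptions, not from the patent, and the use of `eval()` is only for brevity; a production system would parse the expression safely.

```python
# Hedged sketch of a post-collection calculation (equation 732) over
# gameplay event variables 734; formula and names are hypothetical.
def evaluate(equation, variables):
    """Evaluate a user-defined equation whose free names are gameplay
    statistics or derived metrics.  Builtins are stripped, but eval()
    here is a sketch, not a hardened expression evaluator."""
    return eval(equation, {"__builtins__": {}}, variables)

stats = {"SHOTS_MADE": 40, "SHOTS_ATTEMPTED": 90}
print(evaluate("SHOTS_MADE / SHOTS_ATTEMPTED * 100", stats))
```

The result can then be displayed and aggregated in the same manner as any other gameplay statistic.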
Example Process of Capturing Telemetry Data and Video Data
FIG. 10 illustrates an embodiment of a flowchart for a method of associating telemetry data with video data. The process 1000 can be implemented by any system that can decode and stream content within a game environment during runtime of a game application. For example, the process 1000, in whole or in part, can be implemented by a game application 104, a gameplay acquisition system 130, a user computing system 103, a player computing system 102, an interactive computing system 120, or another application module. Although any number of systems, in whole or in part, can implement the process 1000, to simplify discussion, the process 1000 will be described with respect to the interactive computing system 120 and particular systems of the gameplay acquisition system 130.
In process 1000, at block 1010, the gameplay acquisition system 130 can communicate with the player computing system 102 in order to initiate a gameplay session of the game application 104. During automated testing, the game automation system 142 can communicate with the player computing system 102. The game automation system 142 can initiate the gameplay session in accordance with a defined automation schedule, which can control the parameters used for the automated gameplay session, such as, for example, the type of agent, the duration of the gameplay session, the level within the game application, the number of virtual agents within the game, and any other characteristics associated with an automated gameplay session. During manual testing, the coordination server 132 can provide an indication to a user of the player computing system 102 to initiate a gameplay session. The indication may also include indications of the parameters used for the gameplay session.
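An automation schedule entry of the kind described above can be sketched as a simple parameter record. The field names mirror the parameters listed in the text (agent type, session duration, game level, number of virtual agents), but the structure and values are invented for illustration.

```python
# Hypothetical automation schedule controlling automated gameplay
# sessions; field names follow the parameters listed in the text.
automation_schedule = [
    {
        "agent_type": "exploratory_ai",   # type of agent controlling play
        "duration_minutes": 30,           # duration of the gameplay session
        "level": "level_3",               # level within the game application
        "num_virtual_agents": 4,          # virtual agents within the game
    },
]

def next_session_params(schedule):
    """Return the parameters for the next automated gameplay session."""
    return schedule[0]

print(next_session_params(automation_schedule))
```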
At block 1020, the gameplay acquisition system 130 can initiate data capture of the gameplay session during runtime of the game application on the player computing system 102. In some embodiments, the coordination server 132 can provide instructions to the video acquisition system to initiate capture of the video stream, and provide instructions to the telemetry data acquisition system 134 to initiate capture of the telemetry data stream. In some embodiments, the video acquisition system and the telemetry data acquisition system are the same system. In some embodiments, the coordination server can associate and/or instantiate one or more virtual machines in order to capture the data from the gameplay session. For example, a virtual machine may be associated with the video acquisition system 138 and a virtual machine may be associated with the telemetry data acquisition system 134.
At block 1030, a session ID is generated for the gameplay session. The session ID may be generated by the interactive computing system 120, the coordination server 132, or the player computing system 102. In certain embodiments, the session ID may include information unique to the player computing system 102, such as the IP address associated with the player computing system 102.
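One way a session ID embedding system-unique information could look is sketched below. The text mentions only that the ID may include information unique to the player computing system, such as its IP address; the concrete format here (IP plus a random UUID) is an invented example.

```python
# Hedged sketch of session ID generation for block 1030; the
# "ip-uuid" format is hypothetical.
import uuid

def make_session_id(player_ip):
    """Combine a value unique to the player computing system (its IP
    address) with a random UUID so IDs are unique across sessions."""
    return f"{player_ip}-{uuid.uuid4()}"

sid = make_session_id("10.0.0.7")
print(sid)
```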
At block 1040, the gameplay acquisition system 130 can receive data associated with the gameplay session. The data associated with the gameplay session may include video data, telemetry data, system data, and/or other data associated with the execution of the game application during the gameplay session. In some embodiments, separate systems can individually communicate with the game application and acquire specific data associated with the gameplay session. For example, the video acquisition system 138 can acquire video data, the telemetry data acquisition system 134 can acquire telemetry data, and/or other systems can be responsible for acquiring different types of data. Each system can store its respective data in data stores associated with the respective system. In some embodiments, the telemetry data acquisition system can record a defined event that is triggered when defined criteria have been satisfied. For example, an event may be scripted to trigger only after the movement speed of a character within the game increases above a threshold.
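The scripted trigger in the movement-speed example can be sketched as a threshold crossing over sampled telemetry. The sample format, function name, and `"SPEED_EXCEEDED"` identifier are assumptions for illustration.

```python
# Hedged sketch of a scripted telemetric event: fires each time a
# character's movement speed rises above a threshold from below.
def scripted_speed_events(samples, threshold):
    """samples: list of (timestamp, speed) tuples in time order.
    Emit one event per upward crossing of the threshold."""
    events, above = [], False
    for ts, speed in samples:
        if speed > threshold and not above:
            events.append({"id": "SPEED_EXCEEDED", "timestamp": ts,
                           "speed": speed})
        above = speed > threshold
    return events

samples = [(0.0, 3.0), (0.5, 6.5), (1.0, 7.0), (1.5, 2.0), (2.0, 6.1)]
print(scripted_speed_events(samples, threshold=6.0))
```

Triggering only on the upward crossing (rather than every sample above the threshold) keeps the event from firing continuously while the character stays fast.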
At block 1050, the interactive computing system 120 can associate the session ID with the video data of the gameplay session. For example, the video acquisition system 138 can associate the session ID with the video data received from the gameplay session. The video data can include timestamps for the gameplay session. For example, timestamps may be generated for each frame of video data received during the gameplay session. The gameplay acquisition system 130 can associate the video data with the session ID. The gameplay acquisition system 130 may store the recorded video in a data store, such as data store 124.
At block 1060, the interactive computing system 120 can associate an event in the gameplay session with the session ID of the gameplay session and one or more timestamps. The telemetry data acquisition system 134 can associate the session ID with the telemetry data received from the gameplay session. An event may be associated with one or more timestamps. The session ID and timestamp information associated with an event can be used to retrieve video data associated with the gameplay event. The gameplay acquisition system 130 can associate the received telemetry data with the session ID.
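The session-ID join between telemetry and video described above can be sketched as a simple lookup. The store layout, field names, and offset arithmetic are assumptions made for the example.

```python
# Illustrative join of an event to its video: the session ID locates
# the recording, and the event timestamp gives the seek position.
video_store = {
    "session-42": {"file": "session-42.mp4", "start": 100.0},
}

def locate_video(event):
    """Return (video file, seek offset in seconds) for an event,
    using its session ID and timestamp."""
    video = video_store[event["session_id"]]
    return video["file"], event["timestamp"] - video["start"]

event = {"id": "SHOT_MADE", "session_id": "session-42",
         "timestamp": 163.5}
print(locate_video(event))  # ('session-42.mp4', 63.5)
```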
Example Process of Gameplay Data Analysis
FIG. 11 illustrates a flowchart of an embodiment of a gameplay acquisition system 130 process. The process 1100 can be implemented by any system that can decode and stream content within a game environment during runtime of a game application. For example, the process 1100, in whole or in part, can be implemented by a game application 104, a gameplay acquisition system 130, a user computing system 103, a player computing system 102, an interactive computing system 120, or another application module. Although any number of systems, in whole or in part, can implement the process 1100, to simplify discussion, the process 1100 will be described with respect to the interactive computing system 120 and particular systems of the gameplay acquisition system 130.
At block 1110 of process 1100, the interactive computing system 120 can receive data associated with a plurality of gameplay sessions of a game application. Each gameplay session can include a session ID. The data associated with each session ID can include video data, telemetry data, system data, and/or other data associated with the gameplay session.
At block 1120, the interactive computing system 120 can receive a request including search criteria associated with one or more telemetric events. The request may come from the user computing system 103 (shown in FIG. 1). The search criteria can specify complex searches in order to identify occurrences of events, characteristics associated with the events, and other event data associated with gameplay sessions. For example, a search query may request all missed 3-point shots that occurred in the last 3 seconds of a game. The search queries may identify events by one or more event identifiers described herein.
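The example query above (missed 3-point shots in the last 3 seconds of a game) can be sketched as an event identifier plus per-parameter predicates. The event shape, parameter names (`VALU`, `TIME_LEFT`), and `"SHOT_MISSED"` identifier are invented for illustration.

```python
# Minimal sketch of search criteria over telemetric events; names
# and event layout are hypothetical.
def search(events, event_id, **criteria):
    """Return events matching an event identifier and all predicate
    criteria (one predicate per target parameter)."""
    return [
        e for e in events
        if e["id"] == event_id
        and all(pred(e["params"].get(k)) for k, pred in criteria.items())
    ]

events = [
    {"id": "SHOT_MISSED", "params": {"VALU": 3, "TIME_LEFT": 1.2}},
    {"id": "SHOT_MISSED", "params": {"VALU": 2, "TIME_LEFT": 0.8}},
    {"id": "SHOT_MISSED", "params": {"VALU": 3, "TIME_LEFT": 30.0}},
]
hits = search(events, "SHOT_MISSED",
              VALU=lambda v: v == 3,        # 3-point shots only
              TIME_LEFT=lambda t: t <= 3.0)  # last 3 seconds of the game
print(hits)
```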
At block 1130, the interactive computing system 120 can identify each event that satisfies the search criteria. At block 1140, the interactive computing system 120 can determine a segment associated with each of the identified events. The segment associated with an event can be based on a user selection. A game application can include different types of segments. For example, in a football game, defined segments may include events, plays, and quarters. A segment can define a start time. A segment start timestamp can be used to identify the start time for replay of the video file. The telemetry data acquisition system and/or the search system can identify the closest segment that occurs prior to the triggering of an identified event.
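Finding the closest segment start prior to an event's trigger time can be sketched as a sorted-boundary lookup. The segment timestamps below are invented; only the "latest start at or before the event" rule comes from the text.

```python
# Sketch of locating the segment boundary from which video replay
# should begin for an identified event; timestamps are illustrative.
import bisect

def segment_before(segment_starts, event_ts):
    """segment_starts must be sorted ascending.  Return the latest
    segment start timestamp <= event_ts, or None if the event
    precedes every segment."""
    i = bisect.bisect_right(segment_starts, event_ts) - 1
    return segment_starts[i] if i >= 0 else None

starts = [0.0, 45.0, 90.0, 135.0]      # e.g. quarter start timestamps
print(segment_before(starts, 97.3))    # 90.0
```

Replay then begins at the returned segment start rather than mid-action, so the viewer sees the lead-up to the event.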
At block 1150, the interactive computing system 120 can generate instructions for display of data identifying each segment associated with the identified event(s). The interactive computing system 120 can provide each identified segment within a user interface. Each identified segment can include additional information associated with the segment, such as, for example, a session ID, a segment ID, an event ID, the duration of the segment, and other information associated with the segment. The instructions may be transmitted to the data visualization system 146 of the user computing system 103. The data may include telemetry data associated with the event, an event description, one or more search criteria, or the like.
At block 1160, the interactive computing system 120 can receive a selection of one of the identified segments. At block 1170, responsive to the selection of the segment, the interactive computing system 120 can locate video data associated with the gameplay session using the session ID. The interactive computing system 120 can generate instructions to display video data associated with the event. The instructions may be transmitted to the data visualization system 146 of the user computing system 103. The video data may include a complete video that includes what was previously recorded and stored by the video acquisition system. The data may include telemetry data associated with the event, an event description, one or more search criteria, or the like. The instructions may instruct the data visualization system 146 to play the recorded video from a timestamp where the event begins.
Overview of Computing Device
FIG. 12 illustrates an embodiment of computing device 10 according to the present disclosure. Other variations of the computing device 10 may be substituted for the examples explicitly presented herein, such as removing or adding components to the computing device 10. The computing device 10 may include a game device, a smart phone, a tablet, a personal computer, a laptop, a smart television, a car console display, a server, and the like. The computing device 10 may also be distributed across multiple geographical locations. For example, the computing device 10 may be a cluster of cloud-based servers.
As shown, the computing device 10 includes a processing unit 20 that interacts with other components of the computing device 10 and also with components external to the computing device 10. A game media reader 22 is included that communicates with game media 12. The game media reader 22 may be an optical disc reader capable of reading optical discs, such as CD-ROMs or DVDs, or any other type of reader that can receive and read data from game media 12. One or more of the computing devices may be used to implement one or more of the systems disclosed herein.
Computing device 10 may include a separate graphics processor 24. In some cases, the graphics processor 24 may be built into the processing unit 20. In some such cases, the graphics processor 24 may share Random Access Memory (RAM) with the processing unit 20. Alternatively, or in addition, the computing device 10 may include a discrete graphics processor 24 that is separate from the processing unit 20. In some such cases, the graphics processor 24 may have separate RAM from the processing unit 20. Computing device 10 might be a handheld game application device, a dedicated game console computing system, a general-purpose laptop or desktop computer, a smart phone, a tablet, a car console, or other suitable system.
Computing device 10 also includes various components for enabling input/output, such as an I/O 32, a user I/O 34, a display I/O 36, and a network I/O 38. I/O 32 interacts with storage element 40 and, through a device 42, removable storage media 44 in order to provide storage for computing device 10. Processing unit 20 can communicate through I/O 32 to store data, such as game state data and any shared data files. In addition to storage 40 and removable storage media 44, computing device 10 is also shown including ROM (Read-Only Memory) 46 and RAM 48. RAM 48 may be used for data that is accessed frequently, such as when a game is being played or fraud detection is performed.
User I/O 34 is used to send and receive commands between processing unit 20 and user devices, such as game controllers. In some embodiments, the user I/O 34 can include a touchscreen input. The touchscreen can be a capacitive touchscreen, a resistive touchscreen, or another type of touchscreen technology that is configured to receive user input through tactile inputs from the player. Display I/O 36 provides input/output functions that are used to display images from the game being played. Network I/O 38 is used for input/output functions for a network. Network I/O 38 may be used during execution of a game, such as when a game is being played online or being accessed online, and/or for application of fraud detection and/or generation of a fraud detection model.
Display output signals produced by display I/O 36 comprise signals for displaying visual content produced by computing device 10 on a display device, such as graphics, user interfaces, video, and/or other visual content. Computing device 10 may comprise one or more integrated displays configured to receive display output signals produced by display I/O 36. According to some embodiments, display output signals produced by display I/O 36 may also be output to one or more display devices external to computing device 10.
The computing device 10 can also include other features that may be used with a game, such as a clock 50, flash memory 52, and other components. An audio/video player 56 might also be used to play a video sequence, such as a movie. It should be understood that other components may be provided in computing device 10 and that a person skilled in the art will appreciate other variations of computing device 10.
Program code can be stored in ROM 46, RAM 48, or storage 40 (which might comprise a hard disk, other magnetic storage, optical storage, other non-volatile storage, or a combination or variation of these). Part of the program code can be stored in ROM that is programmable (ROM, PROM, EPROM, EEPROM, and so forth), part of the program code can be stored in storage 40, and/or on removable media such as game media 12 (which can be a CD-ROM, cartridge, memory chip, or the like, or obtained over a network or other electronic channel as needed). In general, program code can be found embodied in a tangible non-transitory signal-bearing medium.
Random access memory (RAM) 48 (and possibly other storage) is usable to store variables and other game and processor data as needed. RAM 48 is used and holds data that is generated during the execution of an application, and portions thereof might also be reserved for frame buffers, application state information, and/or other data needed or usable for interpreting user input and generating display outputs. Generally, RAM 48 is volatile storage, and data stored within RAM 48 may be lost when the computing device 10 is turned off or loses power.
As computing device 10 reads game media 12 and provides an application, information may be read from game media 12 and stored in a memory device, such as RAM 48. Additionally, data from storage 40, ROM 46, servers accessed via a network (not shown), or removable storage media 44 may be read and loaded into RAM 48. Although data is described as being found in RAM 48, it will be understood that data does not have to be stored in RAM 48 and may be stored in other memory accessible to processing unit 20 or distributed among several media, such as game media 12 and storage 40.
It is to be understood that not necessarily all objects or advantages may be achieved in accordance with any particular embodiment described herein. Thus, for example, those skilled in the art will recognize that certain embodiments may be configured to operate in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other objects or advantages as may be taught or suggested herein.
All of the processes described herein may be embodied in, and fully automated via, software code modules executed by a computing system that includes one or more computers or processors. The code modules may be stored in any type of non-transitory computer-readable medium or other computer storage device. Some or all of the methods may be embodied in specialized computer hardware.
Many other variations than those described herein will be apparent from this disclosure. For example, depending on the embodiment, certain acts, events, or functions of any of the algorithms described herein can be performed in a different sequence or can be added, merged, or left out altogether (for example, not all described acts or events are necessary for the practice of the algorithms). Moreover, in certain embodiments, acts or events can be performed concurrently, for example, through multi-threaded processing, interrupt processing, or multiple processors or processor cores or on other parallel architectures, rather than sequentially. In addition, different tasks or processes can be performed by different machines and/or computing systems that can function together.
The various illustrative logical blocks and modules described in connection with the embodiments disclosed herein can be implemented or performed by a machine, such as a processing unit or processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A processor can be a microprocessor, but in the alternative, the processor can be a controller, microcontroller, or state machine, combinations of the same, or the like. A processor can include electrical circuitry configured to process computer-executable instructions. In another embodiment, a processor includes an FPGA or other programmable device that performs logic operations without processing computer-executable instructions. A processor can also be implemented as a combination of computing devices, for example, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Although described herein primarily with respect to digital technology, a processor may also include primarily analog components. For example, some or all of the signal processing algorithms described herein may be implemented in analog circuitry or mixed analog and digital circuitry. A computing environment can include any type of computer system, including, but not limited to, a computer system based on a microprocessor, a mainframe computer, a digital signal processor, a portable computing device, a device controller, or a computational engine within an appliance, to name a few.
Conditional language such as, among others, “can,” “could,” “might” or “may,” unless specifically stated otherwise, are understood within the context as used in general to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.
Disjunctive language such as the phrase “at least one of X, Y, or Z,” unless specifically stated otherwise, is understood with the context as used in general to present that an item, term, etc., may be either X, Y, or Z, or any combination thereof (for example, X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain embodiments require at least one of X, at least one of Y, or at least one of Z to each be present.
Any process descriptions, elements or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or elements in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown, or discussed, including substantially concurrently or in reverse order, depending on the functionality involved as would be understood by those skilled in the art.
Unless otherwise explicitly stated, articles such as “a” or “an” should generally be interpreted to include one or more described items. Accordingly, phrases such as “a device configured to” are intended to include one or more recited devices. Such one or more recited devices can also be collectively configured to carry out the stated recitations. For example, “a processor configured to carry out recitations A, B and C” can include a first processor configured to carry out recitation A working in conjunction with a second processor configured to carry out recitations B and C.
It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure.
Claims
- A system comprising: at least one data store comprising gameplay data associated with a game application;and a server computing system in electronic communication with the at least one data store and configured to execute computer-readable instructions that configure the server computing system to: provide instructions to a first computing system to configure an automated gameplay session of the game application in accordance with automated testing parameters, wherein the automated gameplay session is configured to use at least one artificial intelligence (AI) controlled agent, wherein the at least one AI controlled agent is configured to control execution of the game application on the first computing system during the automated gameplay session;initiate the automated gameplay session of the game application on the first computing system;initiate data acquisition of the gameplay session of the game application executing on the first computing system;generate a session identifier for the gameplay session;receive video data associated with the gameplay session;associate the video data with the session identifier of the gameplay session;receive telemetric events triggered during the gameplay session, the telemetric events comprising non-scripted telemetric events and scripted telemetric events, the scripted telemetric events triggered during the gameplay session based at least in part on a scripted event trigger, the scripted event trigger configured to trigger when characteristics of a virtual entity satisfy a defined threshold during the gameplay session, wherein each telemetric event is associated with an event timestamp based on when the telemetric event triggered during the gameplay session, wherein each telemetric event is associated with and occurs within at least a first gameplay segment type and a second gameplay segment type, each of the first gameplay segment type and the second gameplay segment type has a different gameplay segment length during the 
gameplay session, wherein the second gameplay segment type is nested within the first gameplay segment type such that a plurality of gameplay segments of the second gameplay segment type occur sequentially within the segment length of the first gameplay segment type, wherein a plurality of gameplay segments of the first gameplay segment type occur during the gameplay session, each gameplay segment having a segment start timestamp and a segment end timestamp distinct from the event timestamp, wherein the segment start timestamp is used as a start time for viewing video data associated with an event occurring within the gameplay segment;associate the telemetric events with the session identifier of the gameplay session;and output the video data and telemetric events for storage within the at least one data store.
- The system of claim 1 , wherein the computer-readable instructions further configure the server computing system to initiate execution of the automated gameplay session of the game application on the first computing system based on a defined automation schedule, wherein the defined automation schedule defines the automated testing parameters used for configuring the automated gameplay session, wherein the automated testing parameters include at least one of a number and type of AI controlled agents controlling execution of the game application, a duration of the automated gameplay session, or a game level within the game application.
- The system of claim 1 , wherein the computer-readable instructions further configure the one or more AI controlled agents to control operation of the game application during the automated gameplay session by at least one of interfacing with an application programming interface (API) of the game application or simulating input actions on a peripheral input device of the first computing system.
- The system of claim 1 , wherein the at least one data store comprises a first data store and a second data store, wherein the video data is stored in the first data store and the telemetric events are stored in the second data store.
- The system of claim 4 , wherein the video data associated with the gameplay session is independent of the telemetric events associated with the gameplay session.
- The system of claim 1 , wherein the video data further comprises audio data associated with the gameplay session.
- The system of claim 1 , wherein the event timestamp is independent of a start time of the gameplay session.
- The system of claim 1 , wherein the server computing system is further configured to encode the video data into a standard file format when outputting the video data.
- The system of claim 1 , wherein the computer-readable instructions further configure the server computing system to provide instructions to one or more virtual machines to communicate with the first computing system in order to acquire data during the gameplay session.
- The system of claim 1 , wherein the computer-readable instructions further configure the server computing system to aggregate the telemetric events associated with the gameplay session with telemetric events associated with other gameplay sessions, and provide at least a portion of the telemetric events to a second computing system in response to a search query associated with the game application.
- A computer-implemented method comprising: under control of a computer system comprising computer hardware, the computer system configured with computer executable instructions: providing instructions to a first computing system to configure an automated gameplay session of the game application in accordance with automated testing parameters, wherein the automated gameplay session is configured to use at least one artificial intelligence (AI) controlled agent, wherein the at least one AI controlled agent is configured to control execution of the game application on the first computing system during the automated gameplay session; initiating the automated gameplay session of the game application on the first computing system; initiating data acquisition of the gameplay session of the game application executing on the first computing system; generating a session identifier for the gameplay session; receiving video data associated with the gameplay session; associating the video data with the session identifier of the gameplay session; receiving telemetric events triggered during the gameplay session, the telemetric events comprising non-scripted telemetric events and scripted telemetric events, the scripted telemetric events triggered during the gameplay session based at least in part on a scripted event trigger, the scripted event trigger configured to trigger when characteristics of a virtual entity satisfy a defined threshold during the gameplay session, wherein each telemetric event is associated with an event timestamp based on when the telemetric event triggered during the gameplay session, wherein each telemetric event is associated with and occurs within at least a first gameplay segment type and a second gameplay segment type, each of the first gameplay segment type and the second gameplay segment type has a different gameplay segment length during the gameplay session, wherein the second gameplay segment type is nested within the first gameplay segment type such that a plurality of gameplay segments of the second gameplay segment type occur sequentially within the segment length of the first gameplay segment type, wherein a plurality of the first type of gameplay segments occur during the gameplay session, each gameplay segment having a segment start timestamp and a segment end timestamp distinct from the event timestamp, wherein the segment start timestamp is used as a start time for viewing video data associated with an event occurring within the gameplay segment; associating the telemetric events with the session identifier of the gameplay session; and outputting the video data and telemetric events for storage within at least one data store.
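The scripted event trigger in the method claim, which fires when a characteristic of a virtual entity satisfies a defined threshold and stamps the resulting event with the time it triggered, can be sketched as follows. The class name, the dictionary-based entity, and the "health below threshold" scenario are illustrative assumptions, not details from the patent.

```python
class ScriptedEventTrigger:
    """Fires a scripted telemetric event when a characteristic of a
    virtual entity (e.g. health) satisfies a defined threshold."""

    def __init__(self, characteristic: str, threshold: float, session_id: str):
        self.characteristic = characteristic
        self.threshold = threshold
        self.session_id = session_id  # ties events to the gameplay session
        self.fired: list[dict] = []

    def check(self, entity: dict, game_time: float) -> None:
        """Evaluate the trigger against the entity's current state."""
        value = entity.get(self.characteristic)
        if value is not None and value <= self.threshold:
            # Each telemetric event carries an event timestamp based on
            # when it triggered during the gameplay session.
            self.fired.append({
                "session_id": self.session_id,
                "event": f"{self.characteristic}_at_or_below_{self.threshold}",
                "timestamp": game_time,
            })
```

In an automated session, `check` would be invoked by the test harness as the AI-controlled agent plays, so scripted events accumulate alongside the non-scripted telemetry the game emits on its own.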
- The method of claim 11 further comprising initiating execution of the automated gameplay session of the game application on the first computing system based on a defined automation schedule, wherein the defined automation schedule defines the automated testing parameters used for configuring the automated gameplay session.
- The method of claim 11 further comprising providing instructions to one or more virtual machines to communicate with the first computing system in order to acquire data during the gameplay session.
- The method of claim 11 , wherein each telemetric event is triggered based, at least in part, on a defined triggering criterion.
- The method of claim 11 , wherein outputting the video data further comprises encoding the video data into a standard file format.
- The method of claim 11 further comprising aggregating the telemetric events associated with the gameplay session with telemetric events associated with other gameplay sessions, and providing at least a portion of the telemetric events to a second computing system in response to a search query associated with the game application.
- A non-transitory computer-readable storage medium having stored thereon computer-readable instructions that, when executed, configure a computing system to: provide instructions to a first computing system to configure an automated gameplay session of the game application in accordance with automated testing parameters, wherein the automated gameplay session is configured to use at least one artificial intelligence (AI) controlled agent, wherein the at least one AI controlled agent is configured to control execution of the game application on the first computing system during the automated gameplay session; initiate the automated gameplay session of the game application on the first computing system; initiate data acquisition of the gameplay session of the game application executing on the first computing system; generate a session identifier for the gameplay session; receive video data associated with the gameplay session; associate the video data with the session identifier of the gameplay session; receive telemetric events triggered during the gameplay session, the telemetric events comprising non-scripted telemetric events and scripted telemetric events, the scripted telemetric events triggered during the gameplay session based at least in part on a scripted event trigger, the scripted event trigger configured to trigger when characteristics of a virtual entity satisfy a defined threshold during the gameplay session, wherein each telemetric event is associated with an event timestamp based on when the telemetric event triggered during the gameplay session, wherein each telemetric event is associated with and occurs within at least a first gameplay segment type and a second gameplay segment type, each of the first gameplay segment type and the second gameplay segment type has a different gameplay segment length during the gameplay session, wherein the second gameplay segment type is nested within the first gameplay segment type such that a plurality of gameplay segments of the second gameplay segment type occur sequentially within the segment length of the first gameplay segment type, wherein a plurality of the first type of gameplay segments occur during the gameplay session, each gameplay segment having a segment start timestamp and a segment end timestamp distinct from the event timestamp, wherein the segment start timestamp is used as a start time for viewing video data associated with an event occurring within the gameplay segment; associate the telemetric events with the session identifier of the gameplay session; and output the video data and telemetric events for storage within at least one data store.
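The nested-segment limitation shared by the independent claims, where second-type segments occur sequentially within a first-type segment and the enclosing segment's start timestamp serves as the start time for viewing the associated video, can be sketched as below. The "match"/"round" segment kinds and the helper name are hypothetical examples of first and second gameplay segment types.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Segment:
    kind: str     # e.g. "match" (first type) or "round" (second type)
    start: float  # segment start timestamp
    end: float    # segment end timestamp, distinct from any event timestamp

# One first-type segment with second-type segments nested sequentially
# within its segment length.
match = Segment("match", 0.0, 300.0)
rounds = [Segment("round", 0.0, 100.0),
          Segment("round", 100.0, 200.0),
          Segment("round", 200.0, 300.0)]

def video_start_for_event(event_ts: float,
                          segments: list[Segment]) -> Optional[float]:
    """Return the start timestamp of the segment enclosing the event,
    used as the start time for viewing video data associated with it."""
    for seg in segments:
        if seg.start <= event_ts < seg.end:
            return seg.start
    return None
```

Seeking to the segment start rather than the event timestamp gives a reviewer the lead-up context (the whole round, or the whole match) instead of dropping them into the recording at the instant the event fired.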
- The non-transitory computer-readable storage medium of claim 17 , wherein the computer-readable instructions further configure the computing system to aggregate the telemetric events associated with the gameplay session with telemetric events associated with other gameplay sessions, and provide at least a portion of the telemetric events to a second computing system in response to a search query associated with the game application.
- The non-transitory computer-readable storage medium of claim 17 , wherein the automated testing parameters include a number and type of AI controlled agents controlling execution of the game application.
- The non-transitory computer-readable storage medium of claim 17 , wherein the session identifier is automatically generated for a gameplay session.