U.S. Pat. No. 10,949,325

AUTOMATED CROSS-SESSION VIDEO GAME TESTING

Assignee: ELECTRONIC ARTS INC.

Issue Date: August 18, 2017

Illustrative Figure

Abstract

Embodiments disclosed herein include a system that is capable of processing test data across multiple sessions of a video game. In some cases, the tests are performed over multiple video games that share a game engine. The generated test data may be analyzed as the test is being performed, and key performance indicators may be extracted from the test data, reducing the test data by at least an order of magnitude. Further, the extracted key performance indicators are used to automatically conduct further testing, such as regression testing, based on an analysis of the key performance indicators with respect to trend data generated from prior tests of the video game or of a shared game engine used by multiple video games.

Description

DETAILED DESCRIPTION OF EMBODIMENTS

Introduction

Testing large applications, such as modern video games, can be complex and time-consuming. Further, determining why a video game may not be executing efficiently can be challenging. In some cases, it can be challenging to identify that a video game is not executing efficiently. For example, if the video game is executing without error, it may be difficult or impossible to determine that the video game is not executing efficiently. A video game may not be executing efficiently if it uses more computing resources to execute than may be necessary to achieve similar or the same results. It can be particularly difficult to realize that a video game is not running efficiently when the computing resources, such as RAM, available to execute the video game exceed the computing resources utilized by the video game during its execution. However, as different users may have different computing resources available or as some users may desire to execute a greater number of applications at least partially in parallel, it is desirable to optimize the code of a video game to execute as efficiently as possible, or at least within a particular target efficiency.

One way to determine whether a video game is executing efficiently or has been programmed to execute efficiently is to compare execution of the video game across multiple sessions. For example, during testing of the video game, multiple iterations of the code of the video game can be compared to determine whether the video game is running efficiently, or whether the efficiency of the video game has increased or decreased as the code of the video game is changed during development. Comparing iterations of the code of the video game may include comparing performance indicators, such as RAM or CPU cycles utilized, during execution of the iterations of the code of the video game. In some cases, the performance of different video games that utilize the same game engine can be compared to determine whether a particular video game is efficiently utilizing the game engine.

The execution of an instance of a video game can be referred to as a “video game session” or a “session.” It can be challenging, and in some cases impossible, to compare multiple sessions of a video game, or sessions of different video games, because the amount of data produced during one session can be very large. Typically, the test data may include all of the state information of the video game and all of the measured performance data for the host system at each state of the video game. This test data can often be larger than the video game itself. For example, one ten-minute session can produce 60 gigabytes or more of test data relating to the execution of the video game and its performance. Thus, both storing the performance data generated during multiple sessions of a video game and comparing the performance data across multiple sessions can be impracticable if not impossible. This problem is exacerbated with test sessions that are even longer than ten minutes and thus produce even more test data.

Embodiments disclosed herein present a method and system that is capable of processing test data across multiple sessions of a video game. The methods and system herein can analyze a byte stream of test data as the test data is being generated. Further, data objects associated with key performance indicators can be extracted, enabling a significant reduction in the size of the data to be processed. For example, in some cases it is possible to reduce 60 GB of test data to 11 MB. The extracted performance data can be aggregated and stored across multiple sessions of the video game, enabling generation of a performance trend. Further, performance data for a particular test session can be compared with the performance trend to facilitate evaluating the code corresponding to the particular test session.

Advantageously, by processing the test data across multiple sessions of the video game, it is possible to determine performance trends for the video game and to determine whether a particular build of the code for the video game satisfies the trend. Further, embodiments disclosed herein can compare test data for sessions of different video games that share a game engine to determine whether a particular video game is utilizing the game engine as efficiently as other video games built or developed using the same game engine.

In certain embodiments, an automated testing process can be performed. Based at least in part on the evaluation of the performance data, a test system can modify code associated with the video game or a test environment of the video game. A test session can be initiated using the modified code or test environment to obtain performance data. This performance data can be compared to prior performance data or to trend data. The result of the comparison can be used to repeat the automated process enabling automated and efficient testing of the video game code. This automated testing process in some cases enables more efficient testing than a manual testing process. Further, in some cases, machine learning algorithms may be used to facilitate performance of the automated testing process. Using parameter functions developed using a machine learning process, the selection of tests to run and modifications to the testing environment can be automated.
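The comparison against trend data described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the function name, the KPI names, and the two-sigma threshold are all assumptions made for the example.

```python
# Hypothetical sketch of the automated regression check: compare a new
# session's KPI values against trend data aggregated from prior sessions,
# and flag KPIs that deviate enough to warrant further testing.
from statistics import mean, stdev

def kpi_regressions(trend, session, sigmas=2.0):
    """Return KPI names whose session value deviates from the historical
    trend by more than `sigmas` standard deviations."""
    flagged = []
    for name, history in trend.items():
        if name not in session or len(history) < 2:
            continue
        mu, sd = mean(history), stdev(history)
        if abs(session[name] - mu) > sigmas * max(sd, 1e-9):
            flagged.append(name)
    return flagged

# Trend data from prior test sessions (one value per session).
trend = {
    "frame_time_ms": [16.6, 16.7, 16.5, 16.8],
    "ram_mb": [512, 520, 515, 518],
}

# A new session whose frame time has regressed noticeably.
session = {"frame_time_ms": 24.0, "ram_mb": 517}

regressed = kpi_regressions(trend, session)
```

A flagged KPI could then feed back into the loop described above, e.g. by triggering a targeted regression test on the subsystem that produces that KPI.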

To simplify discussion, the present disclosure is primarily described with respect to a video game. However, the present disclosure is not limited as such and may be applied to other types of applications. For example, embodiments disclosed herein may be applied to educational applications (for example, applications that help users learn a new language) or other applications where large amounts of data may be produced during testing of the application.

Example Networked Computing Environment

FIG. 1 illustrates an embodiment of a networked computing environment 100 that can implement one or more embodiments of the video game test system and a key performance indicator analysis system. The networked computing environment 100 includes a user computing system 110 that can communicate with an interactive computing environment 101 via a network 104. Further, the networked computing environment 100 may include an application host system 108 that can communicate with the interactive computing environment 101 and the user computing system 110 via the network 104.

The user computing system 110 may include, host, or execute a video game 112. In some embodiments, the user computing system 110 hosts or executes a portion of the video game 112 and the application host system 108 hosts and/or executes another portion of the video game 112. When a user initiates execution of the video game 112 on the user computing system 110, a network connection may be established with the application host system 108 and the two portions of the video game 112 may execute in conjunction with each other. For example, the application host system 108 may host and execute a portion of the video game 112 that comprises a video game environment while the user computing system 110 may execute a portion of the video game 112 that enables a user to interact with the video game environment using, for example, a playable in-game character. The video game environment may include an online or digital persistent world that may be maintained after a user of the user computing system 110 disconnects from the application host system 108. As another example, the video game may be a massively multiplayer online role-playing game (MMORPG) that includes a client portion executed by the user computing system 110 and a server portion executed by one or more application host systems 108. In some embodiments, one or more application host systems 108 may be included as part of the interactive computing environment 101.

As previously mentioned, the application host system 108 may host and/or execute at least a portion of the video game 112. Alternatively, or in addition, the application host system 108 may host or execute the entirety of the video game 112 and a user may interact with the video game 112 using the user computing system 110.

The video game 112 may include a telemetry system 106. In some embodiments, the telemetry system 106 may be included by a portion of the video game 112 at the user computing system 110, a portion of the video game 112 at the application host system 108, or at portions of the video game 112 at the user computing system 110 and at the application host system 108. The telemetry system 106 may include a system capable of recording and transmitting data relating to the execution of the video game 112. This data may include a log or identification of each operation performed by the video game 112, a point-in-time recordation of computing resources (for example, CPU utilization, RAM utilization, GPU utilization, VRAM utilization, and the like) accessed by the video game 112, timing information relating to the receipt or triggering of operations to be performed by the video game 112, timing information relating to the execution or completion of operations to be performed by the video game 112 (for example, the amount of time to generate or display a frame of video, the amount of time to load a user or playable character profile, and the like), state information for the video game 112, login performance information (for example, the amount of time to complete a login or to load a user's account or avatar), any other type of information that can be recorded regarding the execution of the video game 112 at the user computing system 110 and/or at the application host system 108, and any other type of metric that can be recorded or measured regarding the performance of the video game 112 at the user computing system 110 and/or at the application host system 108. Much of the above data may be obtained continuously and transmitted continuously as, for example, a byte stream as the video game 112 continues to execute. This byte stream may be transmitted via the network 104 to the interactive computing environment 101. Various systems of the interactive computing environment 101 may process, analyze, or perform various actions responsive to receipt of the byte stream, as will be described below.
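One way a telemetry system could serialize point-in-time samples into a byte stream is sketched below. The wire format (a 2-byte marker, a metric id, a millisecond timestamp, and a 32-bit float value) is entirely an assumption for illustration; the patent does not specify an encoding.

```python
# Hypothetical encoding of telemetry samples as a marker-delimited byte
# stream, of the kind the telemetry system might transmit continuously.
import struct

MARKER = b"\xBE\xEF"            # delimits each record in the stream
RECORD = struct.Struct("<HIf")  # metric id, ms timestamp, float value

def encode_sample(metric_id, timestamp_ms, value):
    return MARKER + RECORD.pack(metric_id, timestamp_ms, value)

# Continuously emitted samples: CPU %, RAM MB, frame time ms.
stream = b"".join([
    encode_sample(1, 1000, 42.5),   # cpu utilization
    encode_sample(2, 1000, 512.0),  # ram utilization
    encode_sample(3, 1016, 16.5),   # frame time
])
```

A receiver that knows the same layout can walk the stream marker by marker, which is how the extraction step later in this description could locate individual records.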

The interactive computing environment 101 may include a video game test system 130, a key performance indicator (KPI) warehousing system 120, a KPI analysis system 150, and a model generation system 146. Further, in certain embodiments, the interactive computing environment 101 may include the application host system 108. Moreover, in certain embodiments, the interactive computing environment 101 may include the user computing system 110. For example, the interactive computing environment 101 may include one or more user computing systems 110 for performing testing of the video game 112 during, or after, development of the video game 112.

The video game test system 130 may include any system that can receive telemetry data from a telemetry system 106. Further, the video game test system 130 may include any system that can initiate or perform one or more tests with respect to the video game 112. These tests may be performed with respect to the entirety of the video game 112, a portion of the video game 112 hosted by the user computing system 110, or a portion of the video game 112 hosted by the application host system 108. The video game test system 130 may include a testing system 132 that selects, initiates, or performs tests with respect to the video game 112. Further, the video game test system 130 can include a KPI extractor 136 that can extract particular portions or pieces of data from a byte stream of telemetry data received from the telemetry system 106. In addition, the video game test system 130 may include an object generator 134 that can generate one or more data objects from portions or pieces of data extracted from the byte stream of telemetry data. The video game test system 130 may also include a data controller 138 that identifies or specifies the types of data to extract from the byte stream. Typically, the data to be extracted from the byte stream may include data that relates to one or more performance indicators that a tester of the video game 112 is interested in analyzing. In some cases, the data to be extracted from the byte stream may be data utilized by an automated testing system to determine additional or subsequent tests to be performed. The data that is to be extracted from the byte stream may be data associated with key performance indicators, or particular types of data that can be used to determine the performance of the video game 112. For example, the key performance indicators may indicate CPU utilization, GPU utilization, RAM utilization, or VRAM utilization at a given point in time during execution of the video game 112, timing for generating or displaying a frame of video, timing for generating or outputting a portion of audio, a number of threads instantiated, a number of threads for a particular aspect of the video game, state information for different aspects of the video game, and the like.

The KPI warehousing system 120 may include any system that can store data objects associated with key performance indicators at a KPI repository 122. The KPI repository 122 may store raw KPI data corresponding to the individual data objects generated from data extracted from the byte stream. Further, the KPI repository 122 may store aggregated KPI data generated based on an aggregation of a plurality of KPI data objects. In addition, the KPI warehousing system 120 may include an object validator 124 that can validate objects generated by the object generator 134 from a byte stream received from the telemetry system 106. In addition, the KPI warehousing system 120 can include a KPI search system 126 for searching the KPI repository 122 for a particular KPI or set of KPIs.
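The aggregation of raw KPI data objects into aggregated KPI data might look like the sketch below. The object shape (dicts with `"kpi"` and `"value"` keys) and the summary statistics chosen are assumptions for illustration.

```python
# Hypothetical roll-up of raw KPI data objects into per-KPI summaries,
# as a warehousing step might perform before storing aggregated data.
from collections import defaultdict

def aggregate_kpis(objects):
    """Collapse raw KPI data objects into count/min/max/mean per KPI."""
    buckets = defaultdict(list)
    for obj in objects:
        buckets[obj["kpi"]].append(obj["value"])
    return {
        kpi: {
            "count": len(vals),
            "min": min(vals),
            "max": max(vals),
            "mean": sum(vals) / len(vals),
        }
        for kpi, vals in buckets.items()
    }

raw = [
    {"kpi": "frame_time_ms", "value": 16.0},
    {"kpi": "frame_time_ms", "value": 18.0},
    {"kpi": "ram_mb", "value": 512.0},
]
summary = aggregate_kpis(raw)
```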

The KPI analysis system 150 may include any system for analyzing data or data objects associated with a KPI. The KPI analysis system 150 may include a test configuration system 142 and a KPI user interface 144. The test configuration system 142 may include any system for configuring a test to be performed on the video game 112. In some embodiments, the test configuration system 142 may configure a testing environment for testing the video game 112. The test configuration system 142 may include a test selector 140 that can select one or more tests to perform on the video game 112. The test selector 140 may select a test to perform based at least in part on a parameter function generated using one or more machine learning algorithms. The KPI user interface 144 may include any system that can output data associated with a KPI for display to a user, such as a tester.

The model generation system 146 can use one or more machine learning algorithms to generate one or more prediction models or parameter functions. One or more of these parameter functions may be used to determine an expected value or occurrence based on a set of inputs. For example, a prediction model can be used to determine a suite of tests to perform on the video game 112 based on one or more inputs to the prediction model, such as, for example, a set of tests performed and performance data for previous iterations of the video game or for one or more other video games, such as other video games that may share a game engine. In some cases, the prediction model may be termed a prediction model because, for example, the output may be or may be related to a prediction of an action or event, such as the prediction that a particular test is to be performed on the video game 112 or that a particular portion of the video game 112 is programmed incorrectly or inefficiently. A number of different types of algorithms may be used by the model generation system 146. For example, certain embodiments herein may use a logistic regression model. However, other models are possible, such as a linear regression model, a discrete choice model, or a generalized linear model.

The machine learning algorithms can be configured to adaptively develop and update the models over time based on new input received by the model generation system 146. For example, the models can be regenerated on a periodic basis as new test history becomes available to help keep the predictions in the models accurate as the test processes for a video game or a game engine evolve over time. The model generation system 146 is described in more detail herein. After a model is generated, it can be provided to the test selector 140 to facilitate the selection of test suites.

Some non-limiting examples of machine learning algorithms that can be used to generate and update the parameter functions or prediction models can include supervised and unsupervised machine learning algorithms, including regression algorithms (such as, for example, Ordinary Least Squares Regression), instance-based algorithms (such as, for example, Learning Vector Quantization), decision tree algorithms (such as, for example, classification and regression trees), Bayesian algorithms (such as, for example, Naive Bayes), clustering algorithms (such as, for example, k-means clustering), association rule learning algorithms (such as, for example, Apriori algorithms), artificial neural network algorithms (such as, for example, Perceptron), deep learning algorithms (such as, for example, Deep Boltzmann Machine), dimensionality reduction algorithms (such as, for example, Principal Component Analysis), ensemble algorithms (such as, for example, Stacked Generalization), and/or other machine learning algorithms.

The user computing system 110 may include hardware and software components for establishing communication with another computing system, such as the interactive computing system 130, over a communication network 104. For example, the user computing system 110 may be equipped with networking equipment and network software applications (for example, a web browser) that facilitate communications via a network (for example, the Internet) or an intranet. The user computing system 110 may include a number of local computing resources, such as central processing units and architectures, memory, mass storage, graphics processing units, communication network availability and bandwidth, and so forth. Further, the user computing system 110 may include any type of computing system. For example, the user computing system 110 may include any type of computing device(s), such as desktops, laptops, video game platforms, television set-top boxes, televisions (for example, Internet TVs), network-enabled kiosks, car-console devices, computerized appliances, wearable devices (for example, smart watches and glasses with computing functionality), and wireless mobile devices (for example, smart phones, PDAs, tablets, or the like), to name a few. In some embodiments, the user computing system 110 may include one or more of the embodiments described below with respect to FIG. 8 and FIG. 9.

The network 104 can include any type of communication network. For example, the network 104 can include one or more of a wide area network (WAN), a local area network (LAN), a cellular network, an ad hoc network, a satellite network, a wired network, a wireless network, and so forth. Further, in some cases, the network 104 can include the Internet.

Example Byte Stream Flow

FIG. 2 illustrates an embodiment of a data flow system 200 for the flow of data during a key performance indicator extraction and analysis process. As previously described, the telemetry system 106 can generate and transmit a byte stream of data relating to the operation and performance of the video game 112. The amount of data that can be generated or transmitted by the telemetry system 106 for just a short period of time can be substantial. For example, for a 10-minute test session of the video game 112, the amount of data produced can be upwards of 60 GB. While some test sessions are 10 minutes or less, many test sessions are much greater than 10 minutes and thus may produce substantially more than 60 GB of data. Accordingly, it can be difficult if not impossible to analyze and compare multiple test sessions of the video game 112 because of the large amount of data produced when testing the video game 112. For example, it may not be possible to store such large amounts of data. Further, even if the telemetry data can be stored for multiple test sessions of the video game 112, the amount of computing resources and time required to analyze and compare the telemetry data for multiple test sessions of the video game 112 may be impractical.

Embodiments disclosed herein address the aforementioned problems by scanning the telemetry data as it is created to identify one or more KPIs. The identified KPIs can then be extracted from the byte stream and stored at a KPI repository 122. By extracting the KPIs, the amount of telemetry data saved and evaluated during a test process can be reduced by several orders of magnitude. Advantageously, by reducing the amount of telemetry data, the amount of processing resources required to analyze the test session can be reduced.

The data flow system 200 illustrates the process of extracting KPI data and storing it for analysis and comparison with other test sessions. As illustrated in FIG. 2, the telemetry system 106 may produce a byte stream of telemetry data 210 for each test session. This telemetry data may be provided to a KPI extractor 136. Further, the KPI extractor 136 may receive from a data controller 138 an identification of KPI data to extract from the telemetry data 210. The KPI data to extract may be specified by a user, such as a tester or administrator, or may be specified by an automated testing system, such as the testing system 132.

As the byte stream 210 is received by the KPI extractor 136, the KPI extractor 136 can determine whether a portion of the data included in the byte stream 210 corresponds to one or more of the selected KPIs. If the portion of the data does correspond to the one or more selected KPIs, a portion of the data may be stored at the KPI repository 122 as illustrated by the KPI data 212, which is generated for each test session. Further, the KPI extractor 136 may forward the telemetry data 210 to a session image generator 204, which may create an image of the entire test session. This image may be created on a hard drive or solid state drive, or in a non-transitory storage medium, such as a DVD. As illustrated by the thinner arrow, the KPI data 212 forwarded to the KPI repository 122 may be substantially less than the telemetry data 210 that is obtained from the telemetry system 106. For example, while the telemetry data 210 may encompass hundreds of gigabytes of data, the KPI data 212 may only be 15 or 20 MB of data.
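The two-sink routing in FIG. 2 (everything to the session image, only matching records to the KPI repository) can be sketched as below. The record shape and KPI names are illustrative, not from the patent.

```python
# Hypothetical routing of decoded telemetry records: the full stream
# feeds the session image, while only selected-KPI records are forwarded
# to the much smaller KPI repository.
def route_telemetry(records, selected_kpis):
    session_image = []   # full-fidelity copy of the session
    kpi_repository = []  # the extracted subset
    for record in records:
        session_image.append(record)
        if record["name"] in selected_kpis:
            kpi_repository.append(record)
    return session_image, kpi_repository

records = [
    {"name": "avatar_outfit", "value": "red"},    # state, not a KPI
    {"name": "frame_time_ms", "value": 16.7},
    {"name": "coins_collected", "value": 42},     # state, not a KPI
    {"name": "frame_time_ms", "value": 16.9},
]
image, kpis = route_telemetry(records, {"frame_time_ms"})
```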

Example KPI Identification and Storage Process

FIG. 3 presents a flowchart of an embodiment of a key performance indicator identification and storage process 300. The process 300 can be implemented by any system that can identify key performance indicators within a byte stream of telemetry data obtained from a telemetry system 106 of a video game 112. The process 300, in whole or in part, can be implemented by, for example, a telemetry system 106, a video game 112, a video game test system 130, a KPI extractor 136, a data controller 138, a KPI warehousing system 120, a KPI analysis system 150, or a KPI user interface 144, among others. Although any number of systems, in whole or in part, can implement the process 300, to simplify discussion the process 300 will be described with respect to particular systems. Further, the process 300, or particular operations of the process 300, may be performed continuously or repeatedly during a session of a video game 112.

The process 300 begins at block 302 where the KPI extractor 136 receives a configuration file identifying one or more key performance indicators. The configuration file may be received from the data controller 138 in response to an action by a user or by initiation of a test session by the testing system 132. In some embodiments, the data controller 138 specifies one or more key performance indicators for the KPI extractor 136 to identify from a byte stream with or without providing a configuration file to the KPI extractor 136. The key performance indicators identified by the data controller 138 may depend on the tests initiated by the testing system 132. For example, if a test initiated by the testing system 132 relates to the display of images or video to a user playing the video game 112, the KPIs identified by the data controller 138 may relate to utilization of graphics resources of the user computing system 110, the speed with which a frame is generated for display to a user, or the number of frames that may be generated per second. As another example, if the test initiated by the testing system 132 relates to the game flow logic of the video game 112, the KPIs identified by the data controller 138 may relate to the state of the video game 112 in response to different triggers generated by the testing system 132.
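A configuration file of the kind handed to the KPI extractor at block 302 might map each class of test to the KPIs to extract. JSON and the specific KPI names are assumptions; the patent does not name a file format.

```python
# Hypothetical KPI configuration: per test class, the KPIs to pull out
# of the byte stream during that test.
import json

CONFIG = """
{
  "graphics_test": ["gpu_utilization", "frame_time_ms", "frames_per_second"],
  "game_flow_test": ["game_state", "trigger_response_ms"]
}
"""

def kpis_for_test(config_text, test_name):
    """Return the set of KPI names configured for a given test class."""
    return set(json.loads(config_text).get(test_name, []))

graphics_kpis = kpis_for_test(CONFIG, "graphics_test")
```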

At block 304, the KPI extractor 136 receives a byte stream for a session of the video game 112. The byte stream may include binary coded data. As previously described, the byte stream may include some or all of the state information that can be generated by the video game 112 as well as performance information relating to utilization of computing resources at the user computing system 110. Further, the byte stream may include information relating to the utilization of computing resources of the application host system 108 as well as state information for a portion of the video game 112 hosted by the application host system 108. As the byte stream may include all of the state information generated by the video game 112, the byte stream may be a continuous stream of data provided by the telemetry system 106 to the video game test system 130. Thus, some or all of the process 300 may be occurring repeatedly as additional bytes in the byte stream are received by the video game test system 130. In some embodiments, the byte stream received by the video game test system 130 may be a pair of byte streams received from the telemetry system 106 of the user computing system 110 and the telemetry system 106 of the application host system 108. Alternatively, the byte stream received by the video game test system 130 may be an aggregated byte stream that is aggregated from a byte stream generated by the telemetry system 106 of the user computing system 110 and the telemetry system 106 of the application host system 108.
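One way the client-side and server-side streams could be combined into a single aggregated, time-ordered stream is a timestamp merge. The tuple record shape is an assumption made so the example stays small.

```python
# Hypothetical aggregation of two telemetry streams (client and server)
# into one stream ordered by timestamp. Records are (timestamp_ms,
# source, payload) tuples; each input stream is already time-sorted,
# which is what heapq.merge requires.
import heapq

client = [(1000, "client", "frame"), (1033, "client", "frame")]
server = [(1001, "server", "tick"), (1030, "server", "tick")]

merged = list(heapq.merge(client, server))
```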

It should be understood from the above description that a large percentage, and typically a majority, of the bytes of data included in the byte stream may be unrelated to the KPIs specified in the configuration file or otherwise identified as part of the block 302. For example, the bytes of data may include state information of the video game 112 that is unrelated or tangentially related to performance, such as the outfit of an avatar or the amount of in-game coins collected by the user. Further, the byte stream may include data that does relate to the performance of the video game 112, but is unrelated to the requested KPI data. Further, as different tests or different aspects of the video game 112 may be tested during different performances of the process 300, the KPIs identified at the block 302 and the corresponding KPI data of interest in the byte stream may change during different occurrences of the process 300.

At block 306, the object generator 134 decodes a portion of the byte stream into a data object. Decoding a portion of the byte stream may include identifying markers within the byte stream that mark boundaries of information within the byte stream. Further, decoding a portion of the byte stream into the data object may include identifying multiple portions of the byte stream that relate to the same metric or information associated with execution of the video game 112. The multiple portions of the byte stream may be aggregated together in generating the data object.

In some cases, the portion of the byte stream corresponds to a particular event occurring during execution of an instance of the video game 112. Thus, the data object may correspond to the particular event. The event may relate to particular actions or occurrences within the video game 112. For example, the event may relate to the defeat of an enemy, the completion of a level, or the obtaining of an in-game item. Alternatively, or in addition, the event may relate to particular types of processing or computer resource utilization by the video game 112. For example, the event may relate to the instantiation of several instances of an object corresponding to code for the creation and/or control of several instances of an enemy. As another example, the event may relate to the reservation of a portion of RAM for the video game 112. For example, an event may be defined by or associated with the allocation, accessing, or modification of memory or RAM. In certain embodiments, one or more portions of the byte stream may be decoded at the block 306 while additional portions of the byte stream are being received as part of the operations associated with the block 304. Thus, operations associated with the blocks 304 and 306 may be performed at least partially in parallel.
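The marker-based decoding at block 306 can be sketched as below. The wire layout (2-byte marker, metric id, timestamp, 32-bit float) and the metric-id mapping are assumptions for illustration, not a format the patent specifies.

```python
# Hypothetical decoding of marker-delimited telemetry records into data
# objects: walk the stream marker by marker and unpack each record body.
import struct

MARKER = b"\xBE\xEF"
RECORD = struct.Struct("<HIf")  # metric id, ms timestamp, float value

METRIC_NAMES = {1: "cpu_utilization", 3: "frame_time_ms"}

def decode_objects(stream):
    """Yield one data-object dict per marked record in the stream."""
    objects = []
    pos = stream.find(MARKER)
    while pos != -1:
        body = stream[pos + len(MARKER): pos + len(MARKER) + RECORD.size]
        metric_id, ts, value = RECORD.unpack(body)
        objects.append({
            "kpi": METRIC_NAMES.get(metric_id, "unknown"),
            "timestamp_ms": ts,
            "value": value,
        })
        pos = stream.find(MARKER, pos + len(MARKER) + RECORD.size)
    return objects

stream = MARKER + RECORD.pack(3, 1016, 16.5) + MARKER + RECORD.pack(1, 1016, 40.0)
objs = decode_objects(stream)
```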

In some embodiments, the data object may include one or more of event data (such as the defeat of a monster), performance-related data (such as VRAM utilization), timeline information, or any other information that can be derived from a byte stream of the video game 112 under test. Thus, it is possible to examine the data object in the context of the test session, or multiple test sessions, to determine what operations occurred at a given point in time during execution of the video game 112 and the effect of the specific operations on performance of the video game 112.

At decision block 308, the KPI extractor 136 determines whether the data object is related to a key performance indicator. In some cases, the data object may relate to the performance of the video game 112, but may not relate to a KPI identified at the block 302. Determining whether the data object is related to a key performance indicator may include determining a name or tag associated with the data object or the type of data associated with the data object. Further, determining whether the data object is related to a key performance indicator may include comparing the data object, or an identifier thereof, to a set of one or more types of KPIs identified by the data controller 138. Advantageously, in certain embodiments, the decision block 308 involves performing a real time, or substantially real-time, inspection or analysis of the byte stream. By performing a real-time analysis of the byte stream, it is possible to reduce the amount of storage required for storing the telemetry data because telemetry data that is unrelated to particular metrics of interest to a tester can be omitted from storage at the KPI repository 122.
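A minimal sketch of the decision at block 308, assuming each data object carries a tag that can be matched against the set of KPIs selected by the tester (the tag names here are hypothetical):

```python
# Hypothetical set of KPI tags identified by the tester at block 302.
SELECTED_KPIS = {"frame_time", "vram_used"}

def extract_kpis(data_objects):
    """Keep only data objects whose tag matches a selected KPI; drop the rest."""
    return [obj for obj in data_objects if obj["tag"] in SELECTED_KPIS]

session = [
    {"tag": "frame_time", "value": 16.7},
    {"tag": "enemy_defeated", "value": 1},   # event data, not a selected KPI
    {"tag": "vram_used", "value": 512},
]
kept = extract_kpis(session)
```

Only the two KPI-related objects survive the filter; the event-only object is discarded before anything reaches storage.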

If it is determined at the decision block 308 that the data object is related to a key performance indicator, the KPI extractor 136 extracts the data object from the byte stream at block 310. Extracting the data object from the byte stream may include creating a copy of the data object. Alternatively, or in addition, extracting the data object from the byte stream may include removing the data object from the byte stream.

At block 312, the KPI warehousing system 120 stores the data object at the KPI repository 122. The data object may be stored as raw KPI data at the KPI repository 122. Alternatively, or in addition, the data object may be stored as aggregated KPI data that is aggregated with other data objects. In some embodiments, the block 312 may include validating the data object using, for example, the object validator 124 before queuing the data for storage or aggregation, or other processing. Validating the data object may include confirming that the data object satisfies a particular format. Further, validating the data object may include confirming that no errors occurred during conversion of the byte stream to the data object. In some cases, validating the data object may include determining that the data object includes values for each aspect or variable of the data object. For example, suppose that one KPI of interest relates to the timing of video frames. The identification of a portion of the byte stream relating to an initial request to generate a frame of the video may cause generation of a data object. However, if the data object does not include the completion time for the frame, the data object may not be successfully validated by the object validator 124. In some embodiments, the block 312 may also include providing the data object to the session image generator for storage with the rest of the byte stream that was not associated with a key performance indicator.
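The frame-timing validation example above can be illustrated with a small check that a data object carries a value for every expected field; the field names are assumptions for illustration:

```python
def validate(data_object, required_fields):
    """Return True only if every expected field is present and non-None."""
    return all(data_object.get(f) is not None for f in required_fields)

# Hypothetical fields for a frame-timing data object.
frame_fields = ("frame_id", "start_time", "completion_time")

complete = {"frame_id": 7, "start_time": 0.0, "completion_time": 16.7}
incomplete = {"frame_id": 8, "start_time": 16.7, "completion_time": None}
```

The object missing its completion time fails validation and would not be queued for storage or aggregation.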

The total amount of storage for storing the data objects identified as relating to the key performance indicators is usually substantially less than the size of the byte stream. Typically, the amount of storage space for storing the data objects identified as related to the key performance indicators is one or more orders of magnitude less than the size of the byte stream. Advantageously, in certain embodiments, by extracting key performance indicators, or data corresponding to key performance indicators, from the byte stream, it is possible to store data for multiple test sessions and compare the data across test sessions. Often, it is not possible to store the entire byte stream due to its size. Further, due to the size of the byte stream, it is often not possible to compare data across multiple test sessions of the video game 112 because, for example, it is not possible to store data from multiple byte streams or to process multiple byte streams in parallel due to the amount of data produced by each test session of the video game 112. However, by extracting the data corresponding to the identified key performance indicators, it is possible to develop a trend for multiple test sessions and to compare the KPI data across multiple test sessions and/or to the calculated trend.

At block 314, the object generator 134 converts the data object into a human readable form. Converting the data object into a human readable form may include converting the data object to a particular file format that uses human readable text. For example, the data object may be converted to an eXtensible Markup Language (XML) format or a JavaScript Object Notation (JSON) format. In some embodiments, the data object created at the block 306 is in human-readable form. In such embodiments, the block 314 may be redundant and may be omitted.
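Converting a data object to the JSON human-readable form mentioned above is straightforward with the standard library; the object's fields here are illustrative:

```python
import json

# Hypothetical KPI data object for a single frame timing.
data_object = {"tag": "frame_time", "frame_id": 7, "ms": 16.7}

# Serialize to human-readable, stably ordered JSON text.
human_readable = json.dumps(data_object, indent=2, sort_keys=True)

# The conversion is lossless: parsing the text recovers the data object.
round_trip = json.loads(human_readable)
```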

At block 316, the KPI warehousing system 120 aggregates the data object with other data objects at the KPI repository 122. Aggregating the data object with other data objects may occur as the data object is obtained or generated from the byte stream. Alternatively, or in addition, data objects stored at the KPI repository 122 may be aggregated after processing of the byte stream is completed. Aggregating the data object may include performing a statistical process on a set of related data objects. For example, aggregating the data objects may include averaging the data objects over a particular time period. For example, a set of data objects may be averaged to determine an average amount of time for rendering one or more frames over different time periods. As another example, a set of data objects may be averaged to determine an average utilization of one or more types of computing resources over different periods of time.
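One way the averaging at block 316 might look, assuming data objects reduce to (timestamp, value) samples that are averaged over fixed-size time windows:

```python
from collections import defaultdict
from statistics import mean

def aggregate_by_window(samples, window_seconds):
    """Average (timestamp, value) samples within fixed-size time windows."""
    buckets = defaultdict(list)
    for ts, value in samples:
        buckets[int(ts // window_seconds)].append(value)
    return {window: mean(values) for window, values in sorted(buckets.items())}

# Hypothetical frame render times (seconds, milliseconds) from one session.
frame_times = [(0.1, 16.0), (0.5, 18.0), (1.2, 33.0), (1.8, 35.0)]
averaged = aggregate_by_window(frame_times, window_seconds=1)
```

The first one-second window averages to 17.0 ms and the second to 34.0 ms, exposing a slowdown that individual samples would obscure.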

Generally, the data objects aggregated at the block 316 are obtained from the same byte stream associated with the same session or instance of the video game 112 under test. However, in some embodiments, data objects may be aggregated across instances of the video game 112. For example, data objects for test sessions conducted on a particular version of the video game 112 or during a particular day may be aggregated together.

In some cases, a data object or an aggregated set of data objects may be associated with or linked to particular states of the video game. Advantageously, in certain embodiments, by associating or linking a data object or an aggregated set of data objects with a particular state of the video game, it can be determined how different states of the video game may impact different metrics, such as computer resource utilization, over time. For example, by linking the aggregated set of data objects with a particular state of the video game, it can be determined how increasing the number of characters within the display can affect frame rendering or computer resource utilization for a particular video game.

At block 318, the KPI user interface 144 outputs the human readable data objects for presentation to a user. The human readable data objects may be output individually for display. Alternatively, or in addition, an aggregated set of the human readable data objects may be output for display. Outputting one or more data objects for display may include generating one or more different types of user interfaces to present the data associated with the one or more data objects to a user. For example, a graph may be generated based on the data objects to illustrate changing computer resource consumption over time. In some embodiments, the user interface may present data objects associated with the execution of the video game 112 as correlated with particular states of the video game 112. For example, frame processing time versus dynamic objects within the video game environment may be displayed on a graph over time as the number of dynamic objects within the video game environment changes. As another example, a plurality of data objects corresponding to a number of instantiated threads for different aspects of the video game at a given point in time may be displayed to a user via a chart or graph. Further, by aggregating the data objects, the total number of instantiated threads at a given point in time for a video game 112 can be determined and displayed to the user.

In some embodiments, a user can view the KPI data as it is being generated or stored. Thus, for example, if a user has a suspicion that a particular aspect of the video game 112 is causing performance issues, the user can monitor the KPI data as a test session of the video game 112 is occurring to determine whether the user's suspicion is correct. Moreover, a user can determine whether performance of a video game 112 is getting better or worse within a particular session of the video game. By monitoring the performance of the video game while it is under test, performance issues can be detected sooner than waiting for a test session to complete. For example, if performance of the video game is steadily decreasing during a test session, it may be determined that there is a memory leak. Waiting for a test session to complete may not be a problem for a short test (e.g., 30 seconds), but for longer test sessions (e.g., an hour or more), waiting for the test to complete to identify a performance issue that may be detectable within, for example, the first five minutes can waste a lot of test time and computing resources.

If it is determined at the decision block 308 that the data object is not related to a key performance indicator, the KPI extractor 136 discards the data object. Advantageously, in certain embodiments, by storing data objects relating to the KPI and discarding other data objects, the amount of data storage required to save session information may be reduced drastically. For example, a 20 minute test session that may produce 120 GB of data may be reduced to 22 MB. By reducing the storage used to store KPI data for a session, it is possible to store KPI data for more sessions. Further, it is possible to analyze KPI data across multiple sessions. In some embodiments, instead of discarding the data object at the block 320, the block 320 involves forwarding the data object to a session image generator, such as the session image generator 204. In some embodiments, the data object is forwarded to the session image generator 204 regardless of whether the data object is related to a key performance indicator. The session image generator 204 may store the data object as part of an image of the test session. This image of the test session may be stored at a secondary storage, network storage, or on a non-transitory computer readable medium, such as a DVD or flash drive. Although typically the amount of storage needed to store data objects corresponding to the entire byte stream makes storing the entire byte stream impossible or impracticable, in some cases, where the test session is relatively short or produces a relatively small amount of data, it may be possible to store the entire byte stream (for example, a test session that produces 1 GB of data). When creating an image based on the entire byte stream, analyzing data across multiple test sessions may in some cases still be challenging due to the amount of processing resources and time required to analyze multiple test sessions.

Portions of the process 300 may be repeated with additional portions of the byte stream. In some embodiments, the process 300 may include determining whether receipt of the byte stream is complete. If portions of the byte stream are still being received, the process 300 may return to the block 306. However, if it is determined that portions of the byte stream are no longer being received, the process 300 may end.

Example Automated Testing Process

FIG. 4 presents a flowchart of an embodiment of a process 400 for performing automated testing using key performance indicator information. The process 400 can be implemented by any system that can test a video game to determine key performance information. In some embodiments, one or more tests of the video game may be determined based on KPI information obtained from prior tests, enabling automated testing of the video game. The process 400, in whole or in part, can be implemented by, for example, a telemetry system 106, a video game 112, a video game test system 130, a testing system 132, a KPI extractor 136, a data controller 138, a KPI warehousing system 120, a KPI analysis system 150, a test configuration system 142, a test selector 140, a model generation system 146, or a KPI user interface 144, among others. Although any number of systems, in whole or in part, can implement the process 400, to simplify discussion the process 400 will be described with respect to particular systems. Further, the process 400, or particular operations of the process 400, may be performed continuously or repeatedly for a video game 112 until a condition is reached or a user stops the process 400. For example, the process 400 may be configured to repeat a particular number of times, until a result of the testing satisfies a condition, until it is determined that a result of the testing is unlikely to satisfy the condition, or until it cannot be determined which test to perform. In some embodiments, the process 400 can be performed as part of a nightly build or at some other predetermined or identified time.

The process 400 begins at block 402 where the testing system 132 receives a selection of a test to perform on a video game 112. The selection of the test to perform may be received from a user, such as a tester. Alternatively, or in addition, the test to perform may be selected using an automated process and/or using one or more machine learning algorithms. For example, the test selector 140 may automatically select a test to perform based on a result of applying KPI data to a parameter function generated using a machine learning process. Further, the test configuration system 142 may configure the video game 112 or a testing environment for the video game 112 to facilitate performance of the test selected by the test selector 140.

At block 404, the KPI analysis system 150 obtains KPI data associated with the test selected at the block 402. In some cases, the KPI analysis system 150 may obtain the KPI data using one or more embodiments of the process 300 previously described with respect to FIG. 3. Alternatively, or in addition, the KPI analysis system 150 may obtain the KPI data by accessing the KPI repository 122 at the KPI warehousing system 120. In some cases, the KPI analysis system 150 may use the KPI search system 126 to search for and/or obtain KPI data associated with the video game, or a particular build of code corresponding to the video game 112, from the KPI repository 122. In some cases, at least some of the KPI data is associated with a game engine used by the video game 112. The KPI data for the game engine may be used to help select a test to perform on the video game 112.

At block 406, the KPI analysis system 150 determines a trend for a particular KPI across a number of test sessions. In some cases, the block 406 may be performed for multiple KPIs. Determining a trend for the KPI may include comparing KPI data corresponding to the particular KPI across the number of test sessions. Further, determining the trend for the KPI data may include performing one or more statistical processes on the KPI data. For example, determining a trend for the KPI data may include averaging KPI data for one or more test sessions and comparing subsequent test sessions to the averaged KPI data. The trend for the KPI data may be determined for any number of test sessions. For example, the trend may be determined for two, five, ten, or twenty test sessions.
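A simple sketch of trend determination at block 406: average the KPI values of the first few test sessions to form a baseline that later sessions can be compared against. The five-session baseline and the per-session averages below are illustrative assumptions.

```python
from statistics import mean

def kpi_trend(session_averages, baseline_sessions=5):
    """Average the first N sessions' KPI values to form a baseline trend."""
    return mean(session_averages[:baseline_sessions])

# Hypothetical average frame time (ms) per test session, oldest first.
sessions = [16.5, 16.8, 16.4, 16.9, 16.6, 21.0]
trend = kpi_trend(sessions)      # baseline over the first five sessions
latest = sessions[-1]            # the newest session clearly exceeds it
```

Comparing the newest session (21.0 ms) against the 16.64 ms baseline flags a regression that no single session viewed in isolation would reveal.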

Further, in some embodiments, at least some of the test sessions may correspond to different iterations of the video game 112. For example, over time, the code for the video game 112 may change as testing identifies bugs in the code or features that can be added or improved. Moreover, over time, efficiency improvements may be made to the code of the video game 112. Thus, the KPI data obtained at the block 404 may correspond to different versions of the code of the video game 112, and the trend for the KPI data may reflect changes in the code of the video game 112. Moreover, in certain embodiments, the code for the video game 112 may be the same across test sessions, but data provided to the video game 112 or commands performed with respect to the video game 112 may differ across test sessions as different aspects of the video game 112 are tested. Further, in some embodiments, the test sessions may be for different video games that share the same game engine.

In some embodiments, a trend may be determined for KPI data within a single test session. For example, a trend for memory utilization may be determined during a test session. Advantageously, in certain embodiments, by determining a performance trend within a single test session, it can be determined if a particular portion of the video game does not satisfy the trend, which may indicate a coding error or may indicate that system requirements for the video game may need to change. For example, if a particular level within the video game 112 utilizes more than a baseline amount of memory, it may be determined that the video game 112 requires a greater than expected amount of memory to execute. However, if it is determined from KPI data that the amount of data objects created for the level satisfies a trend, then the increased memory utilization may instead indicate a memory leak in the code corresponding to the level.

It is possible in some embodiments to aggregate data across different systems that are testing the video game 112. For example, tests performed on a first set of systems that include a particular video card model may be aggregated separately from tests performed on a second set of systems that include another video card model. Advantageously, in certain embodiments, a baseline performance can be established for each of the video card models, enabling testing to determine whether both video card models perform equally or perform to a minimum desired level of performance.

At decision block 408, the KPI analysis system 150 determines whether the KPI data obtained at the block 404 satisfies the trend determined at the block 406. Determining whether the KPI data obtained at the block 404 satisfies the trend may include determining whether the KPI data or aggregated KPI data is within a threshold of the trend. For example, an average, or other statistical value, for a set of KPI data may be calculated and this average may be compared to a trend to determine whether the average is within a threshold of the trend.

In some embodiments, the threshold may be with respect to a particular direction of change. For example, if the KPI data relates to the number of frames per second that are generated over time by the video game 112, the threshold may correspond to a reduction in the number of frames per second compared to the trend. Thus, if the reduction in the number of frames per second exceeds the threshold, the KPI analysis system 150 would determine that the KPI data does not satisfy the trend. However, KPI data indicating any increase, or less than a threshold decrease, in the number of frames per second generated by the video game 112 may be considered to satisfy the trend by the KPI analysis system 150.
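The directional frames-per-second check described above might be sketched as follows, where only a drop below the trend larger than the allowed threshold fails, and any increase passes. The numeric values are illustrative:

```python
def satisfies_trend(kpi_fps, trend_fps, max_drop):
    """A frames-per-second KPI satisfies the trend unless it falls below
    the trend by more than the allowed drop; any increase always passes."""
    return (trend_fps - kpi_fps) <= max_drop

# With a 60 fps trend and a 3 fps tolerance:
small_dip = satisfies_trend(58.0, 60.0, max_drop=3.0)   # within tolerance
big_dip = satisfies_trend(55.0, 60.0, max_drop=3.0)     # exceeds tolerance
gain = satisfies_trend(65.0, 60.0, max_drop=3.0)        # increases always pass
```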

Different trends may be established for different time periods. Alternatively, or in addition, the trend may identify performance of an aspect of the video game 112 over time. For example, KPI data relating to login time for the test sessions of each day may be aggregated to determine a trend for login time over a number of days. KPI data for test sessions of a particular day can be compared to the trend to determine whether login time is becoming faster, staying the same, or becoming slower across a number of days.

If it is determined at the decision block 408 that the KPI data does satisfy the trend, the KPI analysis system 150 performs post-test processing at the block 410. The post-test processing may include storing the KPI data or an indication that the KPI data satisfies the trend at a log. This log may be stored at a repository, such as the KPI repository 122. Further, in some embodiments, the post-test processing may include updating the trend for the KPI. Moreover, the post-test processing may include outputting a result of the test, identifying the build of code associated with the video game 112 as verified, uploading or otherwise marking the code associated with the video game 112 as a stable build, or any other process that may be performed for a successful test of the video game 112. In some embodiments, the process 400 may end after completing operations associated with the block 410. However, in other embodiments, the process 400 may instead proceed to the block 412 to perform additional testing. This additional testing may be initiated by a user or may be performed automatically based on, for example, the particular KPI data obtained at the block 404.

If it is determined at the decision block 408 that the KPI data does not satisfy the trend, the test configuration system 142 adjusts the selected test at block 412 based at least in part on the KPI data obtained at the block 404 for the previously performed test. Adjusting the selected test may include using the test selector 140 to select a new test to perform on the video game 112. Alternatively, or in addition, adjusting the selected test may include adjusting code corresponding to the video game 112, adjusting data provided or commands performed with respect to the video game 112, or adjusting a testing environment within which the video game 112 executes. In some embodiments, adjusting the selected test may include providing the KPI data to a parameter or prediction function generated by the model generation system 146 to identify a test to perform on the video game 112.

In some embodiments, a test may fail. In such cases, the failed test may be treated similarly as when the KPI data does not satisfy a trend or baseline value. Further, a user may be alerted that the test failed. Alternatively, or in addition, further testing may automatically be performed by the testing system 132, which can perform tests selected by the test selector 140 using a parameter function generated by the model generation system 146.

Advantageously, in certain embodiments, by using the parameter function generated by the model generation system 146, testing the video game 112 can be automated. For example, the process 400 may be performed with a particular test of the video game 112. KPI data generated from the initial test can be supplied to a parameter function to identify further tests to be performed with respect to the video game 112. KPI data collected from these further tests can be supplied to additional parameter functions to identify further tests to perform with respect to the video game 112. This process may repeat for a number of iterations or until a particular condition is satisfied. This condition may relate to whether the KPI data satisfies the trend, whether a difference between the KPI data and the trend satisfies or does not satisfy a threshold, or whether a particular number of tests has been completed.
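The iterative loop described above, run a test, compare its KPI to the trend, then let a parameter function choose the next test, can be sketched as below. The `run_test` and `select_next_test` callables are hypothetical stand-ins for the testing system 132 and the test selector 140:

```python
def automated_test_loop(initial_test, run_test, select_next_test,
                        trend, max_drop, max_iterations=10):
    """Run tests until the KPI is within the allowed drop of the trend
    or the iteration budget is exhausted; return the (test, kpi) history."""
    test = initial_test
    history = []
    for _ in range(max_iterations):
        kpi = run_test(test)
        history.append((test, kpi))
        if (trend - kpi) <= max_drop:
            break                      # KPI satisfies the trend; stop testing
        test = select_next_test(kpi)   # parameter function picks the next test
    return history

# Illustrative stand-in: each named test yields a fixed fps measurement.
kpi_by_test = {"smoke": 50.0, "render_stress": 59.5}
history = automated_test_loop(
    "smoke",
    run_test=kpi_by_test.get,
    select_next_test=lambda kpi: "render_stress",
    trend=60.0, max_drop=2.0)
```

The first test misses the trend by 10 fps, so a second test is selected; that one lands within tolerance and the loop stops after two iterations.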

In some embodiments, the test selector 140 may automatically select new tests to perform based on KPI data applied to a parameter function regardless of whether the KPI data satisfies a trend. Further, the parameter function may use the KPI data to predict subsequent tests that should be performed to establish that performance of a video game satisfies a set of performance or operation criteria, such as executing without crashing or executing with respect to a set of desired minimal system or resource requirements.

Advantageously, in certain embodiments, by comparing the KPI data to a trend that is generated over a plurality of test sessions, it is possible to determine whether the code corresponding to a video game 112 is improving. The code corresponding to the video game 112 may change over time due to further development to add new features or to try to improve efficiency. Further, the code may change as bugs or coding errors are corrected. Moreover, in some cases, developers may decide to change from one version of a game engine to another version, such as when a newer version of the game engine is released during development of the video game. All of the aforementioned code changes may affect code performance negatively or positively. By establishing a baseline for a particular KPI, and comparing KPI data of a number of test sessions to the baseline, it is possible to determine whether development of code corresponding to the video game 112 is resulting in performance improvements, or to confirm that new features are not causing performance degradation.

Moreover, as previously mentioned, the trend may change over time. Thus, as the code is changed, the baseline for a KPI that KPI test data is compared against may change. In some cases, a user may accept a worse baseline because, for example, new features decrease performance by an acceptable amount as determined by a developer or tester. In other cases, the baseline may improve due to performance improvements in the code.

In certain embodiments, because some video games share a game engine, it is possible to use the process 400 to determine whether multiple video games are using the game engine to the same level of efficiency. For example, if two video games are using the same rendering system included in a game engine, but the performance is vastly different, it may be determined that one of the video games is not efficiently using the game engine.

Example Model Generation System

FIG. 5 illustrates an embodiment of a model generation system 146 of FIG. 1. The model generation system 146 may be used to determine one or more prediction models 560 based on historical data 552 for a number of different KPI. Typically, although not necessarily, the historical data 552 includes a large amount of data associated with the KPI. For example, the historical data 552 may include KPI data for hundreds, or more, iterations of a test session for a video game or game engine. Further, the historical data 552 can include data received from one or more data sources, such as, for example, one or more video games that share a game engine. In some embodiments, the historical data 552 may include a very large number of data points, such as millions of data points, which may be aggregated into one or more data sets. In some cases, the historical data 552 may be accessed from a KPI repository 122. Further, in some embodiments, one or more subsets of the historical data are limited by a date restriction, such as, for example, limited to include only data from the last 6 months. The historical data may also be restricted to a particular number of previous builds of a video game or game engine.

The historical data 552 may include an identification of tests performed in response to different KPI data. Further, the historical data 552 may include changes to data supplied to a video game, operations performed with respect to a video game, and changes to a test environment for a video game associated with different KPI data.

The model generation system 146 may, in some cases, also receive feedback data 554. This data may be received as part of a supervised model generation process that enables a user, such as an administrator, to provide additional input to the model generation system 146 that may be used to facilitate generation of the prediction model 560. For example, if an anomaly exists in the historical data 552, the user may tag the anomalous data, enabling the model generation system 146 to handle the tagged data differently, such as applying a different weight to the data or excluding the data from the model generation process.

Further, the model generation system 146 may receive control data 556. This control data 556 may identify one or more features or characteristics for which the model generation system 146 is to determine a model. Further, in some cases, the control data 556 may indicate a value for the one or more features identified in the control data 556. For example, suppose the control data 556 indicates that a prediction model is to be generated using the historical data 552 to select a test of frame rendering rate. If the frame render rate for a number of videos within a video game is known, this data may be provided as part of the control data 556, or as part of the historical data 552.

The model generation system 146 may generally include a model generation rule set 570 for generation of the prediction model 560. The rule set 570 may include one or more parameters 562. Each set of parameters 562 may be combined using one or more mathematical functions to obtain a parameter function. Further, one or more specific parameters may be weighted by the weights 564. In some cases, the parameter function may be obtained by combining a set of parameters with a respective set of weights 564. The prediction model 560 and/or the respective parameters 562 of the prediction models 560 may be derived during a training process based on particular input data, such as the historical data 552, feedback data 554, and control data 556, and defined output criteria, which may be included with the control data 556, used for training purposes. The model generation rule set 570 can define the specific machine learning rules and/or algorithms the model generation system 146 uses to generate the model based on a defined objective function, such as determining a test to perform or a test environment in which to test a video game. The test environment may refer to the computing resources available to the video game 112 or a state of the video game when initiating the test of the video game 112.

In some embodiments, initial parameters 562 and weights 564 can be manually provided during the initiation of the model generation process. The parameters 562 and weights 564 can be updated and modified during the model generation phase to generate the prediction model 560. In some embodiments, weights may be applied to the parameter functions or prediction models themselves. For example, the mathematical complexity or the number of parameters included in a particular prediction model 560 may affect a weight for the particular prediction model 560, which may impact the generation of the model and/or a selection algorithm or a selection probability that the particular prediction model 560 is selected.
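As a sketch of how a set of parameters 562 and weights 564 might combine into a parameter function, a simple weighted sum is shown below. In practice the combination and the weights would be learned during training, and the parameter names and values here are hypothetical:

```python
def parameter_function(parameters, weights):
    """Combine parameters with their respective weights into a single score.
    A linear weighted sum is only one possible mathematical combination."""
    return sum(p * w for p, w in zip(parameters, weights))

# Hypothetical KPI-derived parameters: frame time (ms), VRAM (MB), crash rate.
params = [16.7, 512.0, 0.2]
# Hypothetical trained weights for each parameter.
weights = [0.5, 0.01, 100.0]

score = parameter_function(params, weights)
```

The resulting score could then feed a selection rule, such as choosing the candidate test whose predicted score is highest.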

The model generation system 146 can filter and categorize the historical data sets according to various characteristics and parameters of the data. For example, the data can be categorized by the data source (such as, for example, companion application interaction data, game application data, host application data, or user profile data), information type (such as, for example, utterance commands, utterance statements, utterance queries, gameplay information, transaction information, interaction information, or game account information), opponent data (such as, for example, skill of opponent, role selected or played by opponent, or success rate versus opponent), teammate data (such as, for example, skill of teammates, roles selected or played by teammates, or success rate when playing with a particular teammate), or other categories associated with the data. The model generation system 146 can filter the information to identify the information for further processing. In some embodiments, the model generation system 146 is configured to filter and separate the historical data 552 into a plurality of data types or categories before further processing. Moreover, in some cases, some of the historical data 552 may be filtered out or removed from the historical data 552 based on the data being associated with a relevance that does not satisfy a threshold relevance as determined by the model generation system 146.

After the prediction model 560 has been generated, the model can be used during runtime of the test selector 140 to select a test to perform on the video game 112. Further, the prediction model 560 may be used to modify the video game 112, to modify a state of the video game 112, to modify data provided to the video game 112, and/or to modify a test environment for the video game 112.

Example Test Selector

FIG. 6 illustrates an embodiment of a test selector 140 of FIG. 1. The test selector 140 can apply or use one or more of the prediction models 660 generated by the model generation system 146. Although illustrated as a separate system, in some cases, the features of the test selector 140 are performed by the test configuration system 142, the testing system 132, or the video game test system 130. The test selector 140 may use one or more prediction models 660A, 660B, 660N (which may be referred to collectively as “prediction models 660” or in the singular as “prediction model 660”) to process the input data 672 to obtain the output data 674.

The test selector 140 may apply the prediction model(s) 660 during determination of a test to perform on a video game 112 in response to obtaining KPI data from the video game or from a game engine used by the video game 112. In some cases, the prediction models 660 are applied after a trigger occurs. For example, the prediction models 660 may be used to select a test after determining that KPI data does not satisfy a trend for the KPI or is not within a threshold of the trend. The input data 672 can include one or more pieces of data associated with the video game 112 or that may be used to facilitate test selection, such as an expected number of simultaneous players, target minimum hardware requirements, or a game engine used in development of the video game 112.

In some embodiments, a single prediction model 660 may exist for the test selector 140. However, as illustrated, it is possible for the test selector 140 to include multiple prediction models 660. The test selector 140 can determine which prediction model, such as any of models 660A-N, to use based on input data 672 and/or additional identifiers associated with the input data 672. Additionally, the prediction model 660 selected may be selected based on the specific data provided as input data 672. The availability of particular types of data as part of the input data 672 can affect the selection of the prediction model 660. For example, the identification of a particular game engine as part of the input data may result in the use of prediction model 660A. However, if a game engine was not used, or if a new game engine not previously used in previous video games is used, then prediction model 660B may be used instead.
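The model-dispatch logic described above can be sketched as a simple lookup. This is a hedged, hypothetical sketch: the set of known engines and the return labels (mirroring the reference characters 660A and 660B) are illustrative only.

```python
# Hypothetical sketch of selecting among prediction models based on what
# is available in the input data 672. The known-engine set is an assumption.

KNOWN_ENGINES = {"engine_v1", "engine_v2"}  # engines used in previous video games

def select_prediction_model(input_data):
    """Return a model label based on the game engine field of the input data."""
    engine = input_data.get("game_engine")
    if engine in KNOWN_ENGINES:
        return "model_660A"  # a previously used engine was identified
    return "model_660B"      # no engine, or an engine not previously used

choice = select_prediction_model({"game_engine": "engine_v1"})
```

A new or missing engine falls through to the second model, matching the example in the text.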

The output data 674 can be a test selection. Alternatively, or in addition, the output data 674 can be an identification of a test environment or a particular KPI to measure or test. In some cases, the output data 674 is a value corresponding to a particular test to perform. For example, if the output is the value ‘1’, then test one should be performed or a particular KPI should be evaluated.

The prediction models 660A, 660B, 660N may generally include a set of one or more parameters 662A, 662B, 662N, respectively (which may be referred to collectively as “parameters 662”). Each set of parameters 662 (such as parameters 662A) may be combined using one or more mathematical functions to obtain a parameter function. Further, one or more specific parameters from the parameters 662A, 662B, 662N may be weighted by the weights 664A, 664B, 664N (which may be referred to collectively as “weights 664”). In some cases, the parameter function may be obtained by combining a set of parameters (such as the parameters 662A) with a respective set of weights 664 (such as the weights 664A).
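A minimal sketch of such a parameter function, assuming the simplest combining function (a weighted sum); the parameter names and numeric values are hypothetical, not from the patent.

```python
# Illustrative sketch: combine a set of parameters (cf. parameters 662)
# with a respective set of weights (cf. weights 664) into one score.
# A weighted sum is only one possible combining function.

def parameter_function(parameters, weights):
    """Weighted combination of named parameters."""
    return sum(weights[name] * value for name, value in parameters.items())

params = {"cpu_utilization": 0.8, "frame_rate": 60.0}
wts = {"cpu_utilization": 2.0, "frame_rate": 0.1}
score = parameter_function(params, wts)  # 2.0*0.8 + 0.1*60.0 = 7.6
```

In practice the combining function could be nonlinear; the point is only that parameters and weights pair up to produce a single parameter function.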

Example Prediction Model Generation Process

FIG. 7 presents a flowchart of an embodiment of a prediction model generation process. The process 700 can be implemented by any system that can generate one or more parameter functions or prediction models that include one or more parameters. In some cases, the process 700 serves as a training process for developing one or more parameter functions or prediction models based on historical data or other known data. The process 700, in whole or in part, can be implemented by, for example, an interactive computing environment 101, a test selector 140, a model generation system 146, or a user computing system 110, among others. Although any number of systems, in whole or in part, can implement the process 700, to simplify discussion, the process 700 will be described with respect to particular systems. Further, it should be understood that the process 700 may be updated or performed repeatedly over time. For example, the process 700 may be repeated once per month, with the addition or release of a new video game, an update to a game engine, a threshold number of modifications to a video game under development, or the identification of new KPIs of interest. However, the process 700 may be performed more or less frequently.

The process 700 begins at block 702 where the model generation system 146 receives historical data 652 comprising KPI data and/or the identity of tests performed on one or more video games. The historical data 652 may comprise data for video games that were developed using the same game engine. This historical data 652 may serve as training data for the model generation system 146. Further, the historical data 652 may include video game state information for previously tested video games or iterations of a video game.

At block 704, the model generation system 146 receives control data 556 indicating desired prediction criteria corresponding to the historical data 652. This control data 556 may indicate one or more features or characteristics for which the model generation system 146 is to determine a model. Alternatively, or in addition, the control data 556 may include a value for the features or characteristics that are associated with the received historical data 652. For example, the control data 556 may identify KPIs that are important to the development of a video game, such as processor utilization, number of frames per second that can be rendered, simultaneous users that can be supported by the video game, engagement level, churn rate, or retention rate, as the desired KPIs to be improved or to be tested. In some embodiments, the control data 556 may include multiple KPI metrics to be tested by a tester or test process.

At block 706, the model generation system 146 generates one or more prediction models 660 based on the historical data 652 and the control data 556. The prediction models 660 may include one or more variables or parameters 662 that can be combined using a mathematical algorithm or model generation ruleset 570 to generate a prediction model 660 based on the historical data 652 and, in some cases, the control data 556. Further, in certain embodiments, the block 706 may include applying one or more items of feedback data 554. For example, if the prediction model 660 is generated as part of a supervised machine learning process, a user (for example, an administrator) may provide one or more inputs to the model generation system 146 as the prediction model 660 is being generated and/or to refine the prediction model 660 generation process. For example, the user may be aware that an update was made to a game engine. In such a case, the user may supply feedback data 554 to reduce the weight of a portion of the historical data 652 that may correspond to data supplied for instances of the video game that used the prior version of the game engine. Further, in some cases, one or more of the variables or parameters may be weighted using, for example, weights 664. The value of the weight for a variable may be based at least in part on the impact the variable has in generating the prediction model 660 that satisfies, or satisfies within a threshold discrepancy, the control data 556 and/or the historical data 652. In some cases, the combination of the variables and weights may be used to generate a prediction model 660.
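The feedback-driven re-weighting described above (down-weighting training samples from a prior engine version) can be sketched as follows. The field names and the scaling factor are assumptions for illustration, not details from the patent.

```python
# Hypothetical sketch of applying user feedback (cf. feedback data 554) to
# reduce the weight of historical samples tied to an older game-engine
# version, so they have less impact on prediction model generation.

def apply_engine_feedback(samples, current_engine_version, down_weight=0.25):
    """Scale down the training weight of samples from other engine versions."""
    for sample in samples:
        if sample["engine_version"] != current_engine_version:
            sample["weight"] *= down_weight  # de-emphasize old-engine data
    return samples

samples = [
    {"engine_version": "2.0", "weight": 1.0},
    {"engine_version": "1.0", "weight": 1.0},
]
adjusted = apply_engine_feedback(samples, "2.0")
```

Only the sample from the prior engine version is scaled down; data from the current engine version keeps its full weight.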

At block 708, the model generation system 146 selects a prediction model 660 based at least in part on an accuracy of the prediction model 660 and, optionally, any associated penalty or weighting. In some embodiments, the model generation system 146 selects a prediction model 660 associated with a lower penalty compared to another prediction model 660. However, in some embodiments, the model generation system 146 may select a prediction model associated with a higher penalty if, for example, the output of the prediction model 660 is a threshold degree more accurate than the prediction model associated with the lower penalty. In certain embodiments, the block 708 may be optional or omitted. For example, in some cases, the prediction models 660 may not be associated with a penalty. In some such cases, a prediction model may be selected from a plurality of prediction models based on the accuracy of the output generated by the prediction model or may be selected at random.
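The selection rule at block 708 can be sketched as: prefer the lower-penalty model, but tolerate a higher penalty when accuracy improves by at least a threshold. The accuracy margin and candidate values below are assumptions for illustration.

```python
# Illustrative sketch of block-708 model selection: start from the
# lowest-penalty candidate, then switch to a higher-penalty candidate only
# if it is more accurate by at least accuracy_margin (an assumed threshold).

def select_model(candidates, accuracy_margin=0.05):
    """candidates: list of dicts with 'accuracy' and 'penalty' keys."""
    best = min(candidates, key=lambda m: m["penalty"])
    for m in candidates:
        if m["accuracy"] >= best["accuracy"] + accuracy_margin:
            best = m  # higher penalty tolerated for markedly better accuracy
    return best

close_race = [
    {"name": "A", "penalty": 1.0, "accuracy": 0.80},
    {"name": "B", "penalty": 2.0, "accuracy": 0.90},
]
picked = select_model(close_race)
```

With a 10-point accuracy gap, the higher-penalty model B wins; if B were only marginally more accurate, the lower-penalty model A would be kept.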

Overview of Computing System

FIG. 8 illustrates an embodiment of a user computing system 110, which may also be referred to as a gaming system. As illustrated, the user computing system 110 may be a single computing device that can include a number of elements. However, in some cases, the user computing system 110 may include multiple devices. For example, the user computing system 110 may include one device that includes a central processing unit and a graphics processing unit, another device that includes a display, and another device that includes an input mechanism, such as a keyboard or mouse.

The user computing system 110 can be an embodiment of a computing system that can execute a game system. In the non-limiting example of FIG. 8, the user computing system 110 is a touch-capable computing device capable of receiving input from a user via a touchscreen display 802. However, the user computing system 110 is not limited as such and may include non-touch capable embodiments, which do not include a touchscreen display 802.

The user computing system 110 includes a touchscreen display 802 and a touchscreen interface 804, and is configured to execute a game application. This game application 810 may be the video game 112. Although described as a game application 810, in some embodiments the application 810 may be another type of application that may be capable of interacting with multiple users across multiple user computing systems, such as educational software or language software. While the user computing system 110 includes the touchscreen display 802, it is recognized that a variety of input devices may be used in addition to or in place of the touchscreen display 802.

The user computing system 110 can include one or more processors, such as central processing units (CPUs), graphics processing units (GPUs), and accelerated processing units (APUs). Further, the user computing system 110 may include one or more data storage elements. In addition, the user computing system 110 may include one or more volatile memory elements, such as random-access memory (RAM). In some embodiments, the user computing system 110 can be a specialized computing device created for the purpose of executing game applications 810. For example, the user computing system 110 may be a video game console. The game applications 810 executed by the user computing system 110 may be created using a particular application programming interface (API) or compiled into a particular instruction set that may be specific to the user computing system 110. In some embodiments, the user computing system 110 may be a general purpose computing device capable of executing game applications 810 and non-game applications. For example, the user computing system 110 may be a laptop with an integrated touchscreen display or a desktop computer with an external touchscreen display. Components of an example embodiment of a user computing system 110 are described in more detail with respect to FIG. 9.

The touchscreen display 802 can be a capacitive touchscreen, a resistive touchscreen, a surface acoustic wave touchscreen, or another type of touchscreen technology that is configured to receive tactile inputs, also referred to as touch inputs, from a user. For example, the touch inputs can be received via a finger touching the screen, multiple fingers touching the screen, a stylus, or other stimuli that can be used to register a touch input on the touchscreen display 802. The touchscreen interface 804 can be configured to translate the touch input into data and output the data such that it can be interpreted by components of the user computing system 110, such as an operating system and the game application 810. The touchscreen interface 804 can translate characteristics of the tactile touch input into touch input data. Some example characteristics of a touch input can include shape, size, pressure, location, direction, momentum, duration, and/or other characteristics. The touchscreen interface 804 can be configured to determine the type of touch input, such as, for example, a tap (for example, touch and release at a single location) or a swipe (for example, movement through a plurality of locations on the touchscreen in a single touch input). The touchscreen interface 804 can be configured to detect and output touch input data associated with multiple touch inputs occurring simultaneously or substantially in parallel. In some cases, the simultaneous touch inputs may include instances where a user maintains a first touch on the touchscreen display 802 while subsequently performing a second touch on the touchscreen display 802. The touchscreen interface 804 can be configured to detect movement of the touch inputs. The touch input data can be transmitted to components of the user computing system 110 for processing. For example, the touch input data can be transmitted directly to the game application 810 for processing.

In some embodiments, the touch input data can undergo processing and/or filtering by the touchscreen interface 804, an operating system, or other components prior to being output to the game application 810. As one example, raw touch input data can be captured from a touch input. The raw data can be filtered to remove background noise, pressure values associated with the input can be measured, and location coordinates associated with the touch input can be calculated. The type of touch input data provided to the game application 810 can be dependent upon the specific implementation of the touchscreen interface 804 and the particular API associated with the touchscreen interface 804. In some embodiments, the touch input data can include location coordinates of the touch input. The touch signal data can be output at a defined frequency. Processing the touch inputs can be computed many times per second and the touch input data can be output to the game application for further processing.
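The tap-versus-swipe distinction described above can be sketched with a simple distance test over the sampled touch locations. This is a hypothetical illustration: real touchscreen interfaces use platform-specific APIs, and the distance threshold here is an assumption.

```python
# Illustrative sketch of classifying a touch input from its characteristics:
# a touch released near where it began is a "tap"; movement through a
# plurality of locations is a "swipe". The tap radius is an assumed value.

import math

def classify_touch(path, tap_radius=10.0):
    """path: list of (x, y) location samples from touch-down to release."""
    (x0, y0) = path[0]
    (x1, y1) = path[-1]
    if math.hypot(x1 - x0, y1 - y0) <= tap_radius:
        return "tap"    # touch and release at effectively a single location
    return "swipe"      # movement across the touchscreen in a single input
```

A fuller implementation would also use pressure, duration, and momentum, as listed in the characteristics above.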

A game application 810 can be configured to be executed on the user computing system 110. The game application 810 may also be referred to as a video game, a game, game code, and/or a game program. A game application should be understood to include software code that a user computing system 110 can use to provide a game for a user to play. A game application 810 might comprise software code that informs a user computing system 110 of processor instructions to execute, but might also include data used in the playing of the game, such as data relating to constants, images, and other data structures. For example, in the illustrated embodiment, the game application includes a game engine 812, game data 814, and game state information 816. As previously stated, the embodiments described herein may be used for applications other than video games, such as educational software or videoconferencing. Thus, in some such cases, the game application 810 may be substituted with other types of applications that may involve multiple users communicating over a network and selecting a server, or one of the plurality of user computing systems, to act as a host.

The touchscreen interface 804 or another component of the user computing system 110, such as the operating system, can provide user input, such as touch inputs, to the game application 810. In some embodiments, the user computing system 110 may include alternative or additional user input devices, such as a mouse, a keyboard, a camera, a game controller, and the like. Further, the user computing system 110 may include a virtual reality display and/or an augmented reality display. A user can interact with the game application 810 via the touchscreen interface 804 and/or one or more of the alternative or additional user input devices. The game engine 812 can be configured to execute aspects of the operation of the game application 810 within the user computing system 110. Execution of aspects of gameplay within a game application can be based, at least in part, on the user input received, the game data 814, and game state information 816. The game data 814 can include game rules, prerecorded motion capture poses/paths, environmental settings, constraints, animation reference curves, skeleton models, and/or other game application information. Further, the game data 814 may include information that is used to set or adjust the difficulty of the game application 810.

The game engine 812 can execute gameplay within the game according to the game rules. Some examples of game rules can include rules for scoring, possible inputs, actions/events, movement in response to inputs, and the like. Other components can control what inputs are accepted and how the game progresses, and other aspects of gameplay. During execution of the game application 810, the game application 810 can store game state information 816, which can include character states, environment states, scene object storage, and/or other information associated with a state of execution of the game application 810. For example, the game state information 816 can identify the state of the game application at a specific point in time, such as a character position, character action, game level attributes, and other information contributing to a state of the game application.

The game engine 812 can receive the user inputs and determine in-game events, such as actions, collisions, runs, throws, attacks, and other events appropriate for the game application 810. During operation, the game engine 812 can read in game data 814 and game state information 816 in order to determine the appropriate in-game events. In one example, after the game engine 812 determines the character events, the character events can be conveyed to a movement engine that can determine the appropriate motions the characters should make in response to the events and passes those motions on to an animation engine. The animation engine can determine new poses for the characters and provide the new poses to a skinning and rendering engine. The skinning and rendering engine, in turn, can provide character images to an object combiner in order to combine animate, inanimate, and background objects into a full scene. The full scene can be conveyed to a renderer, which can generate a new frame for display to the user. The process can be repeated for rendering each frame during execution of the game application. Though the process has been described in the context of a character, the process can be applied to any process for processing events and rendering the output for display to a user.
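The per-frame pipeline described above (events, movement engine, animation engine, skinning and rendering, object combiner, renderer) can be sketched as a chain of stages. Each stage below is a stand-in function with illustrative string outputs; a real engine would pass structured pose and image data.

```python
# Non-authoritative sketch of the per-frame pipeline: character events flow
# through movement, animation, skinning/rendering, and object combination
# to produce a full scene for the renderer. All stage bodies are stand-ins.

def movement_engine(events):
    return [f"motion:{e}" for e in events]          # events -> motions

def animation_engine(motions):
    return [f"pose:{m}" for m in motions]           # motions -> new poses

def skin_and_render(poses):
    return [f"image:{p}" for p in poses]            # poses -> character images

def combine_objects(images, background="background"):
    return images + [f"image:{background}"]         # add inanimate/background

def render_frame(events):
    images = skin_and_render(animation_engine(movement_engine(events)))
    return combine_objects(images)                  # full scene to the renderer

frame = render_frame(["jump"])
```

Repeating `render_frame` once per display refresh mirrors the "repeated for rendering each frame" step in the text.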

In some cases, at least some of the video game engine 812 may reside on a server, such as one of the video game servers 152. Further, in some cases, the complete video game engine 812 may reside on the server. Thus, in some cases, the video game engine 812 may be omitted from the portion of the video game application 810 hosted on the user computing system 110. Similarly, in some embodiments, video game state information 816 and video game data 814 may be hosted on a server in addition to or instead of on the user computing system 110. Further, in some cases, actions of the user performed within the video game application 810 may be transmitted to a server that is hosting a portion of the video game 810. The server may compute or determine the result of the user's interaction with respect to the video game application 810, such as collisions, attacks, or movements. The server may then send a result of the user's actions to the video game application 810 on the user computing system 110. The video game application 810 may then perform an action in response to the result, such as displaying the result to the user.
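The server-authoritative flow described above can be sketched as follows. This is a hypothetical illustration: the function names, game-state fields, and outcome labels are assumptions, and the direct function call stands in for a network round trip.

```python
# Illustrative sketch of the client/server split: the client transmits a
# user action, the server computes the result, and the client displays it.
# The direct call to server_resolve stands in for network transport.

def server_resolve(action, game_state):
    """Server-side: determine the outcome of the user's action."""
    if action == "attack" and game_state.get("target_in_range"):
        return {"event": "hit"}
    return {"event": "miss"}

def client_handle(action, game_state):
    """Client-side: send the action, then act on the server's result."""
    result = server_resolve(action, game_state)
    return f"display:{result['event']}"   # e.g., show the result to the user
```

Keeping the resolution on the server matches the text: the client only renders whatever result the server returns.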

Example Hardware Configuration of Computing System

FIG. 9 illustrates an embodiment of a hardware configuration for the user computing system 110 of FIG. 8. Other variations of the user computing system 110 may be substituted for the examples explicitly presented herein, such as removing or adding components to the user computing system 110. The user computing system 110 may include a dedicated game device, a smart phone, a tablet, a personal computer, a desktop, a laptop, a smart television, a car console display, and the like. Further (although not explicitly illustrated in FIG. 9), as described with respect to FIG. 8, the user computing system 110 may optionally include a touchscreen display 802 and a touchscreen interface 804.

As shown, the user computing system 110 includes a processing unit 20 that interacts with other components of the user computing system 110 and also components external to the user computing system 110. A game media reader 22 may be included that can communicate with game media 12. The game media reader 22 may be an optical disc reader capable of reading optical discs, such as CD-ROMs or DVDs, or any other type of reader that can receive and read data from game media 12. In some embodiments, the game media reader 22 may be optional or omitted. For example, game content or applications may be accessed over a network via the network I/O 38, rendering the game media reader 22 and/or the game media 12 optional.

The user computing system 110 may include a separate graphics processor 24. In some cases, the graphics processor 24 may be built into the processing unit 20, such as with an APU. In some such cases, the graphics processor 24 may share Random Access Memory (RAM) with the processing unit 20. Alternatively, or in addition, the user computing system 110 may include a discrete graphics processor 24 that is separate from the processing unit 20. In some such cases, the graphics processor 24 may have separate RAM from the processing unit 20. Further, in some cases, the graphics processor 24 may work in conjunction with one or more additional graphics processors and/or with an embedded or non-discrete graphics processing unit, which may be embedded into a motherboard and which is sometimes referred to as an on-board graphics chip or device.

The user computing system 110 also includes various components for enabling input/output, such as an I/O 32, a user I/O 34, a display I/O 36, and a network I/O 38. As previously described, the input/output components may, in some cases, include touch-enabled devices. The I/O 32 interacts with a storage element 40 and, through a device 42, removable storage media 44 in order to provide storage for the user computing system 110. The processing unit 20 can communicate through the I/O 32 to store data, such as game state data and any shared data files. In addition to storage 40 and removable storage media 44, the user computing system 110 is also shown including ROM (Read-Only Memory) 46 and RAM 48. RAM 48 may be used for data that is accessed frequently, such as when a game is being played, or for all data that is accessed by the processing unit 20 and/or the graphics processor 24.

User I/O 34 is used to send and receive commands between the processing unit 20 and user devices, such as game controllers. In some embodiments, the user I/O 34 can include touchscreen inputs. As previously described, the touchscreen can be a capacitive touchscreen, a resistive touchscreen, or another type of touchscreen technology that is configured to receive user input through tactile inputs from the user. Display I/O 36 provides input/output functions that are used to display images from the game being played. Network I/O 38 is used for input/output functions for a network. Network I/O 38 may be used during execution of a game, such as when a game is being played online or being accessed online.

Display output signals may be produced by the display I/O 36 and can include signals for displaying visual content produced by the user computing system 110 on a display device, such as graphics, user interfaces, video, and/or other visual content. The user computing system 110 may comprise one or more integrated displays configured to receive display output signals produced by the display I/O 36, which may be output for display to a user. According to some embodiments, display output signals produced by the display I/O 36 may also be output to one or more display devices external to the user computing system 110.

The user computing system 110 can also include other features that may be used with a game, such as a clock 50, flash memory 52, and other components. An audio/video player 56 might also be used to play a video sequence, such as a movie. It should be understood that other components may be provided in the user computing system 110 and that a person skilled in the art will appreciate other variations of the user computing system 110.

Program code can be stored in ROM 46, RAM 48, or storage 40 (which might comprise a hard disk, other magnetic storage, optical storage, solid state drives, and/or other non-volatile storage, or a combination or variation of these). At least part of the program code can be stored in ROM that is programmable (ROM, PROM, EPROM, EEPROM, and so forth), in storage 40, and/or on removable media such as game media 12 (which can be a CD-ROM, cartridge, memory chip, or the like, or obtained over a network or other electronic channel as needed). In general, program code can be found embodied in a tangible non-transitory signal-bearing medium.

Random access memory (RAM) 48 (and possibly other storage) is usable to store variables and other game and processor data as needed. RAM is used and holds data that is generated during the play of the game, and portions thereof might also be reserved for frame buffers, game state, and/or other data needed or usable for interpreting user input and generating game displays. Generally, RAM 48 is volatile storage, and data stored within RAM 48 may be lost when the user computing system 110 is turned off or loses power.

As the user computing system 110 reads game media 12 and provides a game, information may be read from game media 12 and stored in a memory device, such as RAM 48. Additionally, data from storage 40, ROM 46, servers accessed via a network (not shown), or removable storage media 44 may be read and loaded into RAM 48. Although data is described as being found in RAM 48, it will be understood that data does not have to be stored in RAM 48 and may be stored in other memory accessible to the processing unit 20 or distributed among several media, such as game media 12 and storage 40.

It is to be understood that not necessarily all objects or advantages may be achieved in accordance with any particular embodiment described herein. Thus, for example, those skilled in the art will recognize that certain embodiments may be configured to operate in a manner that achieves, increases, or optimizes one advantage or group of advantages as taught herein without necessarily achieving other objects or advantages as may be taught or suggested herein.

All of the processes described herein may be embodied in, and fully automated via, software code modules executed by a computing system that includes one or more computers or processors. The code modules may be stored in any type of non-transitory computer-readable medium or other computer storage device. Some or all the methods may be embodied in specialized computer hardware.

Many other variations than those described herein will be apparent from this disclosure. For example, depending on the embodiment, certain acts, events, or functions of any of the algorithms described herein can be performed in a different sequence, can be added, merged, or left out altogether (for example, not all described acts or events are necessary for the practice of the algorithms). Moreover, in certain embodiments, acts or events can be performed concurrently, for example, through multi-threaded processing, interrupt processing, or multiple processors or processor cores or on other parallel architectures, rather than sequentially. In addition, different tasks or processes can be performed by different machines and/or computing systems that can function together.

The various illustrative logical blocks and modules described in connection with the embodiments disclosed herein can be implemented or performed by a machine, such as a processing unit or processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A processor can be a microprocessor, but in the alternative, the processor can be a controller, microcontroller, or state machine, combinations of the same, or the like. A processor can include electrical circuitry configured to process computer-executable instructions. In another embodiment, a processor includes an FPGA or other programmable device that performs logic operations without processing computer-executable instructions. A processor can also be implemented as a combination of computing devices, for example, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Although described herein primarily with respect to digital technology, a processor may also include primarily analog components. A computing environment can include any type of computer system, including, but not limited to, a computer system based on a microprocessor, a mainframe computer, a digital signal processor, a portable computing device, a device controller, or a computational engine within an appliance, to name a few.

Conditional language such as, among others, “can,” “could,” “might” or “may,” unless specifically stated otherwise, is otherwise understood within the context as used in general to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.

Disjunctive language such as the phrase “at least one of X, Y, or Z,” unless specifically stated otherwise, is otherwise understood with the context as used in general to present that an item, term, etc., may be either X, Y, or Z, or any combination thereof (for example, X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain embodiments require at least one of X, at least one of Y, or at least one of Z to each be present.

Any process descriptions, elements or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or elements in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown, or discussed, including substantially concurrently or in reverse order, depending on the functionality involved as would be understood by those skilled in the art.

Unless otherwise explicitly stated, articles such as “a” or “an” should generally be interpreted to include one or more described items. Accordingly, phrases such as “a device configured to” are intended to include one or more recited devices. Such one or more recited devices can also be collectively configured to carry out the stated recitations. For example, “a processor configured to carry out recitations A, B and C” can include a first processor configured to carry out recitation A working in conjunction with a second processor configured to carry out recitations B and C.

It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure.

Claims

  1. A computer-implemented method comprising: as implemented by an interactive computing system configured with specific computer-executable instructions,
    receiving a byte stream for a test session of a video game, the test session of the video game generating a first volume of data exceeding a threshold quantity, and associated with an execution of an instance of the video game;
    decoding a portion of the byte stream into a data object, the data object comprising data encapsulating an event associated with the execution of the instance of the video game;
    determining that the data object corresponds to a key performance indicator;
    generating a plurality of prediction models by using as training data historical data associated with the key performance indicator, the historical data including a data portion associated with execution of one or more of a plurality of additional test sessions of the video game on an older version of a video game engine than a version of the video game engine on which the test session was executed;
    refining generation of the plurality of prediction models by reducing a weight of at least one weighted parameter associated with the data portion of the historical data corresponding to the older version of the video game engine in response to receiving user feedback to reduce the weight, thereby decreasing impact of the data portion of the historical data on the generation of the plurality of prediction models;
    aggregating the data object with additional data objects decoded from the byte stream that correspond to the key performance indicator to obtain a first set of aggregated data objects, the first set of aggregated data objects associated with a second volume of data that is smaller than and is a subset of the first volume of data, the second volume of data not exceeding the threshold quantity;
    evaluating the first set of aggregated data objects against a trend generated from an occurrence of the plurality of additional test sessions of the video game;
    in response to determining that the first set of aggregated data objects does not satisfy the trend, selecting a first test procedure based at least in part on a result of said evaluating as an active test procedure to be applied to the video game, wherein selecting the first test procedure comprises:
      applying the plurality of prediction models to the first set of aggregated data objects, each prediction model having weighted parameters and each prediction model configured to output a particular test procedure;
      selecting, using an automated test selector and without user input, a first prediction model from the plurality of prediction models, wherein the automated test selector is configured to select, without user input, different prediction models from the plurality of prediction models, and wherein the automated test selector is generated based at least in part on a machine learning algorithm configured to select as the first prediction model a prediction model associated with a highest accuracy of generated output of the plurality of prediction models;
      and using the particular test procedure output by the first prediction model as the active test procedure;
    performing the active test procedure on the video game automatically and without user input, wherein performing the active test procedure comprises:
      modifying code corresponding to the video game;
      and initiating another test session of the video game using the modified code of the video game;
    and in response to determining that a second set of aggregated data objects determined for the first test procedure does not satisfy the trend:
      repeating the selecting to select a second test procedure different from the first test procedure, wherein selecting the second test procedure comprises selecting from the plurality of prediction models a second prediction model associated with a next highest accuracy of generated output;
      and performing the second test procedure on the video game automatically and without user input.
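Claim 1 recites, in effect, a feedback loop: when aggregated key-performance-indicator (KPI) data misses a trend, an automated test selector applies the prediction model with the highest measured accuracy, runs the test procedure that model outputs, and falls back to the next-highest-accuracy model if the trend is still not satisfied. The patent discloses no source code, so the following Python sketch is illustrative only; every name (`PredictionModel`, `run_test_loop`, the trend ceiling, the `run_session` callback) is hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, List, Set

@dataclass
class PredictionModel:
    accuracy: float        # measured accuracy of this model's past output
    test_procedure: str    # the test procedure this model is configured to output

def select_test_procedure(models: List[PredictionModel],
                          tried: Set[str]) -> PredictionModel:
    """Automated test selector: pick the untried model with the highest accuracy."""
    candidates = [m for m in models if m.test_procedure not in tried]
    return max(candidates, key=lambda m: m.accuracy)

def run_test_loop(aggregated_kpi: float,
                  trend_ceiling: float,
                  models: List[PredictionModel],
                  run_session: Callable[[str], float]) -> List[str]:
    """While the aggregated KPI exceeds the trend ceiling, apply test
    procedures in descending model-accuracy order, re-measuring the KPI
    from the new test session after each procedure."""
    applied: List[str] = []
    tried: Set[str] = set()
    while aggregated_kpi > trend_ceiling and len(tried) < len(models):
        model = select_test_procedure(models, tried)
        tried.add(model.test_procedure)
        applied.append(model.test_procedure)
        # "Performing the active test procedure": a new session on modified code.
        aggregated_kpi = run_session(model.test_procedure)
    return applied
```

The "does not satisfy the trend" predicate is modeled here as a simple ceiling check on a scalar KPI; the claim itself leaves the form of the trend comparison open.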
  2. The computer-implemented method of claim 1, wherein said decoding of the portion of the byte stream occurs as one or more additional portions of the byte stream are being received.
  3. The computer-implemented method of claim 1, wherein decoding the portion of the byte stream into the data object comprises decoding the portion of the byte stream into a human-readable format.
  4. The computer-implemented method of claim 1, wherein evaluating the first set of aggregated data further comprises: determining, based at least in part on the first set of aggregated data objects, a value corresponding to the key performance indicator, the value comprising a statistical value generated from the first set of aggregated data objects; and comparing the value to the trend.
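The statistical-value evaluation recited in the claim above reduces the aggregated data objects to a single statistic and compares it to the trend. The claim does not specify the statistic or the comparison, so the Python sketch below assumes a mean and a tolerance band of two sample standard deviations around the historical mean; both choices are assumptions for illustration.

```python
import statistics
from typing import Sequence

def kpi_statistic(aggregated_values: Sequence[float]) -> float:
    """Reduce aggregated KPI data objects to a single statistical value (here, the mean)."""
    return statistics.mean(aggregated_values)

def satisfies_trend(value: float,
                    historical_values: Sequence[float],
                    tolerance: float = 2.0) -> bool:
    """Compare the statistic to a trend derived from prior test sessions:
    satisfied when within `tolerance` sample standard deviations of the
    historical mean (an assumed definition of 'satisfying the trend')."""
    mean = statistics.mean(historical_values)
    stdev = statistics.stdev(historical_values)
    return abs(value - mean) <= tolerance * stdev
```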
  5. The computer-implemented method of claim 1, wherein the active test procedure comprises: modifying data supplied to the video game during execution of the video game; and initiating a second test session of the video game using the modified data.
  6. The computer-implemented method of claim 1, wherein applying the plurality of prediction models to the first set of aggregated data objects comprises evaluating the first set of aggregated data objects with a parameter function generated based at least in part on an additional machine learning algorithm.
  7. The computer-implemented method of claim 1, wherein the event comprises a utilization of at least a portion of a computing resource in response to one or more operations performed by the video game during execution of the test session of the video game.
  8. The computer-implemented method of claim 1, wherein the key performance indicator comprises a metric associated with utilization of a computing resource during the execution of the instance of the video game.
  9. The computer-implemented method of claim 1, wherein the data object and the additional data objects correspond to utilization of a particular computing resource during the execution of the instance of the video game.
  10. The computer-implemented method of claim 1, further comprising: aggregating the first set of aggregated data objects with data objects generated from additional test sessions of the video game to obtain multi-session aggregated data; and selecting the active test procedure based at least in part on the multi-session aggregated data.
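The multi-session aggregation recited in the final method claim above combines per-session KPI data into a cross-session view. A minimal Python sketch, assuming each session's data is a mapping from KPI name to a list of measured values (the KPI names and shape of the data are illustrative assumptions):

```python
from collections import defaultdict
from typing import Dict, Iterable, List

def aggregate_sessions(
        per_session_objects: Iterable[Dict[str, List[float]]]
) -> Dict[str, List[float]]:
    """Merge KPI data objects from multiple test sessions into
    multi-session aggregates keyed by KPI name."""
    merged: Dict[str, List[float]] = defaultdict(list)
    for session in per_session_objects:
        for kpi_name, values in session.items():
            merged[kpi_name].extend(values)
    return dict(merged)
```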
  11. A system comprising:
    an electronic data store configured to store data objects corresponding to key performance indicator data;
    and an interactive computing system comprising one or more hardware processors, the interactive computing system configured to execute specific computer-executable instructions to at least:
      receive a byte stream for a test session of a video game, the test session of the video game associated with an execution of an instance of the video game;
      decode a portion of the byte stream into a data object, the data object comprising data encapsulating an event associated with the execution of the instance of the video game;
      determine that the data object corresponds to a key performance indicator;
      generate a plurality of prediction models by using as training data historical data associated with the key performance indicator, the historical data including a data portion associated with execution of one or more of a plurality of additional test sessions of the video game on an older version of a video game engine than a version of the video game engine on which the test session was executed;
      refine generation of the plurality of prediction models by reducing a weight of at least one weighted parameter associated with the data portion of the historical data corresponding to the older version of the video game engine in response to receiving user feedback to reduce the weight, thereby decreasing impact of the data portion of the historical data on the generation of the plurality of prediction models;
      store the data object at the electronic data store;
      aggregate the data object with additional data objects stored at the electronic data store that correspond to the key performance indicator to obtain a first set of aggregated data objects;
      generate a user interface element based at least in part on the set of aggregated data objects;
      output the user interface element for display to a user;
      evaluate the first set of aggregated data objects against a trend generated from an occurrence of the plurality of additional test sessions of the video game;
      in response to a determination that the first set of aggregated data objects does not satisfy the trend, select a first test procedure based at least in part on a result of said evaluating by at least:
        applying the plurality of prediction models to the first set of aggregated data objects, each prediction model having weighted parameters and each prediction model configured to output a particular test procedure;
        selecting, using an automated test selector and without user input, a first of the prediction models from the plurality of prediction models, wherein the automated test selector is configured to select, without user input, different prediction models from the plurality of prediction models, and wherein the automated test selector is generated based at least in part on a machine learning algorithm configured to select the first prediction model as a prediction model associated with a highest accuracy of generated output of the plurality of prediction models;
        and using the particular test procedure output by the first prediction model as the test procedure;
      perform the test procedure on the video game by:
        modifying code corresponding to the video game;
        and initiating another test session of the video game using the modified code of the video game;
      and in response to a determination that a second set of aggregated data objects determined for the first test procedure does not satisfy the trend:
        repeat the selecting to select a second test procedure different from the first test procedure, wherein selecting the second test procedure comprises selecting from the plurality of prediction models a second prediction model associated with a next highest accuracy of generated output;
        and perform the second test procedure on the video game.
  12. The system of claim 11, wherein the interactive computing system is further configured to evaluate the first set of aggregated data objects by at least: determining, based at least in part on the first set of aggregated data objects, a statistical value corresponding to the key performance indicator; and determining whether the statistical value satisfies the trend.
  13. The system of claim 11, wherein the interactive computing system is further configured to perform the selected test procedure by at least: obtaining a modified test environment by modifying code corresponding to the video game, modifying data accessed by the video game, or modifying computing resources available to the video game; and initiating a second test session of the video game using the modified test environment.
  14. The system of claim 11, wherein applying the plurality of prediction models to the first set of aggregated data objects comprises evaluating the first set of aggregated data objects with a parameter function generated based at least in part on an additional machine learning algorithm.
  15. The system of claim 11, wherein the event corresponds to generation of a frame of video output by the video game.
  16. The system of claim 11, wherein the key performance indicator comprises a metric associated with performance of the instance of the video game with respect to computing resources available to the instance of the video game.
  17. A non-transitory computer-readable storage medium storing computer executable instructions that, when executed by one or more computing devices, configure the one or more computing devices to perform operations comprising:
    receiving a byte stream for a test session of a video game, the test session of the video game associated with an execution of an instance of the video game;
    decoding a portion of the byte stream into a data object, the data object comprising data encapsulating an event associated with the execution of the instance of the video game, wherein decoding the portion of the byte stream into the data object transforms data from a machine-readable format to a human-readable format;
    determining that the data object corresponds to a key performance indicator;
    generating a plurality of prediction models by using as training data historical data associated with the key performance indicator, the historical data including a data portion associated with execution of one or more of a plurality of additional test sessions of the video game on an older version of a video game engine than a version of the video game engine on which the test session was executed;
    refining generation of the plurality of prediction models by reducing a weight of at least one weighted parameter associated with the data portion of the historical data corresponding to the older version of the video game engine in response to receiving user feedback to reduce the weight, thereby decreasing impact of the data portion of the historical data on the generation of the plurality of prediction models;
    aggregating the data object with additional data objects that correspond to the key performance indicator to obtain a first set of aggregated data objects;
    outputting the first set of aggregated data objects as a performance metric corresponding to performance of the test session of the video game;
    evaluating the first set of aggregated data objects against a trend generated from an occurrence of the plurality of additional test sessions of the video game;
    in response to determining that the first set of aggregated data objects does not satisfy the trend, selecting a first modified test procedure based at least in part on a result of said evaluating, wherein selecting the first modified test procedure comprises:
      applying the plurality of prediction models to the set of aggregated data objects, each prediction model having weighted parameters and each prediction model configured to output a particular test procedure;
      selecting, using an automated test selector and without user input, a particular prediction model from the plurality of prediction models, wherein the automated test selector is configured to select, without user input, different prediction models from the plurality of prediction models, and wherein the automated test selector is generated based at least in part on a machine learning algorithm configured to select as the particular prediction model a prediction model associated with a highest accuracy of generated output of the plurality of prediction models;
      using the particular test procedure output by the particular prediction model as the first modified test procedure;
      and performing the first modified test procedure on the video game by:
        modifying code corresponding to the video game;
        and initiating another test session of the video game using the modified code of the video game;
    and in response to determining that a second set of aggregated data objects determined for the first modified test procedure does not satisfy the trend:
      repeating the selecting to select a second modified test procedure different from the first modified test procedure, wherein selecting the second modified test procedure comprises selecting from the plurality of prediction models a second prediction model associated with a next highest accuracy of generated output;
      and performing the second modified test procedure on the video game.
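The medium claim above recites decoding a portion of the byte stream from a machine-readable format into a human-readable data object. The patent does not specify a wire format, so the sketch below assumes a purely illustrative fixed-size record (4-byte event id, two 8-byte doubles) and a hypothetical event-id table:

```python
import struct

# Assumed wire format (illustrative only): little-endian 4-byte event id,
# 8-byte double timestamp, 8-byte double measured value.
RECORD = struct.Struct("<Idd")

EVENT_NAMES = {1: "frame_rendered", 2: "ram_allocated"}  # hypothetical ids

def decode_record(portion: bytes) -> dict:
    """Decode one portion of the byte stream into a human-readable data object."""
    event_id, timestamp, value = RECORD.unpack(portion)
    return {
        "event": EVENT_NAMES.get(event_id, f"unknown_{event_id}"),
        "timestamp": timestamp,
        "value": value,
    }
```

Because the record size is fixed, such decoding can proceed as further portions of the stream are still arriving, consistent with the streaming decoding recited in the method claims.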
