U.S. Pat. No. 11,517,816
SYSTEM FOR TESTING COMMAND EXECUTION LATENCY WITHIN A VIDEO GAME
Assignee: Electronic Arts Inc.
Issue Date: May 6, 2021
Illustrative Figure
Abstract
A video game test system can determine an objective measure of elapsed time between interaction with a video game controller and the occurrence of a particular event within the video game. This objective measure enables a tester to determine whether a video game is objectively operating slowly or just feels slow to the tester, and may indicate the existence of coding errors that may affect execution speed, but not cause visible errors. The system may obtain the objective measure of elapsed time by simulating a user's interaction with the video game. Further, the system may identify data embedded into a frame of an animation by the video game source code to identify the occurrence of a corresponding event. The system can then measure the elapsed time between the simulated user interaction and the occurrence or triggering of the corresponding event.
Description
DETAILED DESCRIPTION OF EMBODIMENTS
Introduction
One aspect of video games that can be important to test is responsiveness of the video game. Responsiveness of the video game may relate to an amount of time between when a user interacts with a video game and a corresponding action is performed by the video game. This time difference may be referred to as a “latency” or “command latency” of the video game. For example, it can be important to determine an amount of time between the user pressing a button of a user interface device (for example, a controller of a video game console or a keyboard of a laptop) and the occurrence of a corresponding event or action, such as the firing of a bullet, the accelerating of a vehicle, or a shooting of a basketball.
There are a number of reasons that an interaction with the video game and the occurrence of an event may have a particular latency. Some of the causes of latency may be due to coding errors or design errors. For example, the code may include incorrect function calls, errors in configuring certain state variables, poor or less than optimal object design, the selection of less efficient data structures than substitute options, the selection of less efficient methods, functions, or libraries than substitute options, the use of less efficient code than other possible code options, or use of deprecated functionality within a game engine or a programming language. Other causes of latency may be due to decisions that may not necessarily be errors, but can modify the latency between interaction with the video game and the occurrence of an event. For example, an animator may extend the length of an animation associated with an event, such as shooting a basketball. This extended animation may make the shot appear more smooth, but may delay the response to interactions by the user with the user interface device. In the basketball shooting example, there is not necessarily an error that relates to extended latency, but instead the latency may relate to the design choice between a smoother animation and a faster response time.
Moreover, for different video games, there may be a different acceptable level of latency between issuance of a command, such as through interaction with the user interface device, and the triggering of an action or event. In some cases, it may even be desirable to have a greater amount of latency. For example, latency may be introduced to mimic environmental effects, such as an increase in gravity, to reflect an injury to a user playable character, or to instill a certain atmosphere or sense of dread (for example, latency may be purposefully introduced at points where a monster, such as a zombie, is chasing the user playable character). As another example, in a racing game latency may be introduced between different vehicles to reflect different levels of acceleration. In certain embodiments, some actions or input sequences may occur more quickly than other actions or input sequences, which can create a feeling of inconsistency in a responsiveness of the video game. Thus, to address this inconsistency in responsiveness of input sequences, in certain embodiments, latency may be added to create delay consistency across a plurality of input sequences.
Regardless of whether the latency is intentional or not, it is important for users, such as designers or testers, to be able to determine a latency between an interaction with a video game and an event triggered by the interaction with the video game. Often a tester will play different iterations of a video game to determine a latency between interacting with the video game and a corresponding event associated with the interaction with the video game. However, this is often not sufficient. In some cases, a user's evaluation of the responsiveness of a video game, or the latency between a user interaction with the video game and the occurrence of a corresponding event in the video game, is subjective. For example, one user may feel the responsiveness of a video game is slow, or feels sluggish, while another user may find the video game to not be sluggish. As another example, a user may find a game to be responsive on one day, but believe it sluggish on another day despite no change in the objective latency between interaction with the video game and a corresponding event. This subjective evaluation of latency adds to the challenge in testing a video game during development. Thus, it is desirable to have an objective measure of latency.
It is often not sufficient to obtain a subjective sense of latency. Instead, it is desirable to have an objective measure of latency. One solution to obtain a measure of latency is to use a high-speed camera to capture frames displayed on a display of a user computing system that executes the video game under test in conjunction with a controller status board that uses lights to reflect the status of a video game controller, or other user interface device. A user can count the number of frames occurring between a first frame when the controller status board indicates a particular status of the controller and a second frame corresponding to a particular event. This solution can be cumbersome and difficult, and is less than ideal. One drawback of this solution is that it requires a large setup that can be both expensive and cumbersome to position and operate. In addition, although more objective than a user's perception of latency, the solution is not wholly objective as it relies on a user to identify when a user interaction was captured by the controller and to manually count the number of frames until a particular event is displayed. Further, the event may be limited to events related to animation or that affect the content output for display. In addition, the measurement is not easily repeatable across different tests because, for example, of the reliance on a user to determine when to begin the frame count and to perform the frame count. Moreover, each time the video game is modified during development, a user (for example, a tester or designer) must manually perform and repeat the tests. Having a user perform the tests may result in test errors or inconsistencies in measurement. In addition, because the measurement relies on a user viewing the displayed frames, the latency measurement is inexact and does not account for system variabilities and variable refresh rates of different displays.
Discrepancies between system variabilities can be reduced by using identical test machine configurations. However, using identical test machine configurations is not always possible because a video game is often designed for play by different user computing machines and/or different video game consoles. For example, a video game may be configured for play on a Sony® PlayStation® machine, a Microsoft® Xbox® machine, one or more portable devices (such as smartphones), or varying configurations of personal computer setups.
Embodiments disclosed herein present systems and processes for obtaining an objective measurement of latency between an input to a user computing system that hosts or executes a video game and the occurrence of an event that corresponds to or is otherwise triggered by the input. The system can include a front-end system that communicates directly with the user computing system via, for example, an input port of the user computing system that is configured to receive input from a user interface device. Thus, in certain embodiments, the front-end system may substitute for a controller that plugs into a video game console or for a keyboard that plugs into a computer. This front-end system may provide one or more commands to the user computing system that may function as a substitute for, or may emulate, a user interacting with the user interface device to interact with a video game being executed by the user computing system. At substantially the same time (for example, at the same time or within less than a threshold amount of time, such as 50 nanoseconds, 10 nanoseconds, 5 nanoseconds, or less) as a command is provided to the user computing system, the system may trigger a timer that counts an amount of time until an event corresponding to or otherwise triggered by the command occurs.
The system can further include a back-end system that obtains output from the user computing system via, for example, an output port of the user computing system that is configured to provide an output to a display for presentation to a user on the display. In certain embodiments, the output includes a set of signals output by the output port of the user computing system. These signals may be electrical signals communicated through a port connection to an electronic device, such as a monitor. The back-end system can obtain the signals from the output port and decode the signals to obtain pixels for a frame or image. Further, the back-end system can identify data embedded in the pixels to determine the occurrence of the event that corresponds to or is otherwise triggered by the input. Upon identifying the occurrence of the event, the back-end system can determine a time that has elapsed since the input that triggered the event, thereby determining an objective measure of latency between the input and the event.
To simplify discussion, the present disclosure is primarily described with respect to a video game. However, the present disclosure is not limited as such and may be applied to other types of applications. For example, embodiments disclosed herein may be applied to educational applications or other applications where it may be desirable to measure a latency between interaction with a user input device and an event corresponding to or otherwise triggered by the interaction with the user input device. Further, the present disclosure is not limited with respect to the type of video game. The use of the term “video game” herein includes all types of games, including, but not limited to web-based games, console games, personal computer (PC) games, computer games, games for mobile devices (for example, smartphones, portable consoles, gaming machines, or wearable devices, such as virtual reality glasses, augmented reality glasses, or smart watches), or virtual reality games, as well as other types of games.
Moreover, while primarily described with respect to testing a video game during development, embodiments disclosed herein may be used for other use cases where the measurement of latency may be desirable. For example, in competitive events involving video games, sometimes referred to as “esports,” it is important for the responsiveness of systems used by the players to be identical, or as close to identical as possible given current technologies. As such, each computing system and display system of each player will typically be configured the same. However, errors in configuration can sometimes lead to inconsistencies in execution of the video game. Further, differences in the ambient environment may affect operation of the host computing systems. For example, a computing system positioned nearer a window than another computing system positioned under an air conditioning vent may run hotter and, consequently, slower. With the large amounts of money that can sometimes be spent during these competitions (for example, as prize money, advertising money, sponsorships, television rights, and the like) it is important for balance among the systems used by competitors. Even a small difference in the speed of operation of a system due, for example, to differences in temperature of the computing systems can have consequences in terms of the perceived legitimacy and fairness of the competition. Thus, it is important for stakeholders (for example, players, spectators, sponsors, and the like) to have confidence in the fairness of the competition. In certain embodiments, the systems disclosed herein can be used to test each computing system hosting an instance of the video game to confirm that each computing system is running identically and that there is not a difference in command execution latency between different competitors' user computing systems.
Example Video Game Test Environment
FIG. 1 illustrates a video game test environment 100 in accordance with certain embodiments. The video game test environment 100 can include an environment for testing a video game, such as the video game 112, or a system, such as the user computing system 110, that hosts the video game 112. For example, the video game test environment 100 may be configured to test a video game 112 under development to determine an objective measure of latency between issued or received commands, and the execution of the commands, or the occurrence of an event that may directly or indirectly correspond to or be triggered by the commands. For example, the video game test environment 100 may be used to determine a measure of time or latency between when a user, such as a player, developer, or tester, pushes or otherwise interacts with a button on a user interface device (for example, a video game console controller, a keyboard, or a touchscreen interface) and the video game 112 performs an action corresponding to the interaction with the button. However, the video game test environment 100 may also test a latency for events that are triggered by a combination of button interactions or a combination of one or more button presses and a particular state of the video game 112. In some embodiments, the video game test environment 100 may enable testing of a latency between multiple states of the video game 112. These multiple states of the video game 112 may or may not be triggered by user interaction with the video game. For example, in some cases, the change in state of the video game 112 may relate to a passage of time, execution of code within the video game 112 itself, or the execution of an application other than the video game 112 that may cause a change in state of the video game 112, such as an auction application that enables users to auction items obtained within the video game 112.
Moreover, as stated above, in some cases, the video game test environment 100 may be used to test the user computing system 110 itself. For example, the video game test environment may be used to determine whether execution of the video game 112 on multiple user computing systems 110 results in the same latency. As previously described, ensuring that the latency is equal when the same interactions with a video game 112 occur on multiple user computing systems 110 can be important in competitive environments, such as for esports.
The video game test environment 100 includes a video game test system 102 that is configured to test a video game 112 and/or a user computing system 110 that hosts or executes at least a portion of the video game 112. As illustrated in FIG. 1, the video game test system 102 may be divided into multiple sub-systems. For example, the video game test system 102 may be divided into a front-end test system 104 and a back-end test system 106. The front-end test system 104 and the back-end test system 106 may be implemented as separate systems that are housed separately. Alternatively, the front-end test system 104 and the back-end test system 106 may be a single system that is enclosed in a single housing. Regardless of whether the video game test system 102 is implemented as a single system or as separate systems, the two subsystems can, in some cases, be conceptually thought of as one system or as multiple distinct systems.
Moreover, as is described in more detail below, the video game test system 102 may be implemented using multiple different hardware processors. At least some of the hardware processors may be of different types. Further, at least some of the hardware processors may be implemented in different application-specific hardware that is configured to perform particular functions associated with the processes described herein. In other embodiments, the functionality of the video game test system 102 may be implemented by a single hardware processor configured to perform the particular processes described herein. In certain embodiments, the single hardware processor may be a general purpose processor that can execute one or more instructions to perform the processes described herein.
The front-end test system 104 may include a user interface circuit 108 and a command sequence repository 114. The user interface circuit 108 may serve as a substitute for, or may simulate, a user interface device of the user computing system 110. For example, if the user computing system 110 is a console, such as a PlayStation® or an Xbox®, the user interface circuit 108 may simulate a controller for the console. Alternatively, the user interface circuit 108 may simulate a keyboard, a mouse, a touchscreen input device, or any other input device that may be used to interact with a video game hosted by the user computing system 110. The user interface circuit 108 may obtain a command that corresponds to a user interacting with a user interface device, and provide the command to the user computing system 110. This command may be formatted the same as, or similar to, what a user interface device (for example, an Xbox® controller) communicates to a user computing system 110. In some embodiments, the command may be a status of buttons or interface elements of the user interface device instead of or in addition to a command. For example, the user interface circuit 108 may communicate a data structure that includes a status of one or more user interface elements of the user interface device that is being simulated by the user interface circuit 108.
In certain embodiments, the user interface circuit 108 may obtain a sequence of commands and may provide the sequence of commands to the user computing device 110. The sequence of commands may be provided in series, simulating a user's performance of a series of interactions with a user interface device. Alternatively, the sequence of commands may be provided in parallel, simulating a user's ability to perform a combination of interactions with a user interface device, such as the pressing of a direction button or an analog stick while simultaneously pressing an action button on a gamepad or controller. In yet other embodiments, at least some of the sequence of commands may be provided in parallel while other commands are provided serially. The commands that the user interface circuit 108 provides to the user computing device 110 may be the same commands that a user interface device would provide to the user computing device 110 if a user were interacting with the user interface device to perform the same interactions.
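The combination-of-interactions idea above can be pictured as a controller-status data structure in which several interface elements are asserted at once. The following Python sketch is purely illustrative; the field and function names are assumptions, and real controller protocols (for example, USB HID reports) are platform-specific binary formats rather than Python objects.

```python
# Illustrative sketch (not from the patent) of a controller-status structure
# that a user interface circuit might send in place of a physical gamepad.
from dataclasses import dataclass, field


@dataclass
class ControllerStatus:
    buttons: dict = field(default_factory=dict)  # button name -> pressed?
    left_stick: tuple = (0.0, 0.0)               # analog axes in [-1.0, 1.0]


def press(status: ControllerStatus, *names: str) -> ControllerStatus:
    """Return a copy of the status with the named buttons pressed."""
    updated = dict(status.buttons)
    for name in names:
        updated[name] = True
    return ControllerStatus(updated, status.left_stick)


# Simulate pressing a direction button and an action button simultaneously,
# as in the parallel-command example above.
status = press(ControllerStatus(), "dpad_up", "a")
print(status.buttons)  # {'dpad_up': True, 'a': True}
```

Sending one such status per polling interval would let the front-end system replay either serial sequences (one change per status) or parallel combinations (several changes in one status).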
The commands or command sequence may be provided to the front-end test system 104 by the test server 124. A user, such as a designer or tester of the video game 112, may generate a sequence of commands to test the video game 112 using the test server 124. The test server 124 may then provide the sequence of commands to the front-end test system 104, which may store the commands at the command sequence repository 114. The command sequence repository 114 may store multiple sequences of commands. Each of the sequences of commands may be associated with a separate label or identifier. A particular sequence of commands may be selected by the front-end test system 104 or the user interface circuit 108 based on the selection or identification of a particular desired test.
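The labeled-sequence scheme described above amounts to a keyed store of ordered command lists. A minimal Python sketch follows; the class name, the label, and the command strings are all hypothetical stand-ins for whatever binary command format a real front-end system would store.

```python
# Minimal sketch of a command sequence repository keyed by test label.
# The label and command payloads shown are invented for illustration only.
class CommandSequenceRepository:
    def __init__(self):
        self._sequences = {}

    def store(self, label: str, commands: list) -> None:
        """Associate an ordered list of command payloads with a test label."""
        self._sequences[label] = list(commands)

    def get(self, label: str) -> list:
        """Retrieve the stored sequence for the selected test."""
        return self._sequences[label]


repo = CommandSequenceRepository()
repo.store("basketball_shot", ["hold_lt", "press_x", "release_x"])
print(repo.get("basketball_shot"))  # ['hold_lt', 'press_x', 'release_x']
```

Pre-loading such a store into the front-end system's memory is what allows the test to run without the test server present.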
During execution of a latency test, the user interface circuit 108 may obtain the command or sequence of commands used during the test from the command sequence repository 114. Advantageously, in certain embodiments, by obtaining the commands from the command sequence repository 114 that is included as part of the front-end test system 104, latency that may occur by communicating with the test server 124 may be eliminated. Further, the front-end test system 104 can be pre-loaded with test command sequences, eliminating the need for the test server 124 to be present during performance of the test. Accordingly, the video game test system 102 may have increased portability compared to a system that receives commands from the test server 124 during performance of the testing process.
Moreover, storing command sequences at the command sequence repository 114 enables a particular test to be repeated a number of times on the video game 112, or on multiple iterations or versions of the video game 112. For example, each time a change is made to the video game 112 during development, or when an update or expansion is in development for the video game 112, tests can be more easily repeated using test sequences stored at the command sequence repository 114. Further, by storing commands at the command sequence repository 114, tests can be performed using an automated process or with reduced or no user involvement compared to systems that may require a user to interact with the video game 112 and to measure latency by counting frames captured by a high-speed camera.
The back-end test system 106 may capture output from the user computing system 110. This output may be signals that are output from the user computing system to a display system 122. In certain embodiments, the back-end test system 106 may replace the display system 122. In other embodiments, a splitter or other electronics (not shown) may be used to provide a copy of the output signals provided to the display system 122 to the back-end test system 106. By splitting the signal, a user can observe output on the display system 122 while the back-end test system 106 measures the latency between issued commands and corresponding triggered events occurring at the video game 112. It should be understood that, unlike prior attempts that use high-speed cameras to measure latency within a video game 112, it is unnecessary for the output of the video game 112 to be displayed on a display system to measure latency using certain embodiments disclosed herein.
The back-end system 106 may include a decoder 116, a controller 118, and a timer system 120. The decoder 116 of the back-end system 106 may connect to the user computing system 110 via an output port, such as a display port, of the user computing system 110. For example, the decoder 116 may connect to a DisplayPort, a Digital Visual Interface (DVI) port, or a High-Definition Multimedia Interface (HDMI) port. Generally, the decoder 116 connects via a wired connection to an output port of the user computing system 110. By connecting via a wired connection, latency that may be introduced due, for example, to interference in a wireless connection, may be avoided. However, in certain embodiments, the decoder 116 may connect to the user computing system 110 using a wireless connection.
The decoder 116 may include any circuit or system that can obtain signals from the user computing system 110, via an output port of the user computing system 110, and can convert the signals to pixels. For example, the decoder 116 may be configured to convert HDMI signals to a set of pixels representing a frame of an animation generated by the video game 112. This frame may be part of an animation that the developer of the video game 112 intended to be displayed on a display, such as the display provided by the display system 122.
The decoder 116 may provide the pixels to the controller 118. In certain embodiments, the decoder 116 provides the pixels a frame at a time to the controller 118. In other embodiments, the decoder 116 provides the pixels to the controller 118 as the decoder 116 converts the output signals to pixels. Thus, in some cases, the controller 118 may receive portions of a frame while the decoder 116 continues to convert received signals to additional pixels included in the frame.
The controller 118 may include any system or circuit that can process pixels received from the decoder 116 to identify a subset of pixels, which may store embedded data. In some cases, the entire set of pixels representing a frame may be used with embodiments disclosed herein. However, typically only a subset of pixels is used because the remaining pixels are designated, for example, to depict an image or frame of an animation generated by the video game 112.
Processing the pixels to identify the subset of pixels may include filtering the received pixels to obtain a subset of pixels. Filtering the pixels may include identifying particular pixels included in the set of pixels generated by the decoder 116. This subset of pixels may be the first ‘n’ pixels, where ‘n’ is some number. For example, the subset of pixels may be the first 1024 pixels, the first 2048 pixels, or any other number of pixels less than the total number of pixels that form a frame of an image. The pixels may be received in a particular order. For example, the pixels may be received starting from the top left corner of a frame and proceeding from left to right and from top to bottom, similar to the order that words are written in an English-language book. Thus, in the previous example, the first 1024 pixels may include 1024 pixels beginning from the top left of a frame and extending 1024 pixels towards the right of the first line in an image. Alternatively, in certain embodiments, the subset of pixels may be the first ‘n’ bits or bytes of data that stores pixel information. Thus, for example, the subset of pixels may be the set of pixels stored in the first 1024 or 2048 bytes of data obtained from the decoder 116, which may correspond to 341 pixels or 682 pixels assuming a 24-bit or 3-byte RGB image. It should be understood that other bit or byte amounts may be used to represent each pixel, resulting in a different number of pixels per 1024 or 2048 bytes, or other number of bytes that stores embedded data.
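The byte-to-pixel arithmetic above (1024 bytes of 3-byte RGB data holding 341 whole pixels) can be sketched as a small filtering routine. This is a hypothetical software rendering of what the controller's filtering step might do; the function name and the dummy frame data are assumptions, and a real controller would operate on the hardware byte stream rather than a Python `bytes` object.

```python
# Hypothetical sketch of extracting the embedded-data pixel subset from the
# leading bytes of a decoded frame, assuming 24-bit (3 bytes per pixel) RGB.
def extract_embedded_pixels(frame_bytes: bytes, n_bytes: int = 1024) -> list:
    """Return the whole pixels stored in the first n_bytes of frame data."""
    subset = frame_bytes[:n_bytes]
    # Group every 3 bytes into one (R, G, B) tuple; a trailing partial pixel
    # is dropped, which is why 1024 bytes yield 341 pixels, not 341.33.
    return [tuple(subset[i:i + 3]) for i in range(0, len(subset) - 2, 3)]


frame = bytes(range(256)) * 16            # 4096 bytes of dummy frame data
pixels = extract_embedded_pixels(frame, 1024)
print(len(pixels))                        # 341
```

With `n_bytes=2048` the same routine yields 682 pixels, matching the second figure given above.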
The identified subset of pixels may include pixels that are configured to embed information used as part of a testing process, such as a process to test or measure latency between the issuance of commands and the occurrence of corresponding events. This embedded information may identify when particular events have occurred in the video game 112. The information may be embedded in the frame based on the value set for the subset of pixels. For example, the subset of pixels may be configured to depict particular colors or images in order to indicate that a particular event has occurred within the video game 112. As another example, the subset of pixels may be configured to have a particular opaqueness to indicate the occurrence of an event in the video game 112. The event may relate to an occurrence of a particular animation or a particular frame in an animation. However, although the embedded information that identifies the occurrence of an event is embedded in an image or animation frame, the event can include non-animation based events that occur during execution of the video game 112. For example, the event can relate to the playing of a particular sound, the setting of a particular state variable, or the occurrence of any other event related to the execution of the video game 112.
The timer system 120 may include any system or circuit that can determine whether the identified subset of pixels includes the embedded data and/or whether the embedded data includes particular information, and based on the determination, can stop a timer initiated by the front-end test system 104. The particular information may include any information that can be inserted by the video game 112 into one or more pixels of a frame or image to be output. For example, the information may include a stop command or tag that indicates that the timer system 120 is to stop a timer. The information may be inserted into one or more pixels of a frame or image by calling, executing, or otherwise instantiating a function or method from an Application Programming Interface (API) or a Software Development Kit (SDK) used to program the video game 112.
The timer system 120 may initiate one or more timers in response to a trigger received from the front-end test system 104. The front-end test system 104 may trigger a timer when providing a command from the command sequence repository 114 to the user computing system 110. When the timer system 120 identifies a particular tag or piece of data embedded in pixels of a frame, the timer system 120 may stop the timer. The timer system 120 may provide a measure of the elapsed time to the test server 124, which may present the measure of the elapsed time to a user. This measure of elapsed time may correspond to a latency between when a command is provided to the user computing system 110 by the user interface circuit 108 and when a corresponding event occurs at the video game 112. In some embodiments, the timer may be a counter that counts a number of events that have occurred within the video game 112 since the counter was initiated until an event corresponding to the command has occurred. Alternatively, or in addition, the counter may measure a number of frames output by the user computing system 110 until the corresponding event occurs at the video game 112. Thus, in some embodiments, the command latency may be a measure of time, a count of events that occurred, a count of frames output, or any other metric that may be measured with respect to the execution of a video game under test and/or a command provided to the user computing system 110 hosting the video game 112 under test.
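The start-on-command, stop-on-tag flow above can be sketched as a small state machine that also counts frames, covering both the elapsed-time and frames-output metrics mentioned. This is a software mock under assumed names; the real timer system runs in dedicated hardware with far finer resolution than a Python process can guarantee.

```python
# Software mock of the timer flow: start when a command is issued, stop when
# the embedded tag appears in a decoded frame. Names are illustrative.
import time


class LatencyTimer:
    def __init__(self):
        self.start_ns = None
        self.frames_seen = 0

    def start(self):
        """Called when the front-end system sends a command."""
        self.start_ns = time.perf_counter_ns()
        self.frames_seen = 0

    def on_frame(self, has_event_tag: bool):
        """Call once per decoded frame; return elapsed ns when the tag appears."""
        self.frames_seen += 1
        if has_event_tag and self.start_ns is not None:
            return time.perf_counter_ns() - self.start_ns
        return None


timer = LatencyTimer()
timer.start()  # command provided to the user computing system
for frame_has_tag in [False, False, True]:
    elapsed = timer.on_frame(frame_has_tag)
    if elapsed is not None:
        break
print(timer.frames_seen)  # 3 frames elapsed before the event was observed
```

The same object thus reports latency both as elapsed time (`elapsed`) and as a frame count (`frames_seen`), mirroring the alternative metrics described above.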
In some embodiments, the timer system 120 may modify or adjust the measured time to account for measured delays within the video game test system 102. For example, in some cases, there is a non-infinitesimal amount of time between the decoded pixels being provided to the controller 118 and the processed or filtered subset of pixels being communicated to the timer system 120. For instance, in one prototype the communication time between the controller 118 and the timer system 120 was consistently determined to be 3.8 milliseconds. Thus, the timer system 120 can be configured to adjust the measured time by 3.8 milliseconds to account for delays introduced by the video game test system 102. In certain embodiments, additional delays may occur due to limitations of the user computing system 110 or the particular game engine being used to create the video game 112. In some cases, the timer system 120 can modify the measured elapsed time by the additional delays.
The user interface circuit 108 may be implemented as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller (for example, a Cortex® M4 from ARM® as used in a prototype of the front-end test system 104), or any other type of special purpose computing system or integrated circuit. Further, the user interface circuit 108 may interface with a port of the user computing system 110. This port may be a proprietary port or a standardized port, such as a Universal Serial Bus (USB) port. The use of a special purpose circuit enables the front-end test system 104 to be miniaturized. For example, the front-end test system 104 may be as small as or smaller than the user interface device being simulated by the front-end test system 104. Alternatively, in certain embodiments, the user interface circuit 108 may be a general purpose computer. Further, the command sequence repository 114 may be implemented in any type of volatile or non-volatile memory, such as a ROM, RAM, SRAM, flash memory, or a magnetic hard disk drive. In certain embodiments, the command sequence repository 114 may be implemented in memory of the user interface circuit 108. Thus, in certain embodiments, the user interface circuit 108 and the command sequence repository 114 may be combined.
The decoder 116 may be implemented using an ASIC, FPGA, microcontroller, or any other type of special purpose computing system or integrated circuit. For example, the decoder 116 may be a digital signal processor specifically designed to convert HDMI signals to pixels. In a prototype implementation of the back-end test system 106, a custom-built HDMI decoder board that includes an ADV7611 ASIC from Analog Devices® was used to implement the decoder 116. However, the decoder 116 is not limited as such, and any special purpose system or integrated circuit may be used to decode the output of the user computing system 110 into pixels.
The controller 118 may be implemented using an ASIC, FPGA, microcontroller, or any other type of special purpose computing system or integrated circuit. Further, the controller 118 may receive pixels from the decoder 116 as the output signals of the user computing system 110 are converted or decoded. In other words, in certain embodiments, the pixels may be streamed in a particular order (for example, top left to bottom right for an image) to the controller 118. By streaming the pixels to the controller 118 as they are generated, the controller 118 can more easily identify a subset of pixels to provide to the timer system 120.
The timer system 120 may be implemented using an ASIC, FPGA, microcontroller, or any other type of special purpose computing system or integrated circuit. Further, the timer system 120 may receive a subset of pixels from the controller 118. The timer system 120 may extract data from the subset of pixels to determine whether a stop condition or other data has been embedded into the subset of pixels. In some cases, extracting the data from the subset of pixels may include comparing pixel values to a library of pixel values stored at the timer system 120 that are associated with particular data or conditions. For example, the timer system 120 may compare the pixel values of the subset of pixels to a value or set of values indicating that an event has occurred in the video game 112. This value or set of values may be stored in a memory of the timer system 120.
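The library-lookup step described above can be sketched as a simple table from known pixel values to event names. The specific pixel patterns and event labels below are invented for illustration; the real system would store whatever values the game's test SDK embeds.

```python
# Sketch of matching a subset of pixels against a stored library of
# known pixel values. The patterns and event names are hypothetical.

STOP_PATTERNS = {
    (255, 0, 0): "shot_fired",           # assumed event markers
    (0, 255, 0): "vehicle_accelerated",
}

def detect_event(pixel_subset):
    """Return the event name if the leading pixel matches a known
    pattern in the library, otherwise None (no stop condition)."""
    return STOP_PATTERNS.get(tuple(pixel_subset[0]))

print(detect_event([(255, 0, 0), (0, 0, 0)]))  # shot_fired
print(detect_event([(9, 9, 9)]))               # None
```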
The front-end test system 104 may interface between, or otherwise communicate with, a user computing system 110, a test server 124, and the back-end test system 106. The front-end test system 104 may communicate with the test server 124 via a direct connection or over a network (not shown). Typically, the front-end test system 104 will communicate via a direct connection, such as a physical wire, with the user computing system 110 and the back-end test system 106. It is desirable to have a direct connection between each of the front-end test system 104, the back-end test system 106, and the user computing system 110 to reduce or eliminate communication latency. This communication latency can add errors to the measurement of latency between interaction with a user input device and the occurrence of a corresponding event in the video game 112. Although it is generally desirable for the connections between each of the front-end test system 104, the back-end test system 106, and the user computing system 110 to be direct or wired connections, it is possible, and sometimes even desirable, for at least one of the connections to be wireless. For example, it may be desirable to test the amount of latency introduced by use of a wireless controller to determine whether the video game 112 has a desired responsiveness when using a wireless controller. In some such cases, the front-end test system 104 may be configured to communicate wirelessly with the user computing system 110 to obtain test measurements of latency between interaction with a user input device and the occurrence of a corresponding event in the video game 112.
As previously stated, the user computing system 110 may include or host a video game 112. In some cases, the video game 112 may execute entirely on the user computing system 110. In other cases, the video game 112 may execute at least partially on the user computing system 110 and at least partially on another computing system, such as a server. In some cases, the video game 112 may execute entirely on the server, but a user may interact with the video game 112 via the user computing system 110. For example, the game may be a massively multiplayer online role-playing game (MMORPG) that includes a client portion executed by the user computing system 110 and a server portion executed by one or more application host systems that may be included as part of a network-based interactive computing system. As another example, the video game 112 may be an adventure game played on the user computing system 110 without interacting with another system.
The user computing system 110 may include hardware and software components for establishing communications over a communication network (not shown). For example, the user computing system 110 may be equipped with networking equipment and network software applications (for example, a web browser) that facilitate communications via a network (for example, the Internet) or an intranet. The user computing system 110 may have varied local computing resources, such as central processing units and architectures, memory, mass storage, graphics processing units, communication network availability and bandwidth, and so forth. Further, the user computing system 110 may include any type of computing system. For example, the user computing system 110 may include any type of computing device(s), such as desktops, laptops, video game platforms or consoles (such as a PlayStation®, an Xbox®, or a Nintendo Switch™), television set-top boxes, televisions (for example, Internet TVs), network-enabled kiosks, car-console devices, computerized appliances, wearable devices (for example, smart watches and glasses with computing functionality), and wireless mobile devices (for example, smart phones, PDAs, tablets, or the like), to name a few. In some embodiments, the user computing system 110 may include one or more of the embodiments described below with respect to FIGS. 6 and 7.
The display system 122 can include any system for displaying output of the user computing system 110. In some embodiments, the display system 122 may be part of the user computing system 110. For example, if the user computing system 110 is a portable game system, the display system 122 may be built into the user computing system 110. In other embodiments, the display system 122 may be separate from the user computing system 110. For example, if the user computing system 110 is a game console, the display system 122 may be a television that may be manufactured or sold by a different entity than the user computing system 110.
The test server 124 may include any type of computing system that can interface with the video game test system 102 to provide a series of instructions or commands to the video game test system 102 to perform during a latency testing or determination process, such as the process 200. For example, the test server 124 may be a server computing system, a desktop, a laptop, a network-based or cloud computing system, or any other computing system that a tester may use to facilitate testing a video game 112 or a user computing system 110 hosting the video game 112 using the video game test system 102.
As previously described, to reduce communication latency between the video game test system 102 and the user computing system 110, the video game test system 102 may be in direct communication with the user computing system 110 through a wired connection. However, in certain embodiments, one or more elements of the video game test system 102 may communicate wirelessly with the user computing system 110, enabling a tester to determine the effect on latency for users who use wireless user interface devices to interact with the video game 112.
The test server 124 may communicate directly with the video game test system 102, or via a network (not shown). The network can include any type of communication network. For example, the network can include one or more of a wide area network (WAN), a local area network (LAN), a cellular network, an ad hoc network, a satellite network, a wired network, a wireless network, and so forth. Further, in some cases, the network can include the Internet.
Example Command Latency Testing Process
FIG. 2 presents a flowchart of a command execution latency testing process 200 in accordance with certain embodiments. The process 200 can be implemented by any system that can determine a latency, or measure of time, between an interaction with a video game 112 and the occurrence of an event corresponding to or otherwise triggered by the interaction with the video game 112. The process 200, in whole or in part, can be implemented by, for example, a video game test system 102, a front-end test system 104, a back-end test system 106, a user interface circuit 108, a decoder 116, a controller 118, or a timer system 120, among others. Although any number of systems, in whole or in part, can implement the process 200, to simplify discussion, the process 200 will be described with respect to particular systems.
The process 200 begins at block 202 where the front-end test system 104 receives a set of instructions corresponding to interactions with a user interface. The instructions may include a single instruction, a sequence of instructions, or multiple sequences of instructions. In some cases, each sequence of instructions may be associated with a separate test, a test of a different part of the video game 112, or a test of the video game 112 under different conditions or states. The received instructions may correspond to interactions with a user interface device that a user may perform when playing the video game 112. For example, the instructions may represent the commands provided to a user computing system 110 hosting the video game 112 when a user interacts with the user interface device. For instance, when a user presses the “up” button on a game controller, the game controller may provide a particular command to the user computing system 110 informing the user computing system 110 that the user pressed the “up” button. The instructions received by the front-end test system 104 may include the same particular command. Thus, the received instructions may simulate a user interacting with the game controller.
The user interface device may include any device that a user can use to play or interact with the video game 112. For example, the user interface device may be a gamepad or game controller, a keyboard, a mouse, or a touch sensitive display.
At block 204, the front-end test system 104 stores the set of instructions received at the block 202 at a storage of the video game test system 102. For example, the front-end test system 104 may store the set of instructions at the command sequence repository 114 and/or at a memory or storage of the user interface circuit 108. In some embodiments, storing the set of instructions may include storing a label or tag identifying the set of instructions. For example, a tag may indicate or identify the commands included in the set of instructions, an action performed at the video game 112 based on the set of instructions, a portion of the video game 112 that may be tested by the set of instructions, or any other information that may distinguish the set of instructions from another set or sequence of instructions stored at the command sequence repository 114.
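The tagged storage described above can be sketched as a small keyed store: each sequence is saved under a label so that a later trigger can select it. The dict-based layout and the example tag are assumptions for illustration.

```python
# Sketch of a tagged command-sequence store. The class name, layout,
# and example commands are invented for illustration.

class CommandSequenceRepository:
    def __init__(self):
        self._sequences = {}

    def store(self, tag, commands):
        """Save a command sequence under an identifying tag."""
        self._sequences[tag] = list(commands)

    def load(self, tag):
        """Retrieve the sequence previously stored under `tag`."""
        return self._sequences[tag]

repo = CommandSequenceRepository()
repo.store("jump_shot", ["press_up", "hold_a", "release_a"])
print(repo.load("jump_shot"))   # ['press_up', 'hold_a', 'release_a']
```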
At block 206, the front-end test system 104 receives a trigger to initiate a latency test. The trigger may be received from a user or may be an automated trigger, such as part of an automated testing process. Further, the trigger may be received in response to a user interacting directly with the video game test system 102 or may be received from the test server 124. In some embodiments, the user computing system 110 may provide the trigger at the block 206. In some cases, the trigger may be, or may be received in response to, a change in the code of the video game 112. In certain embodiments, the trigger may include an identification of a command or a sequence of commands stored at the command sequence repository 114. For example, the trigger may include a label, tag, or other reference that distinguishes a command or sequence of commands from another command or sequence of commands stored at, for example, the command sequence repository 114.
At block 208, the front-end test system 104 triggers a latency timer at the timer system 120. Triggering the latency timer at the timer system 120 may include starting multiple timers at the timer system 120. For example, in some cases, it may be desirable to measure the amount of time until a plurality of events occur at the video game 112 corresponding to one or more commands provided by the user interface circuit 108 to the user computing system 110. Further, in certain embodiments, triggering the latency timer at the timer system 120 may include identifying particular stop conditions for the timer system 120 indicating when the timer system 120 is to stop one or more of the latency timers. Each latency timer may be associated with a different stop condition that is monitored by the timer system 120 as described in more detail below.
In some cases, triggering multiple timers at the timer system 120 may include identifying an order or rank for each timer. The timer system 120 may stop the active timer with the highest rank each time a stop condition is detected. Thus, once a first rank timer is stopped, a second rank timer may become the highest ranked timer and may be stopped upon the timer system 120 identifying a second stop condition. Advantageously, in certain embodiments, by triggering a plurality of timers each associated with different stop conditions or configured to be stopped at different times, it is possible for a latency or a measure of time between a command being provided to the user computing system 110 and the occurrence or triggering of a corresponding event to be measured for multiple events that may be triggered by the command.
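The rank-ordered stopping scheme above can be sketched as follows: each stop condition stops the highest-ranked timer still running, so successive events freeze successive timers. The interface names and the tick-based clock are assumptions, not the patent's hardware design.

```python
# Sketch of rank-ordered timers: each detected stop condition stops
# the highest-ranked timer that is still running.

class RankedTimers:
    def __init__(self, ranks):
        # Lower number = higher rank; all timers start running at 0.
        self._running = sorted(ranks)
        self.stopped = []          # (rank, tick_count) in stop order
        self._ticks = 0

    def tick(self):
        """Advance the shared clock, e.g. once per output frame."""
        self._ticks += 1

    def stop_condition(self):
        """Stop the highest-ranked timer that is still running."""
        if self._running:
            rank = self._running.pop(0)
            self.stopped.append((rank, self._ticks))

timers = RankedTimers([1, 2])
timers.tick(); timers.tick()
timers.stop_condition()        # first event stops timer 1 at tick 2
timers.tick()
timers.stop_condition()        # second event stops timer 2 at tick 3
print(timers.stopped)          # [(1, 2), (2, 3)]
```

This way a single command can yield separate latency figures for each of the events it triggers.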
At block 210, the front-end test system 104 loads one or more instructions from the storage used to store the instructions received at the block 202. For example, the front-end test system 104 may load the one or more instructions from the command sequence repository 114. In some embodiments, the front-end test system 104 may load a single instruction at a time as part of the block 210. In other embodiments, the front-end test system 104 may load a sequence of instructions corresponding to a particular test, or a subset of the sequence of instructions, at a time. The front-end test system 104 may determine the instruction or sequence of instructions to load based on the trigger received at the block 206 or on a label included with the trigger.
At block 212, the user interface circuit 108 communicates the one or more instructions to a user computing system 110 that is executing a video game 112 under test. Communicating the one or more instructions to the user computing system 110 may include transmitting corresponding data or instructions that a user interface device would communicate to the user computing system 110 when providing the instruction to the user computing system 110. For example, if the user interface circuit 108 is to communicate an instruction associated with pressing and holding a particular button on a game controller, the user interface circuit 108 may communicate the same data or instructions that the game controller would communicate to the user computing system 110. Accordingly, in certain embodiments, the user interface circuit 108 may simulate the game controller or other user interface device of the user computing system 110.
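To make the simulation step concrete, the sketch below packs button presses into a one-byte report of the kind a controller might send over USB. The bit layout is entirely invented for illustration; real controllers use vendor-specific HID report formats.

```python
# Hypothetical sketch of simulating a gamepad report. The byte layout
# below is invented, not a real controller protocol.

BUTTON_BITS = {"up": 0x01, "down": 0x02, "a": 0x10, "b": 0x20}

def build_report(pressed):
    """Pack a set of pressed buttons into a one-byte HID-style report."""
    value = 0
    for name in pressed:
        value |= BUTTON_BITS[name]
    return bytes([value])

# The user interface circuit would transmit these bytes to the console's
# port in place of a real controller.
print(build_report({"up", "a"}).hex())   # 11
```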
In certain embodiments, operations associated with the block 212 and/or the block 208 may include triggering a different latency timer for each instruction communicated to the user computing system. In other embodiments, the operations associated with the block 212 and/or the block 208 may include triggering a latency timer for the first instruction, the last instruction, or a particular subset of instructions communicated to the user computing system 110.
At block 214, the timer system 120 determines a latency time measurement for performing the one or more instructions based at least in part on the trigger occurring at the block 208 and a detected stop condition. This stop condition may be detected based on an output of the user computing system 110. This output may be an output provided to, or intended to be provided to, the display system 122, but which is captured by the back-end test system 106. As previously described, the latency time measurement may be associated with an amount of time that elapses between a command being provided by the user interface circuit 108 to the user computing system 110 and an event occurring at the video game 112. In certain embodiments, the latency time measurement is measured by a number of events that have occurred within the video game 112 and/or a number of frames that have been output between when the command was provided by the user interface circuit 108 to the user computing system 110 and when the event occurred at the video game 112. Additional details relating to detecting the stop condition and determining the latency time measurement are described below with respect to FIG. 3.
Although described as a single process, it should be understood that the process 200 may be divided into multiple processes and/or operations associated with the process 200 may be performed at different times. For example, operations associated with the blocks 202 and 204 may occur at some time prior to the remaining operations of the process 200. For instance, during a first period of time, one or more sequences of instructions may be received for storage at the command sequence repository 114. During a second period of time occurring sometime after the first period of time, one or more tests may be performed on one or more iterations of the video game 112. The one or more tests may include performing operations associated with the blocks 206 through 214 of the process 200.
Further, it should be understood that operations associated with the process 200 may be performed in a different order, serially, or at least partially in parallel. For example, operations associated with the block 208 may be performed subsequent to operations associated with the block 210. As another example, operations associated with the block 208 may be performed at least partially in parallel with operations associated with the block 212. For instance, the user interface circuit 108 may trigger a timer at the timer system 120 as part of the block 208 at the same time, or at substantially the same time, that the user interface circuit 108 communicates at least one instruction to the user computing system 110 as part of the block 212.
Example Latency Determination Process
FIG. 3 presents a flowchart of a latency determination process 300 in accordance with certain embodiments. The process 300 can be implemented by any system that can determine a latency, or measure of time, between an interaction with a video game 112 and the occurrence of an event corresponding to or otherwise triggered by the interaction with the video game 112 by, at least in part, detecting an embedded stop condition in an output. The process 300, in whole or in part, can be implemented by, for example, a video game test system 102, a front-end test system 104, a back-end test system 106, a user interface circuit 108, a decoder 116, a controller 118, or a timer system 120, among others. Although any number of systems, in whole or in part, can implement the process 300, to simplify discussion, the process 300 will be described with respect to particular systems.
In certain embodiments, the process 300 can be combined with, or executed as part of, the process 200. For example, the operations associated with the blocks 306-322 may be performed as, or as part of, the operations associated with the block 214 of the process 200. Further, the blocks 302 and 304 may correspond to the blocks 210 and 212, respectively.
The process 300 begins at block 302 where the front-end test system 104 loads an instruction from storage, such as the command sequence repository 114. In certain embodiments, the block 302 may include one or more of the embodiments described with respect to the block 210.
At block 304, the user interface circuit 108 communicates the instruction to a user computing system 110 that hosts or executes at least part of the video game 112 under test. In certain embodiments, the block 304 may include one or more of the embodiments described with respect to the block 212.
At block 306, the back-end test system 106 receives output signals from the user computing system 110. The output signals may be received from an output port of the user computing system 110 that is configured to provide output to a display system 122. For example, the output port may be an HDMI port, a DisplayPort, or any other video output port. In certain embodiments, the back-end test system 106 is connected to the user computing system 110 in place of the display system 122. In other embodiments, a signal capture device or splitter may be used to obtain a copy of the signals output to the display system 122 without preventing the signals from being provided to the display system 122. Thus, in certain embodiments, the back-end test system 106 may be used to measure latency while a user observes content output to the display system 122. By enabling a user to view the display system 122 as the back-end test system 106 measures latency, a user can determine whether to modify a test being performed based at least in part on the view displayed on the display system 122. The output signals received from the user computing system 110 may correspond to an image or frame of an animation that is output by the video game 112 for display.
At block 308, the decoder 116 converts the output signals to pixels. Converting the output signals to pixels may include generating an image or a frame of an animation based on the output signals.
At block 310, the controller 118 post-processes the pixels to obtain a subset of pixels associated with embedded data. Post-processing the pixels may include filtering the pixels that form the image or the frame of the animation generated at the block 308 to obtain the subset of pixels associated with the embedded data. Alternatively, or in addition, post-processing the pixels may include selecting or otherwise obtaining the subset of pixels that are designated to include the embedded data. In certain embodiments, the post-processing involves cropping the image or the frame to isolate the subset of pixels that are designated to include the embedded data. The subset of pixels may include a particular number of pixels from the image, such as the first 1024 or 2048 pixels. Alternatively, the subset of pixels may be the pixels associated with a particular amount of data, such as 2048 bits or 2048 bytes of data included in the image or frame.
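The cropping step above amounts to keeping only the region designated to carry embedded data. A minimal sketch, assuming the designated region is the first 1024 pixels of the streamed frame (one of the example sizes mentioned in the text):

```python
# Sketch of post-processing: crop a streamed frame down to the region
# designated to carry embedded data. The choice of the first 1024
# pixels follows one example in the text; real systems may differ.

EMBED_PIXELS = 1024

def crop_embedded_region(pixels, count=EMBED_PIXELS):
    """Return only the leading pixels that may carry embedded data."""
    return pixels[:count]

# A stand-in 1920x1080 frame as a flat list of (R, G, B) tuples.
frame = [(i % 256, 0, 0) for i in range(1920 * 1080)]
subset = crop_embedded_region(frame)
print(len(subset))    # 1024
```

Discarding the rest of the frame early keeps the data the timer system must inspect small, which matters when pixels are streamed through an FPGA or microcontroller.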
The subset of pixels for the image or the frame of the animation includes the pixels that are designated to include the embedded data, but may or may not include embedded data for the particular image. In other words, in certain embodiments, certain images or frames may include embedded data while other images or frames may not include embedded data.
At decision block 312, the timer system 120 determines whether the embedded data includes a stop condition. In certain embodiments, determining whether the embedded data includes a stop condition may include determining whether the subset of pixels includes embedded data. If it is determined that the subset of pixels does not include embedded data, the decision block 312 determines that a stop condition has not occurred.
The stop condition may include any data that may be inserted into an image or frame of an animation by an API or SDK when code associated with the video game 112 is executed. For example, when a particular event occurs during execution of the video game 112, a function included in the API may be called to insert particular data into an image or frame of an animation to be output to alert a user or the back-end test system 106 of the occurrence of the event. This particular data may be a particular color or opaqueness for a set of pixels within the image or frame. In some cases, the stop condition may be a particular label or other value stored in the bytes of memory configured to store pixel data.
The API or SDK may be part of, or may provide, test code or test tools for testing or facilitating testing of the code of the video game. For example, for particular code snippets that a developer desires to test, the developer may insert a call to a function in the API or SDK that facilitates testing of the code snippet at the end of the code snippet. In a case where the developer wants to test the latency of a particular event, the developer may insert a call to a function at the end (or other desired location) of the code to be tested that inserts or embeds data into a frame that is to be output. This embedded data may substitute some of the pixels in the frame with information (for example, a stop condition, a label, or other marker) that enables the video game test system 102 to detect that the particular event has occurred. Thus, once the information indicating that the event has occurred has been decoded from the frame, the video game test system 102 can use the timer to determine an amount of elapsed time since the timer was initiated or since the simulated user interaction that caused the particular event to occur was received.
In some embodiments, the stop condition may be inserted when a particular event, which may be referred to as a “target event,” occurs. This target event may correspond to or be triggered by the instruction provided to the user computing system 110 at the block 304. In some embodiments, the event is triggered when the video game 112 is in a particular state when the instruction is received by the user computing system 110.
In some embodiments, a programmer may insert a library call to a function included in an API associated with testing of the video game 112. This function may be a function made available by the API to programmers or coders, enabling the programmers to extract data from an application being executed to facilitate a testing process. If there is a particular portion of the code that the programmer desires to test, the programmer may insert the function call into that particular portion of the code. Alternatively, the test function may be built into or included with some or all of the functions available from the API. In some such cases, a flag or other trigger may be used to activate or deactivate the test functions during execution of the video game 112.
If it is determined at the decision block 312 that the embedded data does not include a stop condition, the process 300 may proceed to the block 302. At the block 302, another instruction may be loaded from the storage to be provided to the user computing system 110. Alternatively, the process 300 may proceed to the block 304. For example, if a series of instructions or commands are loaded at the block 302 initially, the process 300 may return to the block 304 to communicate one of the previously loaded instructions to the user computing system 110. As yet another alternative, the process 300 may proceed to the block 306. For example, in some cases, additional instructions may not be provided to the user computing system 110 as part of the test, but the event corresponding to or otherwise triggered by the previously provided instruction may not yet have occurred at the video game 112. Accordingly, the process 300 may return to the block 306 after the decision block 312 to continue to process output until embedded data with the stop condition is identified.
If it is determined at the decision block 312 that the embedded data does include a stop condition, the timer system 120 stops a latency timer at the block 314. Stopping the latency timer may include stopping one of a plurality of ongoing timers. The latency timer stopped may be associated with the stop condition identified at the decision block 312. Other timers managed by the timer system 120 may continue to run. In some embodiments, the timer system 120 does not stop the latency timer at the block 314, but instead records the time elapsed since the latency timer was initiated. Advantageously, in certain embodiments, by recording the time value of the latency timer while permitting the latency timer to continue to run, it is possible for the timer system 120 to monitor the occurrence of multiple events triggered by a single instruction, or corresponding to a single instruction communicated to the user computing system, using a single timer.
At block 316, the timer system 120 determines a latency time measurement. Determining the latency time measurement may include determining a difference between a point in time when the instruction is communicated to the user computing system, or when the instruction is executed by the video game 112, and a point in time when an event occurs at the video game 112 that is triggered by or corresponds to the instruction. As previously described, the event may be a particular animation being played, a particular frame within the animation being displayed, a sound being played, a change in state of a particular state variable within the video game 112, a change in state of the video game 112, or any other aspect of the video game 112 that may be modified based at least in part on the instruction provided at the block 304. In certain embodiments, the block 316 may include one or more of the embodiments described with respect to the block 214.
At block 318, the timer system 120 adjusts the latency time measurement for delay introduced by the video game test system 102. Adjusting the latency time measurement for delay introduced by the video game test system 102 may include reducing the latency time measurement by an amount of time associated with elements of the video game test system 102. For example, it was determined during evaluation of a prototype of the video game test system 102 that communication between the controller 118 and the timer system 120 required 3.8 ms. The value of 3.8 ms was a deterministic measurement of the communication between the controller 118 and the timer system 120. Thus, in this particular example, the latency time measurement may be reduced by 3.8 ms. The communication time between the decoder 116 and the controller 118 in the prototype was negligible. However, in embodiments where the communication time between the decoder 116 and the controller 118 is determined to be non-negligible, the latency time measurement may be adjusted by the determined communication time.
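The adjustment step above is a fixed subtraction. A minimal sketch, using the 3.8 ms figure the text reports for the prototype's controller-to-timer-system link (the function shape and the floor at zero are assumptions):

```python
# Minimal sketch of the block 318 adjustment: subtract the test rig's
# own measured, deterministic delay from the raw latency measurement.

PIPELINE_DELAY_MS = 3.8   # prototype controller-to-timer-system delay

def adjust_latency(measured_ms, rig_delay_ms=PIPELINE_DELAY_MS):
    """Remove the test rig's own delay, never going below zero."""
    return max(0.0, measured_ms - rig_delay_ms)

print(adjust_latency(120.0))   # approximately 116.2
```

Further fixed delays (for example, a non-negligible decoder-to-controller hop) would simply be folded into the subtracted constant.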
Because, in certain embodiments, the video game test system 102 is implemented using one or more application-specific hardware devices that are directly connected via pins and/or short wires, the amount of latency between the hardware elements of the video game test system 102 may be both deterministic and substantially invariable compared to the use of generic computing hardware. Accordingly, the measurement of latency for a particular instruction in a particular implementation of the video game 112 may be repeatable and may provide substantially identical results among multiple test iterations. In certain embodiments, operations associated with the block 318 may be optional or omitted.
At block 320, the timer system 120 filters latency noise from the adjusted latency time measurement. Latency noise may include latency or delays that are unrelated to or not specific to the particular code, resources, or assets of the video game 112. In other words, latency noise may be noise that is unrelated to the code created by the programmer or the various animations that may be generated by the graphic artists in developing the video game 112. For example, latency noise may include latency introduced by the configuration of the user computing system 110 itself or by a coding engine used to develop the video game 112. For instance, some user computing systems may be configured to display a new frame only once every 60th of a second. However, in some cases, the video game 112 may be able to generate a new frame at a faster rate than once every 60th of a second. In such cases, although a frame may be ready for output, the output may be stalled until the user computing system 110 is ready to output another frame. This delay between when a frame is ready for output and when the user computing system 110 can output the frame may be measured and subtracted at the block 320 from the latency time measurement determined at the block 316 or from the adjusted latency time measurement of block 318. In certain embodiments, the delay between when the frame is ready for output and when the user computing system 110 is ready to output the frame may be intentional to prevent screen tearing and may be referred to as vertical synchronization or Vsync.
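The Vsync filtering at block 320 can be viewed as subtracting the stall between the moment a frame is ready and the next display refresh tick. A sketch under an assumed 60 Hz refresh follows; the function name, parameters, and tick model are illustrative assumptions rather than the patented method:

```python
import math


def filter_vsync_noise(latency_ms: float,
                       frame_ready_ms: float,
                       refresh_interval_ms: float = 1000.0 / 60.0) -> float:
    """Remove the display-refresh stall from a latency measurement.

    If a frame was ready at `frame_ready_ms` (relative to the command time)
    but the display only flips on 60 Hz boundaries, the frame is stalled
    until the next refresh tick; that stall is noise unrelated to the
    game's own code and can be subtracted out.
    """
    next_tick_ms = math.ceil(frame_ready_ms / refresh_interval_ms) * refresh_interval_ms
    stall_ms = next_tick_ms - frame_ready_ms
    return latency_ms - stall_ms
```

For example, a frame ready 20 ms after the command waits until the 33.3 ms tick, so roughly 13.3 ms of the measured latency is display stall rather than game latency.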
In some cases, because different types of user computing systems 110 may introduce different latency noise, a developer of the video game 112 may perform latency testing using embodiments disclosed herein on different types of user computing systems 110. Further, in certain embodiments, the developer of the video game 112 may make or program different versions of the video game 112 for execution on different types of user computing systems 110. For example, inherent differences between the PlayStation® game system and the Xbox® game system may result in an identical instruction provided to the video game 112 under an identical state being executed with a different latency. In certain embodiments, operations associated with the block 320 may be optional or omitted.
At block 322, the backend test system 106 outputs the filtered adjusted latency time measurement as a latency, or a measure of time, between receipt of an instruction at the user computing system 110 and the occurrence of a particular event at the video game 112. This output may be displayed on a screen, such as a display of the test server 124. Alternatively, or in addition, the output may be stored at a repository, which may later be accessed by a user or an automated testing system. In some embodiments, a user, such as a developer, may modify code associated with the video game 112 based on the latency time measurement output at the block 322. In certain embodiments, an automated testing or development system may automatically adjust the code or parameters of the video game 112 based on the latency time measurement to obtain a particular target time measurement. In certain embodiments, such as when testing computing systems used in a video game competition (for example, in an esports competition), a user may modify the configuration of the user computing system 110 based at least in part on the output of the latency time measurement at the block 322.
In some embodiments, the front-end test system 104 may determine a subsequent instruction to load and/or communicate to the user computing system executing the video game under test based at least in part on the filtered adjusted latency time measurement determined at the block 320. This subsequent instruction may be a repeat of the same instruction previously communicated at the block 304, or may be a different instruction.
Further, in some embodiments, a determination of the instruction to load, or of when to provide the instruction to the user computing system, may be made based at least in part on embedded data included in the subset of pixels. Advantageously, in certain embodiments, automated testing can be performed that includes determining particular instructions to provide and a timing of when to provide the instructions based on a detection of a particular state of the video game under test. This particular state of the video game under test may be communicated to the front-end test system 104 by the data embedded in the subset of pixels.
In some embodiments, a measure of efficiency of the code of the video game can be determined based at least in part on the measured latency of one or more instructions provided at the block 304. Further, efficiency at different points within the video game can be compared by measuring the latency of commands performed during different states of the video game. In addition, the latency of commands performed for different versions of the video game can be compared to determine a relative efficiency between different versions of the video game that may operate on different types of computing systems and/or on the same type of computing system, but with changes in the code of the video game 112.
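One simple way to turn such latency measurements into a relative-efficiency figure between two builds is a ratio of mean latencies for the same command. This is only an illustrative aggregation under assumed names, not a method prescribed by the disclosure:

```python
from statistics import mean


def relative_efficiency(latencies_a_ms, latencies_b_ms):
    """Compare the mean measured latency of the same command across two
    builds (or two target platforms). Returns mean(B) / mean(A); values
    above 1.0 indicate build B executed the command more slowly.
    Names and the ratio metric are illustrative assumptions.
    """
    return mean(latencies_b_ms) / mean(latencies_a_ms)
```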
Although this disclosure primarily relates to using visual output to test the video game 112, in certain embodiments audio output may be used. For example, the video game 112 may play a particular sound when an event occurs. The sound may be captured by the backend test system 106 to indicate that a timer should cease and that a latency measurement should be determined.
Example Output Frame with Embedded Data
FIG. 4 illustrates an example output frame 400 including embedded data in accordance with certain embodiments. The output frame 400 may be one frame of an animation that is output by a video game 112. This animation may be part of a video or non-interactive animation being played, or may be part of an interactive scene that changes in response to inputs from a user. A portion of the output frame 400 may be configured to include embedded data that has been inserted into the frame 400 by an API, SDK, or library used during the development of the video game 112. This portion of the output frame may be a particular set of pixels designated to have certain colors or opaqueness values that correspond to information that the developer wants to obtain from the video game 112. This information may be the occurrence or triggering of an event within the video game 112. This portion of the frame 400 may be referred to as the payload 402, and the embedded information may be referred to as the payload data.
In certain embodiments, different colors or opaqueness values may indicate different information or the occurrence of different events within the video game 112. Similarly, different pixels within the payload 402 may be associated with different events being monitored. It should be understood that the payload 402 of the frame 400 is the medium of communication that identifies the event that has been triggered by, for example, a received command. However, the event may or may not be the occurrence of the animation or frame of the animation itself. For example, the event may be the setting of a particular state within the video game 112 that may not be visible to a user, such as the adding of an item to a character's inventory. Although a user may access the inventory to see the item, the inventory may not necessarily be displayed at the time that the item is added to the inventory. A tester may want to determine how quickly the item is added to the user's inventory when the user interacts with the item to pick it up. As another example, the event may be the playing of a sound, or the shooting of an enemy that is not visible at a particular point in time when the projectile hits the enemy.
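As a concrete, hypothetical instance of such a payload scheme, the sketch below assigns one pixel per monitored event and marks a triggered event with a designated color. The event names, pixel layout, and stop color are assumptions for illustration, not drawn from the patent:

```python
# Hypothetical payload layout: each monitored event owns one pixel index in
# a designated payload region of the frame; a designated color marks the
# event as triggered (a stop condition for that event's timer).
PAYLOAD_PIXELS = {"shot_fired": 0, "item_added": 1, "sound_played": 2}
STOP_COLOR = (255, 0, 0)  # illustrative stop-condition color


def embed_event(payload_row, event):
    """Game-side: mark an event's pixel in the payload row of a frame."""
    row = list(payload_row)  # copy so the caller's row is untouched
    row[PAYLOAD_PIXELS[event]] = STOP_COLOR
    return row


def decode_events(payload_row):
    """Test-side: return the events whose pixels carry a stop condition."""
    return [event for event, idx in PAYLOAD_PIXELS.items()
            if tuple(payload_row[idx]) == STOP_COLOR]
```

In practice the encoding would be inserted by the game's instrumentation library and the decoding performed by the backend test system on captured frames; this pairing only illustrates that the pixel region functions as a one-way message channel.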
FIG. 5 illustrates an expanded view of a portion of the example output frame of FIG. 4 in accordance with certain embodiments. The line 502 may indicate the occurrence of an event in the video game 112. Comparing the line 502 to the line 504, it can be seen that the embedded data may be of different colors and may be spread across a different number of pixels. Each of the lines 502 and 504 may be associated with a different event being monitored. In some embodiments, the embedded data is a stop tag or includes data identifying that a stop condition has occurred. The gap 506 between the lines 502 and 504 may correspond to events that have not yet occurred or been triggered and, thus, no embedded data is included in the gap 506.
Although the pixels that include the embedded data are visible in FIGS. 4 and 5, it should be understood that the pixels that include the embedded data may in some cases not be visible. For example, the pixels may be few enough in number to not be visible to an observer. As another example, the pixels with the embedded data may blend into the animation frame being displayed.
Overview of Computing System
FIG. 6 illustrates an embodiment of a user computing system 110, which may also be referred to as a gaming system. As illustrated, the user computing system 110 may be a single computing device that can include a number of elements. However, in some cases, the user computing system 110 may include multiple devices. For example, the user computing system 110 may include one device that includes a central processing unit and a graphics processing unit, another device that includes a display, and another device that includes an input mechanism, such as a keyboard or mouse.
The user computing system 110 can be an embodiment of a computing system that can execute a game system. In the non-limiting example of FIG. 6, the user computing system 110 is a touch-capable computing device capable of receiving input from a user via a touchscreen display 602. However, the user computing system 110 is not limited as such and may include non-touch-capable embodiments, which do not include a touchscreen display 602.
The user computing system 110 includes a touchscreen display 602 and a touchscreen interface 604, and is configured to execute a game application 610. This game application may be the video game 112 or an application that executes in conjunction with or in support of the video game 112, such as a video game execution environment. Although described as a game application 610, in some embodiments the application 610 may be another type of application that may have a variable execution state based at least in part on the preferences or capabilities of a user, such as educational software. While the user computing system 110 includes the touchscreen display 602, it is recognized that a variety of input devices may be used in addition to or in place of the touchscreen display 602.
The user computing system 110 can include one or more processors, such as central processing units (CPUs), graphics processing units (GPUs), and accelerated processing units (APUs). Further, the user computing system 110 may include one or more data storage elements. In some embodiments, the user computing system 110 can be a specialized computing device created for the purpose of executing game applications 610. For example, the user computing system 110 may be a video game console. The game applications 610 executed by the user computing system 110 may be created using a particular application programming interface (API) or compiled into a particular instruction set that may be specific to the user computing system 110. In some embodiments, the user computing system 110 may be a general-purpose computing device capable of executing game applications 610 and non-game applications. For example, the user computing system 110 may be a laptop with an integrated touchscreen display or a desktop computer with an external touchscreen display. Components of an example embodiment of a user computing system 110 are described in more detail with respect to FIG. 7.
The touchscreen display 602 can be a capacitive touchscreen, a resistive touchscreen, a surface acoustic wave touchscreen, or another type of touchscreen technology that is configured to receive tactile inputs, also referred to as touch inputs, from a user. For example, the touch inputs can be received via a finger touching the screen, multiple fingers touching the screen, a stylus, or other stimuli that can be used to register a touch input on the touchscreen display 602. The touchscreen interface 604 can be configured to translate the touch input into data and output the data such that it can be interpreted by components of the user computing system 110, such as an operating system and the game application 610. The touchscreen interface 604 can translate characteristics of the tactile touch input into touch input data. Some example characteristics of a touch input can include shape, size, pressure, location, direction, momentum, duration, and/or other characteristics. The touchscreen interface 604 can be configured to determine the type of touch input, such as, for example, a tap (for example, touch and release at a single location) or a swipe (for example, movement through a plurality of locations on the touchscreen in a single touch input). The touchscreen interface 604 can be configured to detect and output touch input data associated with multiple touch inputs occurring simultaneously or substantially in parallel. In some cases, the simultaneous touch inputs may include instances where a user maintains a first touch on the touchscreen display 602 while subsequently performing a second touch on the touchscreen display 602. The touchscreen interface 604 can be configured to detect movement of the touch inputs. The touch input data can be transmitted to components of the user computing system 110 for processing. For example, the touch input data can be transmitted directly to the game application 610 for processing.
In some embodiments, the touch input data can undergo processing and/or filtering by the touchscreen interface 604, an operating system, or other components prior to being output to the game application 610. As one example, raw touch input data can be captured from a touch input. The raw data can be filtered to remove background noise, pressure values associated with the input can be measured, and location coordinates associated with the touch input can be calculated. The type of touch input data provided to the game application 610 can be dependent upon the specific implementation of the touchscreen interface 604 and the particular API associated with the touchscreen interface 604. In some embodiments, the touch input data can include location coordinates of the touch input. The touch signal data can be output at a defined frequency. Processing the touch inputs can be computed many times per second, and the touch input data can be output to the game application for further processing.
A game application 610 can be configured to be executed on the user computing system 110. The game application 610 may also be referred to as a video game, a game, game code, and/or a game program. A game application should be understood to include software code that a user computing system 110 can use to provide a game for a user to play. A game application 610 might comprise software code that informs a user computing system 110 of processor instructions to execute, but might also include data used in the playing of the game, such as data relating to constants, images, and other data structures. For example, in the illustrated embodiment, the game application includes a game engine 612, game data 614, and game state information 616.
The touchscreen interface 604 or another component of the user computing system 110, such as the operating system, can provide user input, such as touch inputs, to the game application 610. In some embodiments, the user computing system 110 may include alternative or additional user input devices, such as a mouse, a keyboard, a camera, a game controller, and the like. A user can interact with the game application 610 via the touchscreen interface 604 and/or one or more of the alternative or additional user input devices. The game engine 612 can be configured to execute aspects of the operation of the game application 610 within the user computing system 110. Execution of aspects of gameplay within a game application can be based, at least in part, on the user input received, the game data 614, and the game state information 616. The game data 614 can include game rules, prerecorded motion capture poses/paths, environmental settings, constraints, animation reference curves, skeleton models, and/or other game application information. Further, the game data 614 may include information that is used to set or adjust the difficulty of the game application 610.
The game engine 612 can execute gameplay within the game according to the game rules. Some examples of game rules can include rules for scoring, possible inputs, actions/events, movement in response to inputs, and the like. Other components can control what inputs are accepted, how the game progresses, and other aspects of gameplay. During execution of the game application 610, the game application 610 can store game state information 616, which can include character states, environment states, scene object storage, and/or other information associated with a state of execution of the game application 610. For example, the game state information 616 can identify the state of the game application at a specific point in time, such as a character position, character action, game level attributes, and other information contributing to a state of the game application.
The game engine 612 can receive the user inputs and determine in-game events, such as actions, collisions, runs, throws, attacks, and other events appropriate for the game application 610. During operation, the game engine 612 can read in game data 614 and game state information 616 in order to determine the appropriate in-game events. In one example, after the game engine 612 determines the character events, the character events can be conveyed to a movement engine that can determine the appropriate motions the characters should make in response to the events and pass those motions on to an animation engine. The animation engine can determine new poses for the characters and provide the new poses to a skinning and rendering engine. The skinning and rendering engine, in turn, can provide character images to an object combiner in order to combine animate, inanimate, and background objects into a full scene. The full scene can be conveyed to a renderer, which can generate a new frame for display to the user. The process can be repeated for rendering each frame during execution of the game application. Though the process has been described in the context of a character, the process can be applied to any process for processing events and rendering the output for display to a user.
Example Hardware Configuration of Computing System
FIG. 7 illustrates an embodiment of a hardware configuration for the user computing system 110 of FIG. 6. Other variations of the user computing system 110 may be substituted for the examples explicitly presented herein, such as removing or adding components to the user computing system 110. The user computing system 110 may include a dedicated game device, a smart phone, a tablet, a personal computer, a desktop, a laptop, a smart television, a car console display, and the like. Further (although not explicitly illustrated in FIG. 7), as described with respect to FIG. 6, the user computing system 110 may optionally include a touchscreen display 602 and a touchscreen interface 604.
As shown, the user computing system 110 includes a processing unit 20 that interacts with other components of the user computing system 110 and also with components external to the user computing system 110. A game media reader 22 may be included that can communicate with game media 12. The game media reader 22 may be an optical disc reader capable of reading optical discs, such as CD-ROMs or DVDs, or any other type of reader that can receive and read data from game media 12. In some embodiments, the game media reader 22 may be optional or omitted. For example, game content or applications may be accessed over a network via the network I/O 38, rendering the game media reader 22 and/or the game media 12 optional.
The user computing system 110 may include a separate graphics processor 24. In some cases, the graphics processor 24 may be built into the processing unit 20, such as with an APU. In some such cases, the graphics processor 24 may share Random Access Memory (RAM) with the processing unit 20. Alternatively, or in addition, the user computing system 110 may include a discrete graphics processor 24 that is separate from the processing unit 20. In some such cases, the graphics processor 24 may have separate RAM from the processing unit 20. Further, in some cases, the graphics processor 24 may work in conjunction with one or more additional graphics processors and/or with an embedded or non-discrete graphics processing unit, which may be embedded into a motherboard and which is sometimes referred to as an on-board graphics chip or device.
The user computing system 110 also includes various components for enabling input/output, such as an I/O 32, a user I/O 34, a display I/O 36, and a network I/O 38. As previously described, the input/output components may, in some cases, include touch-enabled devices. The I/O 32 interacts with a storage element 40 and, through a device 42, with removable storage media 44 in order to provide storage for the computing device 110. The processing unit 20 can communicate through the I/O 32 to store data, such as game state data and any shared data files. In addition to the storage 40 and the removable storage media 44, the computing device 110 is also shown to include ROM (Read-Only Memory) 46 and RAM 48. The RAM 48 may be used for data that is accessed frequently, such as when a game is being played.
User I/O 34 is used to send and receive commands between the processing unit 20 and user devices, such as game controllers. In some embodiments, the user I/O 34 can include touchscreen inputs. As previously described, the touchscreen can be a capacitive touchscreen, a resistive touchscreen, or another type of touchscreen technology that is configured to receive user input through tactile inputs from the user. Display I/O 36 provides input/output functions that are used to display images from the game being played. Network I/O 38 is used for input/output functions for a network. Network I/O 38 may be used during execution of a game, such as when a game is being played online or being accessed online.
Display output signals may be produced by the display I/O 36 and can include signals for displaying visual content produced by the computing device 110 on a display device, such as graphics, user interfaces, video, and/or other visual content. The user computing system 110 may comprise one or more integrated displays configured to receive display output signals produced by the display I/O 36, which may be output for display to a user. According to some embodiments, display output signals produced by the display I/O 36 may also be output to one or more display devices external to the computing device 110.
The user computing system 110 can also include other features that may be used with a game, such as a clock 50, flash memory 52, and other components. An audio/video player 56 might also be used to play a video sequence, such as a movie. It should be understood that other components may be provided in the user computing system 110 and that a person skilled in the art will appreciate other variations of the user computing system 110.
Program code can be stored in ROM 46, RAM 48, or storage 40 (which might comprise a hard disk, other magnetic storage, optical storage, solid state drives, and/or other non-volatile storage, or a combination or variation of these). At least part of the program code can be stored in ROM that is programmable (ROM, PROM, EPROM, EEPROM, and so forth), in storage 40, and/or on removable media such as game media 12 (which can be a CD-ROM, cartridge, memory chip, or the like, or obtained over a network or other electronic channel as needed). In general, program code can be found embodied in a tangible non-transitory signal-bearing medium.
Random access memory (RAM) 48 (and possibly other storage) is usable to store variables and other game and processor data as needed. RAM is used to hold data that is generated during the play of the game, and portions thereof might also be reserved for frame buffers, game state, and/or other data needed or usable for interpreting user input and generating game displays. Generally, RAM 48 is volatile storage, and data stored within RAM 48 may be lost when the user computing system 110 is turned off or loses power.
As the user computing system 110 reads game media 12 and provides a game, information may be read from game media 12 and stored in a memory device, such as RAM 48. Additionally, data from storage 40, ROM 46, servers accessed via a network (not shown), or removable storage media 44 may be read and loaded into RAM 48. Although data is described as being found in RAM 48, it will be understood that data does not have to be stored in RAM 48 and may be stored in other memory accessible to the processing unit 20 or distributed among several media, such as game media 12 and storage 40.
Example Embodiments
Embodiments of the present disclosure can be described in view of the following clauses:
Clause 1. A video game test system configured to test an event latency during execution of a video game, the video game test system comprising:
a front-end system configured to:
access a first command that emulates an interaction by a user with a user interface device of a user computing system; and
provide the first command to the user computing system to interact with a video game hosted by the user computing system, wherein providing the first command to the user computing system triggers a timer; and
a back-end system configured to:
receive one or more output signals from the user computing system;
convert the one or more output signals to a set of pixels, the set of pixels corresponding to a frame output for display by the user computing system;
identify a presence of a stop condition embedded in the set of pixels; and
determine an event latency of an event based at least in part on a first time when the timer is triggered and a second time associated with identification of the presence of the stop condition, wherein the event is triggered at least in part by execution of the first command.
Clause 2. The video game test system of clause 1, wherein the front-end system comprises a non-volatile storage configured to store one or more command sequences corresponding to one or more interaction sequences with the user interface device, at least one of the one or more command sequences including the first command.
Clause 3. The video game test system of clause 1, wherein the front-end system comprises a storage configured to store the first command and a user interface circuit configured to provide the first command to the user computing system, and wherein the storage is collocated with the user interface circuit, reducing input latency associated with simulating user input to the user computing system.
Clause 4. The video game test system of clause 1, wherein the front-end system is further configured to trigger the timer substantially in parallel with providing the first command to the user computing system.
Clause 5. The video game test system of clause 1, wherein the front-end system is further configured to trigger a second timer when providing a second command to the user computing system.
Clause 6. The video game test system of clause 1, wherein the front-end system comprises a user interface circuit configured to emulate the interaction by the user with the user interface by providing the first command to the user computing system via an interface port of the user computing system.
Clause 7. The video game test system of clause 1, wherein the back-end system comprises the timer.
Clause 8. The video game test system of clause 1, wherein the back-end system is further configured to identify the stop condition by:
filtering a subset of pixels from the set of pixels, the subset of pixels configured to store embedded data;
decoding the subset of pixels to obtain the embedded data; and
determining whether the embedded data includes the stop condition.
Clause 9. The video game test system of clause 1, wherein the back-end system comprises a controller configured to provide a subset of pixels from the set of pixels to the timer.
Clause 10. The video game test system of clause 9, wherein the back-end system is further configured to determine the event latency of the event based at least in part on the first time, the second time, and a communication delay between the controller and the timer.
Clause 11. The video game test system of clause 1, wherein the front-end system comprises one or more integrated circuits and the back-end system comprises one or more integrated circuits that are separate from the front-end system.
Clause 12. The video game test system of clause 1, wherein the event comprises at least one of: output of an animation, output of a frame within the animation, output of a sound, a change in state of the video game, or a change in state of an element of the video game.
Clause 13. The video game test system of clause 1, wherein the front-end system is configured to modify a test of the video game based at least in part on the event latency of the event.
Clause 14. A method of testing an event latency during execution of a video game, the method comprising:
as implemented by a video game test system implemented in hardware,
receiving a trigger to test an event latency of an event within a video game, wherein the event latency comprises an amount of time between interaction with a user interface device of a user computing system hosting the video game and an occurrence of the event;
responsive to receiving the trigger, accessing a first command from a command sequence storage, the first command emulating the interaction by a user with the user interface device;
providing the first command to the user computing system via an interface of the user computing system configured to communicate with the user interface device, wherein the video game test system interfaces with the user computing system as a substitute for the user interface device;
receiving a set of output signals from an output port of the user computing system;
converting the set of output signals to a set of pixels;
identifying a stop flag embedded in the set of pixels; and
responsive to identifying the stop flag, determining the event latency of the event based at least in part on a first time associated with providing the first command and a second time associated with identifying the stop flag.
Clause 15. The method of clause 14, wherein determining the event latency further comprises modifying a determined latency by a communication overhead time associated with communication between elements of the video game test system.
Clause 16. The method of clause 14, wherein determining the event latency further comprises modifying a determined latency by a vertical synchronization latency associated with the user computing system.
Clause 17. The method of clause 14, further comprising filtering the set of pixels to obtain a subset of pixels comprising embedded data, wherein identifying the stop flag embedded in the set of pixels comprises extracting data embedded in the subset of pixels and determining whether the extracted data includes the stop flag.
Clause 18. The method of clause 14, further comprising outputting the event latency for presentation to a user on a user interface.
Clause 19. The method of clause 14, further comprising selecting a second command to provide to the user computing system based at least in part on the event latency.
Clause 20. A video game test system configured to test command execution latency during execution of a video game, the video game test system comprising: storage configured to store one or more commands that emulate interaction by a user with a user interface device of a user computing system; and processing circuitry configured to: access a command from the storage; provide the command to the user computing system to interact with a video game hosted by the user computing system; initiate a timer at a first time when providing the command to the user computing system; obtain a set of output signals from the user computing system, the output signals associated with a frame output for display on a display; convert the output signals to a set of pixels; process the set of pixels to obtain embedded data included in a subset of the set of pixels; stop the timer at a second time when it is determined that the embedded data includes a stop condition; and determine a command execution latency associated with the command based at least in part on the first time and the second time.
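Clauses 14 and 20 above describe the core measurement loop: provide an emulated controller command, start a timer, poll captured frames for an embedded stop flag, and report the elapsed time as the event latency. The following is a minimal Python sketch of that loop, not an implementation from the patent; `send_command` and `capture_frame_pixels` are hypothetical callables standing in for the controller interface and the video-capture path, and the `STOP_FLAG` value is an assumed marker.

```python
import time

STOP_FLAG = 0xA5  # assumed marker value embedded into frames by instrumented game code

def measure_event_latency(send_command, capture_frame_pixels, timeout_s=5.0):
    """Sketch of the test loop: emulate an input, then time how long
    until the stop flag appears in a captured frame."""
    send_command()                    # emulate the user interface device input
    start = time.monotonic()          # first time: command provided
    while time.monotonic() - start < timeout_s:
        pixels = capture_frame_pixels()       # one frame from the output port
        if STOP_FLAG in pixels:               # stop flag embedded in the frame
            return time.monotonic() - start   # second time minus first time
    raise TimeoutError("stop flag never observed in captured frames")
```

In a real harness the capture side would read from a video-capture device on the console's output port; here a frame is simply any sequence of pixel values that can be scanned for the marker.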
Additional embodiments of the present disclosure can be described in view of the following clauses:
Clause 1. An application test system configured to test code efficiency of an application, the application test system comprising: a hardware processor configured to: provide a user command to a computing system hosting an application, the user command simulating user interaction with the application at the computing system; initiate a timer in parallel with providing the user command to the computing system, wherein there is less than a threshold time difference between when the timer is initiated and the user command is provided to the computing system; capture output from the computing system that is output via a display port of the computing system; determine whether the output includes embedded data associated with a stop event; and upon determining that the output includes the embedded data associated with the stop event, determine a command latency value based on the timer.
Clause 2. The application test system of clause 1, wherein the timer measures a passage of time.
Clause 3. The application test system of clause 1, wherein the timer measures a number of events occurring between initiation of the timer and detection of the stop event.
Clause 4. The application test system of clause 3, wherein the events comprise frames output on a display.
Clause 5. The application test system of clause 1, wherein capturing the output from the computing system does not prevent the output from being provided to a display via the display port.
Clause 6. The application test system of clause 1, wherein the stop event comprises an event performed by the application in response to the user command.
Clause 7. The application test system of clause 1, wherein the application comprises a video game.
Clause 8. The application test system of clause 1, wherein the hardware processor is further configured to determine whether the output includes embedded data associated with the stop event by: converting the output to a set of pixels of an animation frame; decoding at least a portion of the set of pixels to obtain a decoded subset of pixels; and determining whether the decoded subset of pixels includes the embedded data associated with the stop event.
Clause 9. The application test system of clause 1, wherein the command latency value comprises a measure of time between an event trigger and an occurrence of a corresponding event, wherein the event trigger comprises providing the user command to the computing system.
Clause 10. The application test system of clause 1, wherein the hardware processor is further configured to select a second user command to provide to the computing system based at least in part on the command latency value.
Clause 11. A method of testing code efficiency of an application, the method comprising: as implemented by an application test system configured with specific computer-executable instructions, providing a user command to a computing system hosting an application, the user command simulating user interaction with the application at the computing system; initiating a counter substantially in parallel with providing the user command to the computing system; capturing output from the computing system that is output via an output port of the computing system; determining whether the output includes data associated with a target event; and upon determining that the output includes the data associated with the target event, determining a command latency based on a value of the counter.
Clause 12. The method of clause 11, wherein the counter counts an amount of time that has elapsed between initiation of the counter and determination that the output includes the data associated with the target event.
Clause 13. The method of clause 11, wherein the counter counts a number of events that have occurred or frames that have been output between initiation of the counter and determination that the output includes the data associated with the target event.
Clause 14. The method of clause 11, wherein the target event comprises an event performed by the application in response to the user command and a state of the application.
Clause 15. The method of clause 11, wherein determining whether the output includes data associated with a target event comprises: converting the output to a set of pixels of an image; decoding at least a portion of the set of pixels to obtain a decoded subset of pixels; and determining whether the decoded subset of pixels includes the data associated with the target event.
Clause 16. The method of clause 15, wherein the data associated with the target event is inserted into the image as a substitute for pixel data of the image by test code inserted into the application.
Clause 17. The method of clause 11, further comprising selecting a second user command to provide to the computing system based at least in part on the command latency.
Clause 18. A non-transitory computer-readable storage medium storing computer executable instructions that, when executed by one or more computing devices, configure the one or more computing devices to perform operations comprising: providing a user command to a computing system hosting an application, the user command simulating user interaction with the application at the computing system; initiating a counter substantially in parallel with providing the user command to the computing system; capturing output from the computing system that is output via an output port of the computing system; determining that the output includes data associated with a target event; and responsive to determining that the output includes the data associated with the target event, determining a command latency based on a value of the counter.
Clause 19. The computer-readable, non-transitory storage medium of clause 18, wherein determining that the output includes data associated with a target event comprises: converting the output to a set of pixels of an image; decoding at least a portion of the set of pixels to obtain a decoded set of pixels; and determining that the decoded set of pixels includes the data associated with the target event.
Clause 20. The computer-readable, non-transitory storage medium of clause 18, wherein the operations further comprise selecting an automated test to perform with respect to the application based at least in part on the command latency.
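Clauses 8 and 15-16 of this set describe test code that substitutes pixel data in a rendered image with embedded event data, which the test system later decodes. Below is a minimal sketch of one such scheme under assumptions not stated in the patent: the marker occupies the red channel of the first few pixels of the top row, and a frame is a list of rows of (r, g, b) tuples. The marker value and its location are illustrative only.

```python
MARKER = (0xDE, 0xAD)  # hypothetical two-byte target-event code

def embed_marker(frame, marker=MARKER):
    """Return a copy of `frame` with the marker bytes written into the
    red channel of the first len(marker) pixels of the top row."""
    out = [list(row) for row in frame]          # shallow copy of each row
    for i, byte in enumerate(marker):
        _, g, b = out[0][i]                     # keep green/blue channels
        out[0][i] = (byte, g, b)                # substitute red channel
    return out

def decode_marker(frame, length=len(MARKER)):
    """Read back the red channel of the first `length` top-row pixels."""
    return tuple(frame[0][i][0] for i in range(length))
```

Because only a handful of pixels in one corner are overwritten, the substituted data is effectively invisible in normal play while remaining trivially recoverable from a captured frame.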
Additional Embodiments
It is to be understood that not necessarily all objects or advantages may be achieved in accordance with any particular embodiment described herein. Thus, for example, those skilled in the art will recognize that certain embodiments may be configured to operate in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other objects or advantages as may be taught or suggested herein.
All of the processes described herein may be embodied in, and fully automated via, software code modules executed by a computing system that includes one or more computers or processors. The code modules may be stored in any type of non-transitory computer-readable medium or other computer storage device. Some or all the methods may be embodied in specialized computer hardware.
Many other variations than those described herein will be apparent from this disclosure. For example, depending on the embodiment, certain acts, events, or functions of any of the algorithms described herein can be performed in a different sequence, can be added, merged, or left out altogether (for example, not all described acts or events are necessary for the practice of the algorithms). Moreover, in certain embodiments, acts or events can be performed concurrently, for example, through multi-threaded processing, interrupt processing, or multiple processors or processor cores or on other parallel architectures, rather than sequentially. In addition, different tasks or processes can be performed by different machines and/or computing systems that can function together.
The various illustrative logical blocks and modules described in connection with the embodiments disclosed herein can be implemented or performed by a machine, such as a processing unit or processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A processor can be a microprocessor, but in the alternative, the processor can be a controller, microcontroller, or state machine, combinations of the same, or the like. A processor can include electrical circuitry configured to process computer-executable instructions. In another embodiment, a processor includes an FPGA or other programmable device that performs logic operations without processing computer-executable instructions. A processor can also be implemented as a combination of computing devices, for example, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Although described herein primarily with respect to digital technology, a processor may also include primarily analog components. A computing environment can include any type of computer system, including, but not limited to, a computer system based on a microprocessor, a mainframe computer, a digital signal processor, a portable computing device, a device controller, or a computational engine within an appliance, to name a few.
Conditional language such as, among others, “can,” “could,” “might” or “may,” unless specifically stated otherwise, are otherwise understood within the context as used in general to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.
Disjunctive language such as the phrase “at least one of X, Y, or Z,” unless specifically stated otherwise, is otherwise understood with the context as used in general to present that an item, term, etc., may be either X, Y, or Z, or any combination thereof (for example, X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain embodiments require at least one of X, at least one of Y, or at least one of Z to each be present.
Any process descriptions, elements or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or elements in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown, or discussed, including substantially concurrently or in reverse order, depending on the functionality involved as would be understood by those skilled in the art.
Unless otherwise explicitly stated, articles such as “a” or “an” should generally be interpreted to include one or more described items. Accordingly, phrases such as “a device configured to” are intended to include one or more recited devices. Such one or more recited devices can also be collectively configured to carry out the stated recitations. For example, “a processor configured to carry out recitations A, B and C” can include a first processor configured to carry out recitation A working in conjunction with a second processor configured to carry out recitations B and C.
It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure.
Claims
1. An application test system configured to test an event latency during execution of an application, the application test system comprising: a hardware processor configured to: provide a first command to a user computing system in communication with the application test system, wherein the user computing system is configured to execute the application; trigger a timer associated with providing the first command to the user computing system; receive a first output signal from the user computing system; convert the first output signal to a first set of pixels corresponding to a first frame of an animation generated by the application; decode a first subset of pixels to obtain first embedded data that is embedded in the first frame of the animation; determine that the first embedded data identifies an occurrence of a first event triggered at least in part by the execution of the first command provided to the user computing system; and determine a first event latency based at least in part on a difference between when the first embedded data that identifies the occurrence of the first event is obtained and when the timer is triggered.
2. The application test system of claim 1, wherein the application comprises a video game.
3. The application test system of claim 1, wherein the hardware processor is further configured to trigger a plurality of timers associated with providing the first command to the user computing system.
4. The application test system of claim 3, wherein at least one timer of the plurality of timers is associated with a different stop condition than at least one other timer of the plurality of timers.
5. The application test system of claim 3, wherein each timer of the plurality of timers is associated with a different event triggered at least in part by the execution of the first command provided to the user computing system.
6. The application test system of claim 1, wherein the hardware processor is further configured to: receive a second output signal from the user computing system, wherein the second output signal is received later than the first output signal; convert the second output signal to a second set of pixels corresponding to a second frame of the animation generated by the application; decode a second subset of pixels to obtain second embedded data that is embedded in the second frame of the animation; determine that the second embedded data identifies an occurrence of a second event triggered at least in part by the execution of the first command provided to the user computing system; and determine a second event latency based at least in part on a difference between when the second embedded data that identifies the occurrence of the second event is detected and when the timer is triggered.
7. The application test system of claim 1, wherein the first output signal is received from a video output port of the user computing system that is configured to provide the first output signal to a display system for presentation of the animation to a user.
8. The application test system of claim 1, wherein the first output signal is received from a signal splitter positioned between a video output port of the user computing system and the application test system, and wherein the signal splitter is configured to provide the first output signal to the application test system without preventing the first output signal from being provided to a display system for presentation of the animation to a user.
9. The application test system of claim 1, wherein the hardware processor is further configured to: receive a second output signal from the user computing system; convert the second output signal to a second set of pixels corresponding to a second frame of the animation generated by the application; determine that the second set of pixels do not include the first embedded data; and continue to run the timer at least until the first embedded data is determined to be present in the first set of pixels of the first output signal.
10. The application test system of claim 1, wherein the difference is a time-based difference and wherein the first event latency corresponds to a passage of time between the triggering of the timer and the obtaining of the first embedded data that identifies the occurrence of the first event.
11. The application test system of claim 1, wherein the difference corresponds to animation frames and wherein the first event latency corresponds to a count of frames of the animation output by the user computing system between the triggering of the timer and the obtaining of the first embedded data that identifies the occurrence of the first event.
12. The application test system of claim 1, wherein the first event corresponds to a non-animation based event.
13. The application test system of claim 1, wherein the hardware processor is further configured to adjust a value of the first event latency based at least in part on a delay associated with operation of the application test system.
14. The application test system of claim 1, wherein the hardware processor is further configured to adjust a value of the first event latency based at least in part on latency noise associated with a configuration of the user computing system.
15. A method of testing an event latency during execution of an application, the method comprising: as implemented by an application test system comprising a hardware processor, providing a first command to a user computing system in communication with the application test system, wherein the user computing system is configured to execute the application; triggering a timer associated with providing the first command to the user computing system; receiving a first output signal from the user computing system; converting the first output signal to a first set of pixels corresponding to a first frame of an animation generated by the application; decoding a first subset of pixels to obtain first embedded data that is embedded in the first frame of the animation; determining that the first embedded data identifies an occurrence of a first event triggered at least in part by the execution of the first command provided to the user computing system; and determining a first event latency based at least in part on a difference between when the first embedded data that identifies the occurrence of the first event is obtained and when the timer is triggered.
16. The method of claim 15, wherein the timer is one of a plurality of timers triggered when providing the first command to the user computing system, and wherein each timer of the plurality of timers corresponds to a different event that occurs in response to the application performing the first command.
17. The method of claim 15, further comprising: receiving a second output signal from the user computing system, wherein the second output signal is received later than the first output signal; converting the second output signal to a second set of pixels corresponding to a second frame of the animation generated by the application; decoding a second subset of pixels to obtain second embedded data that is embedded in the second frame of the animation; determining that the second embedded data identifies an occurrence of a second event triggered at least in part by the execution of the first command provided to the user computing system; and determining a second event latency based at least in part on a difference between when the second embedded data that identifies the occurrence of the second event is obtained and when the timer is triggered.
18. The method of claim 15, further comprising receiving the first output signal from the user computing system without disrupting display of the animation on a display system in communication with the user computing system.
19. The method of claim 15, further comprising: receiving a second output signal from the user computing system; converting the second output signal to a second set of pixels corresponding to a second frame of the animation generated by the application; determining that the second set of pixels do not include the first embedded data; and continuing to execute the timer at least until the first embedded data is determined to be present in the first set of pixels of the first output signal.
20. The method of claim 15, further comprising adjusting a value of the first event latency based at least in part on a delay associated with operation of the application test system and/or latency noise associated with a configuration of the user computing system.
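Claims 13-14 and 20 (and clauses 15-16 of the first clause set) adjust the measured value by delays attributable to the test harness or display path rather than the game itself, and claim 11 expresses latency as a count of animation frames rather than seconds. A minimal sketch of both ideas follows; the default overhead figures are illustrative placeholders, not values from the patent.

```python
def adjusted_latency(raw_s, comm_overhead_s=0.002, vsync_latency_s=0.008):
    """Subtract test-system communication overhead and vertical-sync
    latency from a raw time measurement, clamping at zero."""
    return max(0.0, raw_s - comm_overhead_s - vsync_latency_s)

def latency_in_frames(raw_s, frames_per_second=60):
    """Express a time-based latency as a count of output frames at a
    given refresh rate (cf. claim 11's frame-count difference)."""
    return round(raw_s * frames_per_second)
```

Counting frames rather than milliseconds makes results comparable across capture hardware with different clock accuracy, since a frame is the smallest unit of visible change the player can perceive.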