U.S. Pat. No. 11,465,051
SYSTEMS AND METHODS FOR COACHING A USER FOR GAME PLAY
Assignee: Sony Interactive Entertainment Inc.
Issue Date: June 26, 2020
Abstract
A method for processing a self-coaching interface is described. The method includes identifying a gameplay event during gameplay by a user. The gameplay event is tagged as falling below a skill threshold. The method further includes generating a recording for a window of time for the gameplay event and processing game telemetry for the recording of the gameplay event. The game telemetry is used to identify a progression of interactive actions before the gameplay event for the window of time. The method includes generating overlay content in the self-coaching interface. The overlay content is applied to one or more image frames of the recording when viewed via the self-coaching interface. The overlay content appears in the one or more image frames during a playback of the recording. The overlay content provides hints for increasing a skill of the user to be above the skill threshold.
Description
DETAILED DESCRIPTION
Systems and methods for coaching a user for game play are described. It should be noted that various embodiments of the present disclosure may be practiced without some or all of the specific details set forth herein. In other instances, well-known process operations have not been described in detail in order not to unnecessarily obscure various embodiments of the present disclosure.
FIG. 1 is a diagram of an embodiment of a system 100 to illustrate a play of a game. The system 100 includes a display device 1 that displays a virtual scene 110 of the game. The display device 1 has a camera 101 that captures one or more images of a real-world environment in front of the camera 101. For example, the camera 101 captures one or more images within a field-of-view of the camera 101. As an example, each virtual scene is an image frame. In one embodiment, the terms image and image frame are used herein interchangeably. Examples of a display device, as used herein, include a smart television, a television, a plasma display device, a liquid crystal display (LCD) device, and a light emitting diode (LED) display device. A user 1 is playing the game, such as a video game, by using a game controller 1. Examples of the video game include a single-player game or a multi-player game. Examples of a game controller 1 include a Playstation™ controller, an Xbox™ controller, and a Nintendo™ controller. As an example, a game controller includes multiple joysticks and multiple buttons.
The display device 1 displays multiple images of the game. The user 1 logs into a user account 1 that is assigned to the user 1 by a server system to access the game from the server system. For example, a user identity (ID) assigned to a user and a password assigned to the user by the server system are authenticated by the server system to determine whether to allow the user to access the game.
Upon accessing the game, the user 1 plays the game by using the game controller 1. During the play of the game, the virtual scene 110 is generated and a user ID 1 assigned to the user 1 is displayed in the virtual scene 110.
In the game, the user 1 uses the game controller 1 to build a virtual ramp 112 on which a virtual user 102 can climb to have a height advantage over another virtual user 104. The virtual user 102, the virtual ramp 112, and the virtual user 104 are within the virtual scene 110. Also, in the game, movement of the virtual user 102 is controlled by the user 1 with the game controller 1. The user 1 uses the game controller 1 to control a virtual gun 108 that is held by the virtual user 102. For example, the user 1 uses the game controller 1 to control the virtual gun 108 to shoot at the virtual user 104. The virtual gun 108 is a portion of the virtual scene 110. The virtual user 104 is controlled by another user, such as a user 2 or a user 3, in the multi-player game.
During the play of the game, in the virtual scene 110, a virtual bullet 106 is directed towards the virtual user 102 while the virtual user 102 is shooting at the virtual user 104 and the virtual user 104 is shooting at the virtual user 102. The virtual bullet 106 is directed towards the virtual user 102 from a side of the virtual user 102. The virtual scene 110 does not include a virtual user that shot the virtual bullet 106 at the virtual user 102. The virtual scene 110 includes a virtual tree 105 and a virtual wall 107. It should be noted that each of the virtual tree 105, the virtual wall 107, the virtual ramp 112, the virtual user 102, the virtual gun 108, the virtual bullet 106, and the virtual user 104 is an example of a virtual object.
In one embodiment, the server system authenticates a user ID and a password to allow a user to access multiple games for game play.
In an embodiment, instead of the display device 1, a head-mounted display (HMD) is used. The HMD is worn by the user 1 on his/her head.
FIG. 2 is a diagram of an embodiment of the system 100 to illustrate a virtual scene 202 of the game. The virtual scene 202 is displayed on the display device 1 and follows the virtual scene 110 during the play of the game. For example, both the virtual scenes 110 and 202 are displayed during the same gaming session. The user 1 is not able to save the virtual user 102 from being hit by the virtual bullet 106 (FIG. 1). After the virtual bullet 106 hits the virtual user 102, the virtual user 102 dies in the game and is beamed by a virtual drone 204 of the virtual scene 202. Also, before the virtual user 102 dies in the game, the user 1 manages to control the game controller 1 to further control the virtual user 102 and the virtual gun 108 (FIG. 1) to kill the virtual user 104. When the virtual user 104 is killed, the virtual user 104 is beamed by another virtual drone 206 in the virtual scene 202. Also, in the virtual scene 202, the user ID 1 is displayed.
FIG. 3 is a diagram of an embodiment of the system 100 to illustrate another virtual scene 302 of the game. The virtual scene 302 is displayed after the virtual scene 202 (FIG. 2) is displayed or before the virtual scene 110 (FIG. 1) is displayed. In the virtual scene 302, the virtual user 102 is controlled by the user 1 via the game controller 1. The virtual scene 302 is displayed on the display device 1. The virtual user 102 is controlled to use the virtual gun 108 to shoot at the virtual user 104. Also, the virtual scene 302 includes a virtual wall 304 built by the virtual user 102. For example, the user 1 uses the game controller 1 to control the virtual user 102 to build the virtual wall 304 to protect the virtual user 102 from virtual bullets being shot by the virtual user 104.
In the virtual scene 302, a virtual bullet 306 hits the virtual user 102 from behind. The user 1 cannot see, from the virtual scene 302, which virtual user shot the virtual bullet 306 at the virtual user 102. The virtual scene 302 does not include a virtual user that shot the virtual bullet 306 at the virtual user 102.
It should be noted that the virtual scene 302 is displayed on the display device 1 during a gaming session that is the same as or different from a gaming session in which the virtual scenes 110 and 202 of FIGS. 1 and 2 are displayed. For example, after the virtual user 102 dies in the virtual scene 202, a game program of the game is executed by a processor of the server system to provide a waiting time period in which the user 1 waits for the virtual user 102 to respawn or rejuvenate or come alive. An example of a processor, as used herein, includes a microprocessor or a microcontroller or a central processing unit (CPU) or an application-specific integrated circuit (ASIC) or a programmable logic device (PLD). The waiting time period occurs before the game displays the virtual scene 302. In this example, the user 1 does not log out of the user account 1 between the display of the virtual scenes 202 and 302, and so the virtual scenes 110, 202, and 302 are displayed during the same gaming session. As another example, after the virtual user 102 dies in the virtual scene 202, the user 1 logs out of the user account 1. The user 1 later logs into the user account 1. After the user 1 logs into the user account 1, the virtual scene 302 is displayed. In this example, the virtual scene 302 is displayed during a gaming session that is different from the gaming session in which the virtual scenes 110 and 202 are displayed.
In one embodiment, a gaming session ends at a time the virtual user 102 that is controlled by the user 1 either dies or wins the gaming session. The virtual user 102 wins the gaming session when the virtual user 102 kills all other virtual users in the gaming session or survives beyond a prestored time period during the gaming session. Once the virtual user 102 dies, the virtual user 102 can be respawned and another gaming session begins, and the other gaming session ends either when the virtual user 102 dies or wins the other gaming session.
FIG. 4 is a diagram of an embodiment of the system 100 to illustrate yet another virtual scene 402 in which the virtual user 102 dies. The virtual scene 402 is displayed after the virtual scene 302 (FIG. 3) is displayed. For example, both the virtual scenes 302 and 402 are displayed during the same gaming session. In the virtual scene 402, the virtual users 102 and 104 are killed. The virtual user 102 is killed by the virtual bullet 306 (FIG. 3). The virtual scene 402 includes the virtual drones 204 and 206 used to beam in the virtual users 102 and 104.
FIG. 5A is a diagram of an embodiment of a system 500 to illustrate the recording of game event data 510 by a game recorder 502. The system 500 includes a processor system 505. An example of the processor system 505 is a server system, which includes one or more servers of a data center or of multiple data centers. Another example of the processor system 505 includes one or more processors of the server system. Yet another example of the processor system 505 includes one or more virtual machines. An example of the game recorder 502 is a digital video recorder that records both video and audio data of the game event data 510. The processor system 505 includes the game recorder 502, a game processor 506, and a coaching processor 508. The game processor 506 is coupled to the game recorder 502 and to the coaching processor 508. The coaching processor 508 is coupled to the game recorder 502.
The game processor 506 executes a game program 528 to facilitate the play of the game by the user 1 and by other users. As an example, the game program 528 includes a game engine and a rendering engine. The game engine is executed for determining positions and orientations of various virtual objects of a virtual scene. The virtual scene includes a background, which is an example of a virtual object. The rendering engine is executed for determining graphics parameters, such as color, intensity, shading, and lighting, of the virtual objects of the virtual scene. The game program 528 is stored in one or more memory devices of the processor system 505 and the one or more memory devices are coupled to the game processor 506. When the game program 528 is executed, the game event data 510 is generated and the virtual scenes 110, 202, 302, and 402 (FIGS. 1-4) are displayed on the display device 1.
The game event data 510 generated is for the user ID 1. For example, the virtual scenes 110 (FIG. 1), 202 (FIG. 2), 302 (FIG. 3), and 402 (FIG. 4) of the game event data 510 are generated upon execution of the game program 528. The game program 528 is executed when the user ID 1 and other information, such as a password or a phone number or a combination thereof, of the user 1 are authenticated by the processor system 505.
As an example, the game event data 510 includes multiple game events 1 through m leading up to a death 1 of the virtual user 102 that is illustrated in the virtual scene 202 (FIG. 2), where m is a positive integer. To illustrate, the game event data 510 includes data of the virtual scenes 110 (FIG. 1) and 202. To further illustrate, a game event 512 is data of the virtual scene 110 and a game event 514 is data of the virtual scene 202. As another example, the game event data 510 includes multiple game events 1 through n leading up to a death x of the virtual user 102 illustrated in the virtual scene 402 (FIG. 4), where n and x are positive integers. To illustrate, the game event data 510 includes data of the virtual scenes 302 (FIG. 3) and 402. To further illustrate, a game event 516 is data of the virtual scene 302 and a game event 518 is data of the virtual scene 402. As another example, the game event data 510 includes multiple game events 1 through o leading up to a decrease in a health level of the virtual user 102 to below a predetermined threshold, where o is a positive integer. To illustrate, a game event 520 is data of a virtual scene in which the virtual user 102 has full health and a game event 522 is data of a virtual scene in which the virtual user 102's health decreases to below the predetermined threshold. As yet another example, the game event data 510 includes multiple game events 1 through p leading up to a decrease in the health level of the virtual user 102 to below the predetermined threshold, where p is a positive integer. To illustrate, a game event 524 is data of a virtual scene in which the virtual user 102 has full health and a game event 526 is data of a virtual scene in which the virtual user 102's health decreases to below the predetermined threshold. It should be noted that a series of game events from the game event 520 to the game event 522 occurs during the same or a different gaming session than an occurrence of a series of game events from the game event 524 to the game event 526.
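The series of game events described above (e.g., events 1 through m ending in a death, or events 1 through o ending in a health drop) can be modeled as a simple ordered structure. A minimal sketch in Python; the class names, fields, and flat event layout are hypothetical illustrations, not taken from the patent:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class GameEvent:
    event_id: int                    # e.g., 512, 514, 516, 518
    scene_data: dict                 # data of the virtual scene for this event
    terminal: Optional[str] = None   # "death", "health_below_threshold", or None

@dataclass
class GameEventData:
    user_id: int
    events: list = field(default_factory=list)

    def series_ending_in(self, terminal):
        """Split the recorded events into series, each ending at a terminal
        event of the given type (e.g., game events 1..m leading to a death)."""
        series, current = [], []
        for ev in self.events:
            current.append(ev)
            if ev.terminal == terminal:
                series.append(current)
                current = []
        return series

# Hypothetical recording for the user ID 1 mirroring FIG. 5A:
data = GameEventData(user_id=1, events=[
    GameEvent(512, {}),            # data of the virtual scene 110
    GameEvent(514, {}, "death"),   # data of the virtual scene 202 (death 1)
    GameEvent(516, {}),            # data of the virtual scene 302
    GameEvent(518, {}, "death"),   # data of the virtual scene 402 (death x)
])
deaths = data.series_ending_in("death")
# two series: [512, 514] and [516, 518]
```

Each call to `series_ending_in` groups the recorded events into the runs that lead up to a terminal event, which is the grouping the coaching program needs later when selecting a window of events before a death.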
During execution of the game program 528, the game recorder 502 records the game event data 510. For example, a processor of the game recorder 502 stores or writes the game event data 510 to one or more memory devices of the game recorder 502. Examples of a memory device include a read-only memory, a random access memory, and a combination thereof. To illustrate, the memory device is a flash memory device or a redundant array of independent disks (RAID).
During or after execution of the game program 528, the game event data 510 that is recorded is provided from the game recorder 502 to the coaching processor 508. For example, the coaching processor 508 periodically requests the game event data 510 from the game recorder 502. In response to reception of a request from the coaching processor 508, the game recorder 502 accesses, such as reads, the game event data 510 from the one or more memory devices of the game recorder 502 and sends the game event data 510 to the coaching processor 508. As another example, the game recorder 502 periodically sends the game event data 510 to the coaching processor 508. To illustrate, without receiving any request from the coaching processor 508, the game recorder 502 accesses the game event data 510 from the one or more memory devices of the game recorder 502 and sends the game event data 510 to the coaching processor 508.
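The two delivery modes in this paragraph, a periodic pull by the coaching processor 508 and an unsolicited push by the game recorder 502, can be sketched as follows. This is a hypothetical illustration; the class and method names are not from the patent:

```python
class GameRecorder:
    """Stands in for the game recorder 502 and its memory devices."""
    def __init__(self):
        self._memory = []

    def record(self, event):
        # store/write the game event data to the recorder's memory
        self._memory.append(event)

    def read_events(self):
        # access (read) the recorded game event data
        return list(self._memory)

class CoachingProcessor:
    """Stands in for the coaching processor 508."""
    def __init__(self):
        self.received = []

    def poll(self, recorder):
        # pull mode: the coaching processor sends a request; the recorder
        # reads its memory and returns the game event data in response
        self.received.extend(recorder.read_events())

recorder = GameRecorder()
recorder.record({"event_id": 512})
recorder.record({"event_id": 514})

coach = CoachingProcessor()
coach.poll(recorder)           # periodic pull by the coaching processor

# push mode: the recorder reads its memory and sends the data
# without receiving any request from the coaching processor
pushed = recorder.read_events()
```

In both modes the recorder is the only component that touches its memory devices; the difference is simply which side initiates the transfer.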
During or after execution of the game program 528, a coaching program 530 is executed by the coaching processor 508. For example, the coaching program 530 is executed after the display of the virtual scene 202 (FIG. 2) and before the display of the virtual scene 302 (FIG. 3) or after the display of the virtual scene 402 (FIG. 4).
It should be noted that the coaching program 530 performs the functions described herein as being performed by the coaching program 530 when the coaching program 530 is executed by the processor system 505. It should further be noted that, in an embodiment, instead of the game recorder 502, multiple game recorders are used. Similarly, in one embodiment, instead of the game processor 506, multiple game processors are used. In one embodiment, instead of the coaching processor 508, multiple coaching processors are used.
In one embodiment, the game recorder 502, the game processor 506, and the coaching processor 508 are coupled with each other via a bus.
In an embodiment, the coaching program 530 is a portion of the game program 528 and is integrated into the game program 528. For example, a computer program code of the coaching program 530 is interspersed with a computer program code of the game program 528.
In one embodiment, a game event is sometimes referred to herein as a gameplay event.
In an embodiment, each time a death of the virtual user 102 occurs, the coaching program 530 identifies, from the game event data 510, that the death has occurred. For example, the coaching program 530 determines that the virtual scene 202 includes the virtual drone 204 above the virtual user 102 to determine that the virtual user 102 has died. The coaching program 530 tags, such as assigns a keyword, such as death or demise, to the virtual scene 202 in which the virtual user 102 dies, and stores the keyword as metadata within one or more memory devices of the processor system 505. Each time the virtual user 102 dies, the coaching program 530 determines that a skill level of the user 1 is below a preset threshold, which is stored in the one or more memory devices of the processor system 505. For example, each time the virtual user 102 dies in the game, the game program 528 reduces a skill level corresponding to the user ID 1 to below the preset threshold.
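The tagging logic of this embodiment, inferring a death from a drone positioned above the virtual user, attaching a keyword as metadata, and marking the skill level below the preset threshold, might be sketched as follows. All names and the threshold value are hypothetical assumptions for illustration:

```python
PRESET_SKILL_THRESHOLD = 50  # hypothetical value; the patent only says a stored preset threshold

def detect_death(scene):
    # Infer a death when the scene contains a virtual drone positioned
    # above the virtual user (e.g., the drone 204 in the virtual scene 202).
    return any(obj.get("type") == "drone" and obj.get("above_user")
               for obj in scene.get("objects", []))

def tag_scene(scene, metadata_store, skills, user_id):
    if detect_death(scene):
        # assign a keyword, such as "death", and store it as metadata
        metadata_store[scene["scene_id"]] = "death"
        # reduce the user's skill level to fall below the preset threshold
        skills[user_id] = min(skills.get(user_id, PRESET_SKILL_THRESHOLD),
                              PRESET_SKILL_THRESHOLD - 1)

meta, skills = {}, {1: 80}
tag_scene({"scene_id": 202,
           "objects": [{"type": "drone", "above_user": True}]},
          meta, skills, user_id=1)
# meta now tags scene 202 with "death"; skills[1] falls below the threshold
```

The keyword stored in `metadata_store` is what later lets the coaching program enumerate death events (e.g., the game events 514 and 518) without re-scanning the full recording.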
FIG. 5B is an embodiment of the game recorder 502 to illustrate recording of game event data 504 for user IDs 2 through N, where N is a positive integer. The user IDs 2 through N are assigned to other users 2 through N that are playing the game. For example, the user ID 2 is assigned to a user 2 and the user ID N is assigned to a user N.
The game event data 504 includes a series of game events 1 through q for the user ID 2 and another series of game events 1 through r recorded for the user ID N, where q and r are positive integers. For example, a game event 550 includes data for a virtual scene that is accessed upon execution of the game program 528 (FIG. 5A). A game event q includes a death q of a virtual user that is controlled by the user 2. The game program 528 is accessed after the user ID 2 is authenticated by the processor system 505 (FIG. 5A). As another example, a game event 552 includes data for a virtual scene that is accessed upon execution of the game program 528. The game event 552 includes a death 1 of a virtual user that is controlled by the user N via a game controller N. As another example, a game event 554 includes data for a virtual scene that is accessed upon execution of the game program 528 and a game event 556 includes data for a virtual scene that is accessed upon execution of the game program 528. The game event 556 includes a death r of the virtual user that is controlled by the user N via the game controller N. During or after execution of the game program 528, the game event data 504 that is recorded is provided from the game recorder 502 to the coaching processor 508 (FIG. 5A) in the same manner in which the game event data 510 (FIG. 5A) is provided from the game recorder 502 to the coaching processor 508.
FIG. 6 is a diagram of an embodiment of a system 600 to illustrate a use of the coaching program 530 to generate a coaching session 608 for training the user 1. The system 600 includes the game event data 510 for the user ID 1. Also, the system 600 includes the coaching program 530, the game event data 504 for the user IDs 2-N, a client device 602, selected game event data 604, and metadata 606. Examples of a client device, as used herein, include a desktop computer, a laptop computer, a tablet, a smart phone, a combination of a hand-held remote controller and a smart television, a combination of a hand-held controller (HHC) and a game console, and a combination of a hand-held controller and a head-mounted display (HMD). A game controller is an example of the HHC. The display device 1 is an example of a display device of the client device 602.
The game event data 510 for the user ID 1 includes video data 612 and audio data 614. As an example, video data of game event data includes a position, an orientation, and the graphics parameters of multiple virtual objects displayed in a virtual scene. Also, as an example, audio data of game event data includes phonemes, vocabulary, lyrics, or phrases that are uttered by one or more virtual objects in a virtual scene. Similarly, the game event data 504 for the user IDs 2-N includes video data and audio data for each of the user IDs 2-N.
Examples of the metadata 606 include game telemetry, such as one or more game states for generating virtual objects in the game. For example, the metadata 606 includes a game state based on which a virtual object, which is not displayed or output or uttered in a virtual scene, is displayed in a coaching scene. As another example, the metadata 606 is information associated with positions and orientations of one or more virtual objects in a virtual scene, such as the virtual scene 110 or 202 or 302 or 402 (FIGS. 1-4), that is displayed during execution of the game program 528. To illustrate, the metadata 606 is a position at which a virtual rectangular frame is displayed around a position of the virtual bullet 106 that is displayed in the virtual scene 110. The virtual rectangular frame is not displayed in the virtual scene 110. As another example, the metadata 606 is a position and orientation of a virtual user that is not displayed in the virtual scene 110 or 202 or 302 or 402 (FIGS. 1-4). The virtual user, not displayed, shoots the virtual bullet 106 (FIG. 1) or 306 (FIG. 3). The coaching program 530 generates the virtual user that shot at the virtual user 102 at a position next to, e.g., to the right of, a position of the virtual bullet 106 to show the virtual user as shooting the virtual bullet 106 towards the virtual user 102. As yet another example, the metadata 606 is a username of the virtual user that is not displayed in the virtual scene 110 or 202 or 302 or 402 and that shoots a virtual bullet. As another example, the metadata 606 is not used to generate a camera view from a viewpoint of the virtual user that is not displayed in the virtual scene 110 or 202 or 302 or 402. For example, the metadata 606 is not used to generate an action replay of the virtual scene 110 or 202 or 302 or 402 from the viewpoint of the virtual user that is not displayed in the virtual scene 110 or 202 or 302 or 402.
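Two of the metadata uses above, a rectangular frame drawn around a recorded bullet position and a hidden shooter rendered next to the bullet, can be illustrated with a small sketch. The coordinate layout, offsets, and function names are assumptions for illustration, not taken from the patent:

```python
def bullet_frame(bullet_pos, half_w=8, half_h=8):
    """Build a virtual rectangular frame around the bullet's recorded
    position; the frame was never displayed in the original scene."""
    x, y = bullet_pos
    return {"kind": "frame",
            "rect": (x - half_w, y - half_h, x + half_w, y + half_h)}

def shooter_marker(bullet_pos, offset=24):
    """Place the previously hidden shooting virtual user next to the
    bullet, e.g., to its right, to show where the shot came from."""
    x, y = bullet_pos
    return {"kind": "shooter", "pos": (x + offset, y)}

# Hypothetical recorded position of the virtual bullet 106 from telemetry:
bullet = (100, 40)
overlays = [bullet_frame(bullet), shooter_marker(bullet)]
```

Both overlay items are derived purely from recorded game-state metadata, which is why they can appear in a coaching scene even though they never appeared in the original virtual scene 110.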
Examples of the coaching session 608 include a display of a coaching scene in which the user 1 cannot play the game but can learn based on his/her previous play of the game, or previous play of the game by the users 2-N, or a combination thereof. For example, the coaching program 530 is executed by the processor system 505 (FIG. 5A) to generate the coaching session 608 for the user ID 1. An example of the coaching scene is a virtual scene that is generated and rendered upon execution of the coaching program 530. When the coaching session 608 is generated, one or more coaching scenes are displayed on the client device 602. The coaching scenes, when displayed for a user ID, do not allow a play of the game via the user ID. For example, during a display of the coaching scenes for the user ID 1, the game program 528 (FIG. 5A) is not executed by the processor system 505 for the user ID 1. To illustrate, for the user ID 1, the processor system 505 does not render a virtual scene that is generated by executing the game program 528 during a display of one or more coaching scenes for the user ID 1. Instead, the processor system 505 executes the coaching program 530 for rendering the one or more coaching scenes for the user ID 1.
In an operation 610, the coaching program 530 receives an input signal 616 from the client device 602 to initiate the coaching session 608. For example, during a play of the game, the user 1 selects a button on the client device 602 to generate the input signal 616 for initiating the coaching session 608. To illustrate, after the virtual user 102 dies in the virtual scene 202 or 402 (FIG. 2 or 4), the user 1 selects a button of the game controller 1 (FIG. 1). Upon receiving the selection, the game controller 1 generates the input signal 616 for initiating the coaching session 608. The input signal 616 is sent from the client device 602 to the coaching program 530.
The coaching program 530 receives the input signal 616 and identifies from the input signal 616 that the coaching session 608 is to be initiated. Upon identifying that the coaching session 608 is to be initiated, the coaching program 530 generates an output signal 618 including a request 620 for one or more user IDs 622 for which the coaching program 530 is to be initiated. For example, the output signal 618 includes a query to the client device 602 for obtaining information regarding whether the coaching program 530 is to be initiated based on gameplay by the user ID 1 or the user ID 2 or the user ID N or a combination of two or more of the user IDs 1, 2, and N. To illustrate, the output signal 618 includes an inquiry for obtaining the user ID 1 or the user ID 2 or the user ID N or a combination of two or more thereof that is accessed for gameplay of the game.
The client device 602 receives the request 620 for the one or more user IDs 622 from the coaching program 530. For example, the client device 602 displays the request 620 for the one or more user IDs 622 on a display screen of the client device 602. To illustrate, the client device 602 displays a list of the user IDs 1-N on the display screen of the client device 602. As another example, the client device 602 outputs a sound via a headphone worn by the user 1 asking the user 1 for the one or more user IDs 622. The headphone is a part of the client device 602 or is coupled to the client device 602. As yet another example, the client device 602 outputs a sound via one or more speakers of the client device 602 asking the user 1 for the one or more user IDs 622.
The client device 602 receives the one or more user IDs 622 from the user 1 and generates an input signal 623 that includes the one or more user IDs 622. For example, the user 1 selects one or more buttons on the game controller 1 to provide the one or more user IDs 622. To illustrate, the user 1 selects one or more buttons on the game controller 1 to select one or more of the user IDs 1-N displayed on the display screen of the client device 602. The client device 602 sends the input signal 623 to the coaching program 530.
Upon receiving the one or more user IDs 622 within the input signal 623, the coaching program 530 generates an output signal 624 including a request 626 for identifying a game sequence type for which the coaching session 608 is to be generated for the one or more user IDs 622. For example, the request 626 is an inquiry regarding whether the game sequence type is of a type 1 or a type 2. An example of the type 1 of a game sequence includes multiple game events that lead to a death of a virtual user and an example of the type 2 of a game sequence includes multiple game events that lead to a decrease in a health level of the virtual user. To illustrate, the type 1 includes the game events 512 through 514 and the game events 516 and 518, and the type 2 includes the game events 520 through 522 and 524 through 526 (FIG. 5A).
The client device 602 receives the output signal 624 including the request 626 for identifying the game sequence type and outputs the request to the user 1 via a user interface or another type of interface of the client device 602. For example, the client device 602 displays or renders the request 626 via a display screen of the client device 602. To illustrate, the client device 602 displays a list of game sequence types including a death of the virtual user 102 (FIGS. 1-4) and a reduction in a health level of the virtual user 102 below the predetermined threshold. As another example, the client device 602 outputs a sound via one or more speakers of the client device 602 and the sound includes the request 626 for the game sequence type. As yet another example, the client device 602 outputs a sound via the headphone to the user 1 for obtaining a response to the request 626.
In response to the request 626, the user 1 provides a game sequence type identifier 628 via a user interface or another type of interface of the client device 602. For example, the user 1 selects one or more buttons of the game controller 1 to provide the game sequence type identifier 628. To illustrate, the user 1 selects one or more buttons to select either the game sequence type of death of the virtual user 102 or the reduction in the health level of the virtual user 102. Upon receiving the selection of the game sequence type identifier 628, the client device 602 generates an input signal 630 including the game sequence type identifier 628 and sends the input signal 630 to the coaching program 530.
The coaching program 530 receives the input signal 630 and identifies the game sequence type identifier 628 from the input signal 630. Upon identifying the game sequence type identifier 628, the coaching program 530 generates an output signal 634 including a request 632 for identifying a game event of the sequence type identified by the game sequence type identifier 628. For example, to determine whether the coaching session 608 is to be generated for the game event 514 or 518 (FIG. 5A) of death of the virtual user 102, the coaching program 530 generates the request 632 for identifying the game event 514 or 518. The coaching program 530 sends the output signal 634 to the client device 602.
Upon receiving the output signal 634, the client device 602 identifies the request 632 from the output signal 634 and provides the request 632 via a user interface or another interface of the client device 602 to the user 1. For example, the client device 602 displays a list of the game events 514 and 518 on the display screen of the client device 602. To illustrate, the client device 602 displays the virtual scene 202 and the virtual scene 402 on the display screen of the client device 602 to allow the user 1 to select one of the virtual scenes 202 and 402.
The user 1 responds to the request 632 to identify the game event for which the coaching session 608 is to be generated. The game event is identified to provide a game event identifier 636. For example, the user 1 selects, via one or more buttons of the game controller 1, the virtual scene 202 to identify the game event 514 or the virtual scene 402 to identify the game event 518. In response to receiving the selection of the virtual scene 202 or 402, the client device 602 generates an input signal 638 having the game event identifier 636, and sends the input signal 638 to the coaching program 530.
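The exchange from the operation 610 through the input signal 638 amounts to a three-step selection dialog: one or more user IDs, then a game sequence type, then a specific game event. A hypothetical sketch of that flow; the function, enum, and prompt strings are illustrative assumptions only:

```python
from enum import Enum

class SequenceType(Enum):
    DEATH = 1             # type 1: game events leading to a death
    HEALTH_DECREASE = 2   # type 2: game events leading to a health drop

def run_selection_dialog(answer):
    """Drive the three requests; `answer` maps each request to the
    client's reply, standing in for the input signals 623, 630, and 638."""
    user_ids = answer("which user IDs?")        # request 620 -> signal 623
    seq_type = answer("which sequence type?")   # request 626 -> signal 630
    event_id = answer("which game event?")      # request 632 -> signal 638
    return {"user_ids": user_ids,
            "sequence_type": seq_type,
            "event_id": event_id}

# Simulated replies from the client device (e.g., the user 1 picking
# the virtual scene 202, which identifies the game event 514):
choice = run_selection_dialog(
    {"which user IDs?": [1],
     "which sequence type?": SequenceType.DEATH,
     "which game event?": 514}.get)
```

The returned dictionary carries everything the coaching program needs before selecting the game event data for the session.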
Upon receiving the input signal 638, the coaching program 530 identifies the game event identifier 636 and selects the game event data 604 for which the coaching session 608 is to be generated for the one or more user IDs 622. For example, the coaching program 530 identifies the game event identifier 636 identifying the game event 514 (FIG. 5A) recorded based on the virtual scene 202 (FIG. 2) that is displayed when the game program 528 is executed for the user ID 1. An example of the selected game event data 604 includes the game event 514 or the game event 518 (FIG. 5A).
Upon identifying the game event identifier636for which the coaching session608is to be generated, the coaching program530is executed by the processor system505to store a portion of the game event data510that is recorded within a predetermined time interval from a time of recording of the game event identified by the game event identifier636to output the selected game event data604. For example, upon identifying the game event514for which the coaching session608is to be generated, the coaching program530stores one or more game events that are recorded within a predetermined time interval before a time at which the game event514is recorded. The coaching program530also stores the game event514. The one or more game events lead up to or result in an occurrence of the game event514. To illustrate, the coaching program530deletes from one or more memory devices of the processor system505one or more game events that are recorded outside the predetermined time interval before the time at which the game event514is recorded. The game event514and the game events within the predetermined time interval are stored by the coaching program530as the selected game event data604. Also, in this illustration, the processor system505retains within the one or more memory devices of the processor system505one or more game events that are recorded within the predetermined time interval before the time at which the game event514is recorded to retain the selected game event data604. The one or more game events that are deleted lead to the one or more game events that are retained, and the one or more game events that are retained lead to the game event514. The one or more game events that are retained are the selected game event data604.
To explain further, during a gaming session of the game, a series of virtual scenes are generated and recorded upon execution of the game program528. The series includes a first series, a second series, and a third series. The first series includes a first set of consecutive virtual scenes in which the virtual user102collects items, such as virtual weapons and virtual objects, such as bricks or metal bars, to defend the virtual user102from virtual bullets. The second series includes a second set of consecutive virtual scenes in which the virtual user102fights with other virtual users. The third series includes a third set of consecutive virtual scenes, such as the virtual scenes110and202(FIGS. 1 and 2), for which the game events512and514(FIG. 5A) are recorded. The second series is consecutive to the first series and the third series is consecutive to the second series. The processor system505deletes the first and second series of virtual scenes and retains the third series of virtual scenes. The third series is the selected game event data604.
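The retention step described above, keeping only the events recorded within the predetermined time interval before the tagged event and discarding earlier ones, can be sketched as follows. This is a minimal illustration under assumed data shapes; the names `GameEvent` and `select_event_window` are hypothetical and are not elements of the disclosed system:

```python
from dataclasses import dataclass

@dataclass
class GameEvent:
    name: str
    timestamp: float  # seconds since the start of the gaming session

def select_event_window(events, target, window_before):
    """Retain the target event and any events recorded within
    `window_before` seconds before it; earlier events are dropped,
    mirroring the deletion of the first and second series of scenes."""
    start = target.timestamp - window_before
    return [e for e in events if start <= e.timestamp <= target.timestamp]

# Hypothetical session: item collection, a fight, then the death event.
session = [
    GameEvent("collect_items", 10.0),
    GameEvent("fight", 55.0),
    GameEvent("death", 60.0),
]
selected = select_event_window(session, session[2], window_before=10.0)
```

Here the item-collection event falls outside the window and is dropped, while the fight and the death itself are retained as the selected game event data.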
As another illustration, the coaching program530stores the selected game event data604, which includes a portion of the game event data510, within one or more memory devices of the processor system505. The one or more memory devices of the processor system505in which the selected game event data604is stored are different from, e.g., not the same as, one or more memory devices in which the game event data510is recorded. In one embodiment, instead of being stored in different one or more memory devices, the selected game event data604is stored in a memory device at different memory addresses than memory addresses at which the game event data510is recorded in the memory device.
As yet another example, upon identifying the game event514for which the coaching session608is to be generated, the coaching program530stores one or more game events that are recorded within a predetermined time interval before a time at which the game event514is recorded, one or more game events recorded within a preset time interval after the time at which the game event514is recorded, and the game event514. The one or more game events that are recorded within the predetermined time interval before the time at which the game event514is recorded, the game event514, and the one or more game events recorded within the preset time interval after the time at which the game event514is recorded are the selected game event data604. For example, upon identifying the game event514for which the coaching session608is to be generated, the coaching program530stores one or more game events that are recorded within the preset time interval after the time at which the game event514is recorded. The game event514is followed by the one or more game events recorded within the preset time interval after the time at which the game event514is recorded. To illustrate, the coaching program530deletes from one or more memory devices of the processor system505one or more game events that are recorded outside the preset time interval after the time at which the game event514is recorded. The one or more game events that occur within the preset time interval after the time at which the game event514is recorded lead to the one or more game events that are deleted. Also, in this illustration, the processor system505retains within the one or more memory devices of the processor system505one or more game events that are recorded within the preset time interval after the time at which the game event514is recorded.
To explain further, during a gaming session of the game, a series of virtual scenes are generated upon execution of the game program528. The series includes a first series, a second series, and a third series. The first series includes a first set of consecutive virtual scenes in which the virtual user102collects items, such as virtual weapons and virtual objects, such as bricks or metal bars, to defend the virtual user102from virtual bullets after the virtual user102dies in the game event514as indicated in the virtual scene202. The second series includes a second set of consecutive virtual scenes in which the virtual user102dances with other virtual users. The third series includes a third set of consecutive virtual scenes in which the virtual user102continues to dance with the other virtual users. The second series is consecutive to the first series and the third series is consecutive to the second series. The processor system505deletes the second and third series of virtual scenes and retains the first series of virtual scenes. The first series is the selected game event data604.
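Taking the two variants together, the retained portion spans a window from the predetermined interval before the tagged event to the preset interval after it, and everything outside that window is deleted. A minimal sketch of this partition, with the hypothetical helper name `partition_events`:

```python
def partition_events(timestamps, event_time, before, after):
    """Split recorded event timestamps into those retained for the
    coaching session (within [event_time - before, event_time + after])
    and those deleted from the memory devices."""
    retained, deleted = [], []
    for t in timestamps:
        if event_time - before <= t <= event_time + after:
            retained.append(t)
        else:
            deleted.append(t)
    return retained, deleted

# Death recorded at t=60; keep 5 time units before and 5 after.
retained, deleted = partition_events([50, 58, 60, 63, 90], 60, 5, 5)
```

Events at t=50 and t=90 fall outside the window and are deleted, while the events at t=58, t=60, and t=63 form the selected game event data.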
Based on the selected game event data604, the coaching program530processes the metadata606to generate and render virtual objects, such as overlays, that can increase game skills of the user1by enabling the user1to see one or more reasons for occurrence of the game event identified by the game event identifier636. For example, the coaching program530processes the metadata606to identify a position and orientation of a virtual user that shot the virtual user102and determines to include an overlay of the virtual user that shot the virtual user102. Once the overlay is displayed, the user1can see a reason for the death of the virtual user102. The reason for the death is the virtual user that shot the virtual user102. The overlay is an example of a virtual object. Also, the coaching program530generates a virtual frame surrounding the virtual user to highlight the virtual user. As another example, the coaching program530identifies, from the metadata606, a position of the virtual bullet106and determines that the virtual user102would not have been killed by the virtual bullet106if the virtual user102had jumped or been protected by a virtual wall. The coaching program530generates a coaching comment for displaying to the user1to control the virtual user102to jump while shooting or to build a wall around the virtual user102. The coaching comment enables the user1to protect the virtual user102from being hit by the virtual bullet106. The user1can review the coaching comment. When faced with a similar situation in which the virtual user102is about to be killed by another virtual bullet, if the user1follows the coaching comment, the virtual user102can be protected from the other virtual bullet.
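The choice of coaching comment can be reduced to a timing check on the metadata: wall-building is only suggested while there is still time to build one before the bullet arrives. A minimal sketch under assumed units; the distance, speed, and build-time parameters are hypothetical metadata fields, not named in the disclosure:

```python
def coaching_hint(bullet_distance, bullet_speed, wall_build_time):
    """Return a coaching comment based on the bullet's time to impact:
    offer the wall-building option only while a wall can still be built
    in time; otherwise only jumping remains useful."""
    time_to_impact = bullet_distance / bullet_speed
    if time_to_impact >= wall_build_time:
        return "Jump Now! OR Start building a wall now!"
    return "Jump Now!"

early_hint = coaching_hint(bullet_distance=100.0, bullet_speed=20.0,
                           wall_build_time=2.0)  # 5.0 s to impact
late_hint = coaching_hint(bullet_distance=10.0, bullet_speed=20.0,
                          wall_build_time=2.0)   # 0.5 s to impact
```

This mirrors the two comments shown later in the coaching scenes, where the jump-only comment is used once it is too late to build a wall.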
The coaching program530generates overlays of virtual objects according to the metadata606for overlaying on the selected game event data604to generate the coaching session608. For example, the coaching program530overlays a virtual frame around a virtual object in a virtual scene stored as the selected event data604. As another example, the coaching program530overlays a virtual object in the virtual scene202. The virtual object overlaid in the virtual scene202shot the virtual bullet106(FIG. 1). As yet another example, the coaching program530adds, as an overlay, a username or a user ID of the virtual object that shot the bullet106.
The coaching program530provides the coaching session608to the client device602. For example, the processor system505executes the coaching program530to generate one or more coaching scenes in which one or more virtual objects generated based on the metadata606are overlaid on the selected game event data604. To illustrate, the one or more coaching scenes include an overlay of one or more virtual objects on the virtual scene202. The one or more coaching scenes are displayed on the display screen of the client device602.
The coaching session608continues until an input640is received from the client device602. For example, the input640is generated when the user1selects one or more buttons on the game controller1to end the coaching session608. The input640is sent within an input signal642to the coaching program530. The coaching program530identifies, from the input signal642, the input640to end the coaching session608and ends the coaching session608. For example, the coaching program530ends the coaching session608for the user ID1in response to the reception of the input640. To illustrate, execution of the game program528for the user ID1continues after the coaching session608ends for the user ID1.
In one embodiment, the coaching program530does not request that the client device602identify a game sequence type and a game event of the game sequence type. For example, the coaching program530does not send the signals624and634to the client device602. There is no need for the user1to identify a game sequence type and a game event of the game sequence type. Rather, the coaching program530determines that the input signal616for initiating the coaching session608is received immediately after the game event514or518of death of the virtual user102. For example, the coaching program530determines that the input signal616is received at an end of occurrence of the game event514or518and before an occurrence of a game event consecutive to the game event514or518. Upon determining so, the coaching program530determines that the coaching session608is to be generated for the game event514or518.
In one embodiment, the coaching session608occurs during a play of the game. For example, both the coaching program530and the game program528are executed simultaneously by the processor system505. Before the user1plays the game, an input signal indicating a selection for simultaneous execution of the coaching program530with the game program528is received from the game controller1of the client device602by the processor system505. Upon receiving the input signal, the processor system505executes the coaching program530. When the coaching program530is executed, one or more virtual objects are generated based on the metadata606and rendered by the coaching program530of the processor system505for display along with virtual objects of virtual scenes that are generated by execution of the game program528. For example, one or more virtual objects generated based on the metadata606are displayed in the virtual scenes110,202,302, and402during the coaching session608.
In one embodiment, a coaching scene is referred to herein as a self-coaching interface, such as a self-coaching user interface.
FIG. 7Ais a diagram of an embodiment of a coaching scene700for illustrating use of one or more virtual objects with the virtual scene110(FIG. 1). The one or more virtual objects are generated based on the metadata606(FIG. 6). The coaching scene700includes the virtual user102, the virtual gun108, the virtual bullet106, and the virtual ramp112. The coaching scene700is generated upon execution of the coaching program530(FIG. 6), and is displayed on the display device1.
The coaching scene700includes a virtual comment702, a virtual frame704, another virtual frame706, a gun ID708, a virtual frame710, a bullet ID712, a gun ID714, a virtual frame716, a virtual gun718, a virtual user720, a virtual frame722, a virtual user ID724, a virtual game controller726, and a virtual button ID728. The coaching scene700excludes the virtual tree105and the virtual wall107so as to not clutter the coaching scene700with virtual objects unnecessary for coaching the user1. The virtual comment702, the virtual frame704, the virtual frame706, the gun ID708, the virtual frame710, the bullet ID712, the gun ID714, the virtual frame716, the virtual gun718, the virtual user720, the virtual frame722, the virtual user ID724, the virtual game controller726, and the virtual button ID728are examples of one or more virtual objects, such as overlays, generated based on the metadata606. It should be noted that the virtual user720is not in the virtual scene110(FIG. 1).
The virtual comment702is a coaching comment, “Jump Now! OR Start building a wall now!” for the user1to avoid being hit by the virtual bullet106. The virtual frame704extends around the virtual user102to highlight the virtual user102. Similarly, the virtual frame706extends around the virtual gun108. The gun ID708identifies a type of the virtual gun108to highlight the virtual gun108. For example, the gun ID708identifies whether the virtual gun108is a shotgun or a pistol or a semiautomatic gun or a double-barrel gun or an AK-47. The virtual frame710extends around the virtual bullet106to highlight the virtual bullet106. The bullet ID712identifies a type of the virtual bullet106. Examples of types of a virtual bullet include a lead round nose bullet, a semi-jacketed bullet, and a full metal jacket bullet.
The virtual frame716extends around the virtual gun718that is held by the virtual user720. The gun ID714identifies a type of the virtual gun718. The virtual frame722extends around the virtual user720to highlight the virtual user720. The virtual user ID724is a user ID that is assigned to another user who controls the virtual user720that shot the virtual user102with the virtual bullet106. The user ID724is assigned by the processor system505(FIG. 5A). The virtual game controller726is an image of the game controller1that is used or held by the user1playing the game and controlling the virtual user102. The virtual button ID728is a button that is selected by the user1during an occurrence of the virtual scene110(FIG. 1) leading to a death of the virtual user102. By providing the metadata606in the virtual scene700, the coaching program530facilitates self-coaching of the user1to increase the skill level of the user1.
The coaching scene700is displayed along or simultaneously with a timeline scrubber730, which is a time bar or a time scale or a time axis or a timeline. The coaching program530generates and renders the timeline scrubber730, which is a portion of the coaching session608. The user1uses the game controller1to select one of many segments, such as segments732,734, and736, of the timeline scrubber730to access the coaching scene700from the processor system505and view the coaching scene700. For example, when the segment732is selected, the coaching scene700is displayed on the display device1.
The timeline scrubber730includes segments for displaying the selected game event data604on the display screen1. As an example, the timeline scrubber730includes segments and each segment can be selected by the user1via the game controller1to generate an input signal. When the input signals are received, a playback of the virtual scenes leading up to a game event, a virtual scene of the game event, and virtual scenes occurring after the game event are played back by the coaching program530with overlay content superimposed on one or more of the virtual scenes. As another example, the timeline scrubber730includes a set738of segments that can be selected by the user1via the game controller1to enable a display of coaching scenes in which one or more virtual objects generated based on the metadata606are overlaid on portions of virtual scenes that are recorded during the predetermined time period before the time at which a virtual scene, such as the virtual scene202, of death of the virtual user102is recorded. The predetermined time period extends from −P time units to 0 time units, where time units can be seconds or minutes. Moreover, the timeline scrubber730includes a segment736that can be selected by the user1via the game controller1to view a coaching scene in which one or more virtual objects generated based on the metadata606are overlaid on a portion of a virtual scene, such as the virtual scene202, that is recorded at a time of death of the virtual user102. When the segment736of the timeline scrubber730is selected by the user1via the game controller1, the coaching scene including one or more virtual objects generated based on the metadata606are overlaid on one or more of the virtual objects in the virtual scene202in which the virtual user102died is rendered by the coaching program530and displayed on the display device1. The death of the virtual user102is recorded at 0 units. 
The timeline scrubber730includes a set740of segments that can be selected by the user1via the game controller1to enable a display of coaching scenes in which one or more virtual objects generated based on the metadata606are overlaid on portions of virtual scenes that are recorded during the preset time period after the time at which a virtual scene, such as the virtual scene202, of death of the virtual user102is recorded. The preset time period extends from 0 time units to M time units, where time units can be seconds or minutes. There is a window of time between the time units −P and M.
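The scrubber's mapping from a segment to a playback time within the window can be sketched as follows. This is a minimal illustration that assumes the segments divide the window [−P, M] evenly, which the disclosure does not require; the function name `segment_time` is hypothetical:

```python
def segment_time(index, num_segments, p, m):
    """Map a timeline-scrubber segment index (0 .. num_segments - 1) to
    a playback time in the window [-p, m], where the tagged game event
    (e.g., the death of the virtual user) is recorded at time 0."""
    span = p + m
    return -p + span * index / (num_segments - 1)

# Eleven segments over a window of 5 time units before and 5 after.
first = segment_time(0, 11, 5, 5)    # start of the window, before the event
middle = segment_time(5, 11, 5, 5)   # the tagged event itself, at time 0
last = segment_time(10, 11, 5, 5)    # end of the window, after the event
```

Selecting a segment in the first half of the scrubber thus plays back scenes leading up to the event, the middle segment plays back the event itself, and later segments play back scenes recorded after it.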
In one embodiment, the coaching scene700excludes one or more of the virtual frame704, the virtual frame706, and the gun ID708. For example, the coaching program530determines that one or more of the virtual frame704, the virtual frame706, and the gun ID708is not needed to increase the skill level of the user1, and therefore does not generate one or more of the virtual frame704, the virtual frame706, and the gun ID708.
In an embodiment, instead of extending a frame around a virtual object in a coaching scene, the coaching program530highlights the virtual object by rendering the virtual object in a substantially different color or intensity or shade compared to colors or intensities or shades of other virtual objects in the coaching scene. For example, instead of extending the frame710around the virtual bullet106and the frame722around the virtual user720, the coaching program530renders the virtual bullet106and the virtual user720to have a substantially greater intensity than intensities of the virtual user102and the virtual gun108. Again, highlighting the virtual bullet106and the virtual user720in comparison with the virtual user102and the virtual gun108facilitates self-coaching of the user1.
In an embodiment, the coaching scene700includes the virtual tree105and the virtual wall107.
In one embodiment, the coaching program530changes a time period between the time units −P and M based on a number of game events that lead up to an end game event, such as death or decrease in health level. For example, the coaching program530determines that a first length of time for which the virtual user102engages in a first battle sequence that leads to a first death of the virtual user102is greater than a second length of time for which the virtual user102engages in a second battle sequence that leads to a second death of the virtual user102. The first battle sequence is fought with one or more virtual users and the second battle sequence is fought with one or more virtual users. The coaching program530stores the selected game event data604that leads up to the first death for the first length of time and stores the selected game event data604that leads up to the second death for the second length of time. In this manner, the entire first and second battle sequences are captured and stored by the coaching program530.
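The variable-length capture described above can be sketched as a backward scan from the end game event to the start of the contiguous sequence that produced it. This is a minimal illustration under assumed event labels; the name `capture_window` and the `(timestamp, label)` representation are hypothetical:

```python
def capture_window(events, death_time):
    """Walk backward from the death event and extend the capture window
    to the start of the contiguous battle sequence that led to it, so
    the entire sequence is stored with the selected game event data.
    `events` is a chronological list of (timestamp, label) pairs."""
    start = death_time
    for t, name in reversed(events):
        if t > death_time:
            continue  # events after the death are outside this window
        if name not in ("battle", "death"):
            break  # reached an event before the battle sequence began
        start = t
    return death_time - start

# A long battle (20 time units) versus a short one (5) before each death.
long_fight = [(0.0, "collect"), (10.0, "battle"), (20.0, "battle"), (30.0, "death")]
short_fight = [(0.0, "collect"), (25.0, "battle"), (30.0, "death")]
first_window = capture_window(long_fight, 30.0)
second_window = capture_window(short_fight, 30.0)
```

The first window comes out longer than the second, matching the example in which the longer battle sequence is captured in full.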
FIG. 7Bis a diagram of an embodiment of a coaching scene750that is rendered and displayed on the display device1. The coaching scene750is rendered by the coaching program530in response to an input signal indicating a selection of a segment752on the timeline scrubber730. The segment752is selected by the user1using one or more buttons on the game controller1.
The segment752is consecutive to the segment734and the segment734is consecutive to the segment732. For example, the segment752includes one or more virtual objects from a virtual scene that is recorded between a time at which the virtual scene110(FIG. 1) is recorded and a time at which the virtual scene202(FIG. 2) is recorded. Each virtual scene corresponding to a segment of the timeline scrubber730is recorded after being displayed on the display screen1.
The coaching scene750includes a virtual comment754, a virtual frame756, a virtual bullet758, and a bullet ID760. The virtual comment754is generated by the coaching program530(FIG. 6) to coach the user1to protect the virtual user102from being shot by the virtual bullet758. For example, the virtual comment754includes “Jump Now!”. The virtual frame756extends around or surrounds the virtual bullet758that is shot by the virtual user720after the virtual bullet106(FIG. 7A) is shot. The virtual frame756highlights the virtual bullet758. The bullet ID760identifies a type of the virtual bullet758. The coaching scene750excludes the virtual tree105and the virtual wall107so as to not clutter the coaching scene750with virtual objects unnecessary for coaching the user1.
It should be noted that in the coaching scene750, a progression of virtual objects compared to the virtual objects in the coaching scene700ofFIG. 7Ais illustrated. For example, in the coaching scene750, the virtual frame756that surrounds the virtual bullet758shot from the virtual gun718is illustrated. The virtual frame756is generated after the virtual frame710that surrounds the virtual bullet106is generated. The virtual bullet758is shot after the virtual bullet106is shot. Also, in the coaching scene750, the virtual comment754indicates to the user1to control the virtual user102to jump without an option for building a virtual wall. The coaching program530determines that at the time the virtual bullet758is shot, it is too late for the user1to build the virtual wall.
In one embodiment, when a segment753next to the segment752is selected by the user1via the game controller1, the processor system505processes the metadata606to display a position of the virtual bullet758that is closer to the virtual user102compared to a position of the virtual bullet758illustrated in the coaching scene750. The segment753corresponds to a coaching scene that includes one or more virtual objects from a first virtual scene that is consecutive in time from a time at which a second virtual scene is recorded by the game recorder502. The positions of the virtual bullet758in the first and second virtual scenes define movement of the virtual bullet758. The coaching scene750includes virtual objects from a playback of the second virtual scene.
In an embodiment, the coaching scene750includes the virtual tree105and the virtual wall107.
In one embodiment, the virtual comments702and754are examples of hints to the user1to increase the skill level to be above the preset threshold. For example, during a next gaming session, when the user1controls the virtual user102to jump or to build a virtual wall on at least one side of the virtual user102, chances of the virtual user102being hit by a virtual bullet from a side of or from behind the virtual user102are reduced. The virtual user102can stay alive longer during the gaming session and the game program528increases the skill level of the user1to be above the preset threshold. Each hint provides a reason for a death of the virtual user102. For example, the virtual user102died during a previous gaming session because the virtual user102did not build a virtual wall or did not jump during the previous gaming session.
FIG. 8Ais a diagram of an embodiment to illustrate a simultaneous display of the coaching scene700and another coaching scene800on the display device1. In addition to the coaching scene700, the coaching scene800is rendered by the coaching program530(FIG. 6) on the display device1. For example, the coaching session608(FIG. 6) includes the coaching scenes700and800. As another example, an input signal is received by the coaching program530from the client device602. The input signal indicates a selection of the game event identifier636(FIG. 6) of game event data recorded from the virtual scene110(FIG. 1) based on which the coaching scene700is generated and a selection of another game event identifier of game event data recorded from the virtual scene302(FIG. 3) based on which the coaching scene800is generated. In this example, when a selection of the segment732on the timeline scrubber730is received from the client device602, both the coaching scenes700and800are generated and rendered by the coaching program530. Also in this example, the timeline scrubber730is generated by the coaching program530and rendered along with the coaching scenes700and800.
The coaching scene800is generated based on the virtual scene302(FIG. 3). For example, the virtual scene302is recorded and one or more virtual objects from the virtual scene302are included in the coaching scene800. To illustrate, the virtual user102, the virtual bullet306, and the virtual gun108are included in the coaching scene800. As another example, the coaching scene800is generated based on the virtual scene302that is displayed and recorded at a time corresponding to the segment732. To illustrate, the virtual scene302is recorded a number of time segments prior to a time of the death of the virtual user102. The virtual scene302is recorded in the same manner as that of the virtual scene110from which the coaching scene700is generated.
The coaching scene800includes a user ID802, a virtual frame804, a virtual user806, a virtual gun808, a virtual frame810, a gun ID812, the virtual bullet306, a bullet ID816, and a virtual frame818. The user ID802, the virtual frame804, the virtual user806, the virtual gun808, the virtual frame810, the gun ID812, the bullet ID816, and the virtual frame818are examples of one or more overlays of virtual objects generated based on the metadata606(FIG. 6).
The virtual frame804extends around the virtual user806to highlight the virtual user806that is a reason for the death of the virtual user102. The virtual user806is controlled by another user via a game controller. The virtual frame810extends around the virtual gun808held by the virtual user806to highlight the virtual gun808. The gun ID812identifies a type of the virtual gun808. The user ID802is assigned to the other user that controls the virtual user806. The user ID802is assigned by the processor system505(FIG. 5A).
The virtual frame818extends around the virtual bullet306to highlight the virtual bullet306that is directed towards the virtual user102by the virtual user806. The bullet ID816identifies the virtual bullet306. Also, the coaching scene800includes the virtual comment702to facilitate coaching of the user1.
In one embodiment, a virtual comment displayed within the coaching scene800is different from the virtual comment702. For example, the virtual comment displayed within the coaching scene800is “Jump!” or “Build a wall” instead of “Jump now! OR Start building a wall now!”.
In an embodiment, one or more virtual objects in the coaching scene800are not highlighted by the coaching program530. For example, the coaching program530determines that there is no need for highlighting the virtual user102and the virtual gun108to increase the skill level of the user1, and determines not to generate the virtual frame704and the virtual frame706.
FIG. 8Bis a diagram of an embodiment of the display device1to illustrate a simultaneous rendering and display of the coaching scene700associated with the user ID1and another coaching scene850associated with the user ID N. The coaching scene850is generated by the coaching program530(FIG. 5A). The coaching scene850is generated for the user ID N. For example, when the input signal623(FIG. 6) includes the user ID1identifying the user1and includes the user ID N identifying another user who is assigned a user ID N, the coaching program530accesses the game event data504recorded for the user ID N from one or more memory devices of the processor system505and stores a portion of the game event data504as a portion of the selected game event data604in one or more memory devices of the processor system505. To illustrate, the one or more memory devices in which the game event data504is recorded are different than the one or more memory devices in which the selected game event data604is recorded. The one or more virtual objects generated based on the metadata606for the user ID N are overlaid by the coaching program530on the portion of the selected game event data604for the user ID N to generate the coaching scene850for the user ID N. The game event data504is generated during game play of the game by the other user having the user ID N.
The coaching scene850includes a virtual tree852, a virtual user854, a virtual gun856, a gun ID858, a user ID860, a virtual frame862, another virtual frame864, a virtual user866, a virtual gun868, a virtual frame870, another virtual frame872, a gun ID874, a virtual bullet876, a virtual frame878, a gun ID880, a virtual user882, a virtual frame884, a virtual gun886, a virtual frame888, a virtual wall890, another virtual wall892, a virtual health894, a user ID897, and a virtual ramp899. The coaching scene850further includes a virtual controller896, and a virtual button898. The user ID860, the virtual frame862, the virtual frame864, the virtual user854, the virtual gun856, the gun ID858, the virtual frame872, the gun ID874, the virtual frame888, the virtual frame884, the virtual frame870, the virtual controller896, and the virtual button898are examples of one or more virtual objects generated based on the metadata606for overlay.
The coaching scene850is generated for the user ID N based on a virtual scene that is generated upon execution of the game program528(FIG. 5A), and leads up to a death of the virtual user882. For example, the virtual scene from which the coaching scene850is generated includes one or more of virtual objects displayed in the coaching scene850. The virtual scene from which the coaching scene850is generated excludes one or more virtual objects generated based on the metadata606for display in the coaching scene850. To illustrate, the virtual scene from which the coaching scene850is generated includes the virtual bullet876, the virtual gun886, the virtual user882, the virtual ramp899, the virtual walls890and892, the virtual tree852, the user ID897, the virtual user866, and the virtual gun868. The virtual scene for the user ID N is generated after the other user accesses the game program528. The game program528is accessed by the other user when the user ID N is authenticated by the processor system505.
The virtual frame862surrounds the virtual user854and the virtual frame864surrounds the virtual gun856. The gun ID858identifies a type of the virtual gun856. The user ID860is assigned to a user that controls the virtual user854via a game controller or another type of controller. The virtual frame878extends around the virtual bullet876that is shot from the virtual gun856towards the virtual user882.
The virtual health894is a health of the virtual user882. The virtual frame884extends around the virtual user882and the virtual frame888extends around the virtual gun886. The virtual user882is standing on the virtual ramp899and is surrounded on two sides by the virtual walls890and892. The gun ID880identifies a type of the virtual gun886. Also, the gun ID874identifies a type of the virtual gun868. The virtual frame872surrounds the virtual gun868and the virtual frame870surrounds the virtual user866. The user ID897identifies a user who is controlling the virtual user866via a game controller or another type of controller.
The virtual controller896is an image of a controller, such as a keyboard, that is used by the other user to control the virtual user882. The virtual button898identifies a button on the controller that is represented by the virtual controller896. The button is selected by the other user at the time corresponding to the segment732during the play of the game.
The timeline scrubber730is displayed alongside or simultaneously with a display of the coaching scenes700and850. For example, the coaching program530(FIG. 6) renders the timeline scrubber730along with the simultaneous display of the coaching scenes700and850, for display of the timeline scrubber730with the coaching scenes700and850. When a selection of the segment732is received from the game controller1(FIG. 1) of the client device602(FIG. 6), the coaching program530renders the coaching scene850, which is generated based on a virtual scene that is recorded at a time corresponding to the segment732. The segment736indicates the time 0 at which a death of the virtual user882occurs during a play of the game by the other user that controls the virtual user882.
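The scrubber-to-recording mapping described above can be sketched as follows. This is a minimal illustration; the segment times, recording structure, and function names are assumptions for illustration and are not part of the patent:

```python
# Minimal sketch of mapping a timeline-scrubber segment selection to a
# recorded virtual scene. Segment times count down to 0, the moment of
# the game event (e.g., the death of a virtual user).
# All names here are illustrative assumptions.

def build_recording(frames_with_times):
    """frames_with_times: list of (seconds_before_event, frame_id) pairs."""
    return dict(frames_with_times)

def scene_for_segment(recording, segment_time):
    """Return the frame recorded at the time the selected segment represents."""
    return recording[segment_time]

recording = build_recording([(10, "frame_a"), (5, "frame_b"), (0, "frame_death")])
assert scene_for_segment(recording, 5) == "frame_b"      # mid-timeline selection
assert scene_for_segment(recording, 0) == "frame_death"  # time 0: the game event
```

A selection received from the controller would supply `segment_time`, and the returned frame would then be rendered with its overlay content.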
The virtual user882is shooting at the virtual user866. During the shootout, the virtual user854is shooting at the virtual user882. The virtual wall892protects the virtual user882from being injured by the virtual bullet876. The user1can learn from the coaching scene850to build a virtual wall, such as the virtual wall890, on a side of the virtual user102to protect the virtual user102from the virtual bullet106shot by the virtual user720.
In one embodiment, instead of the coaching program530, a coaching engine is used. As an example, an engine is a combination of software and hardware for executing functions described herein as being performed by the engine. To illustrate, the engine is a PLD or an ASIC that is programmed to perform the functions described herein as being performed by the engine.
In one embodiment, any virtual frame, described herein, surrounds a virtual object to highlight the virtual object.
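The highlight behavior of a virtual frame can be sketched as a rectangle computed from a virtual object's bounding box. The bounding-box format and padding value are assumptions, not details from the patent:

```python
# Sketch of generating a highlight frame that surrounds a virtual object's
# bounding box, for use as overlay content. Field layout is an assumption.

def highlight_frame(bbox, padding=4):
    """bbox = (x, y, width, height); returns an enclosing overlay rectangle."""
    x, y, w, h = bbox
    return (x - padding, y - padding, w + 2 * padding, h + 2 * padding)

# A virtual gun at (100, 50) sized 30x10 gets a slightly larger surrounding frame:
assert highlight_frame((100, 50, 30, 10)) == (96, 46, 38, 18)
```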
FIG. 8Cis a diagram of an embodiment of a system801to illustrate that instead of or in addition to rendering a coaching scene on the display screen1, the processor system505generates audio data805that is output as sound to the user1via a head phone803. The head phone803is worn by the user1to be proximate to ears of the user1. During the coaching session608, the coaching program530generates the audio data805that is output as sound to the user1. For example, during the coaching session608, instead of or in addition to displaying the virtual comment702, the coaching program530sends the audio data805to the headphone803. The audio data805includes a message, such as “Jump while shooting”, that is output as sound simultaneously with or instead of the virtual comment702displayed on the display screen1. The headphone803outputs the audio data805as the message to the user1.
In an embodiment, the virtual objects702,704,706,708,710,712,714,716,718,720,722,724,726,728(FIG. 7A),756,758(FIG. 7B),802,804,806,808,810,812,816,818(FIG. 8A),854,856,858,860,864,878,880,884,888,870,872,874, and897are examples of overlay content that is generated by the coaching program530based on the metadata606.
FIG. 9is a diagram of an embodiment of a system900to illustrate use of an inferred training engine902for generating the coaching session608(FIG. 6). The system900includes an artificial intelligence (AI) processor904, the game event data510for the user ID1, and the game event data504for the user IDs2-N. The AI processor904is a processor of the processor system505(FIG. 5A).
The inferred training engine902includes a feature extractor906, a feature classifier908, and a model910that is to be trained. An example of each of the feature extractor906, the feature classifier908, and the model910is an ASIC. Another example of each of the feature extractor906, the feature classifier908, and the model910is a PLD. An example of the model910is a network of circuit elements. Each circuit element has one or more inputs and one or more outputs. An input of a circuit element is coupled to one or more outputs of one or more circuit elements. To illustrate, the model910is a neural network or an artificial intelligence model. The feature classifier908is coupled to the feature extractor906and to the model910.
The inferred training engine902accesses, such as reads, the game event data510and the game event data504from one or more memory devices of the game recorder502. The feature extractor906extracts features from the game event data510and504. Once the features are extracted, the feature classifier908classifies the features that are extracted. The features that are classified are used to train the model910to determine a game event, such as death or decrease in health level, for which to initiate the coaching session608and to identify one or more virtual objects of a virtual scene that are to be associated with one or more overlays of virtual objects generated based on the metadata606.
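The extract-then-classify flow described above can be sketched as follows. The feature names and classification rules below are simplified assumptions drawn from the examples in the text, not the actual training pipeline:

```python
# Sketch of the extract -> classify steps of the inferred training engine.
# Feature names and rules are illustrative assumptions.

def extract_features(game_event_data):
    """Pull raw signals out of recorded game event data."""
    return {
        "skeleton_horizontal": game_event_data.get("skeleton_plane") == "horizontal",
        "health": game_event_data.get("health", 100),
    }

def classify(features):
    """Label the extracted features as game events used for training."""
    labels = []
    if features["skeleton_horizontal"]:
        labels.append("death")
    if features["health"] == 0:
        labels.append("low_health")
    return labels

events = classify(extract_features({"skeleton_plane": "horizontal", "health": 0}))
assert events == ["death", "low_health"]
```

The labeled events would then be fed to the model as training examples.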
FIG. 10is a flowchart to illustrate an embodiment of a method1000for training the model910. In an operation1002of the method1000, the feature extractor906identifies features from the game event data510and the game event data504(FIG. 9). For example, the feature extractor906determines that a skeleton of the virtual user102in the virtual scene202(FIG. 2) lies in a horizontal plane instead of a vertical plane, or determines that the virtual scene202includes the virtual drone204above the skeleton of the virtual user102for beaming the virtual user102, or determines that words, such as “I am going to die” or “I am dead”, are uttered by the virtual user102in the virtual scene202, or a combination thereof, to determine a death of the virtual user102in the virtual scene202. As another example, the feature extractor906determines that a health level of the virtual health of the virtual user102in the virtual scene202has decreased to zero to determine that the health level has decreased to be below the predetermined threshold.
An operation1004of the method1000occurs after the operation1002. In the operation1004, the feature classifier908classifies the features extracted from the game event data510and504. For example, the feature classifier908determines that because the skeleton of the virtual user102lies in the horizontal plane, the virtual user102has died. As another example, the feature classifier908determines that because the virtual scene202includes the virtual drone204above the skeleton of the virtual user102, the virtual user102has died. As yet another example, the feature classifier908determines that because the virtual scene202includes the words “I” and “dead” or “I” and “die” in the same sentence uttered by the virtual user102in the virtual scene202, the virtual user102has died. As another example, the feature classifier908determines that because the health level of the virtual user102in the virtual scene202has decreased to be below the predetermined threshold, health of the virtual user102is low.
An operation1006of the method1000occurs after the operation1004. In the operation1006, the model910is trained based on a number of game events of a sequence type to determine a probability of occurrence of a game event of the sequence type during a next gaming session. For example, the model910determines a probability that the virtual user102will die during a next gaming session or a probability that a health level of the virtual user102will decrease below the threshold during a next gaming session. To illustrate, upon determining that the virtual user102has died during at least 6 out of the past 10 gaming sessions, the model910determines that a probability that the virtual user102will die during a next gaming session is high, e.g., above a preset threshold. On the other hand, upon determining that the virtual user102has survived during at least 6 out of the past 10 gaming sessions, the model910determines that a probability that the virtual user102will die during a next gaming session is low, e.g., below the preset threshold.
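The session-history heuristic above can be sketched directly. The 6-of-10 figures come from the example in the text; the preset threshold value and function names are assumptions:

```python
# Sketch of the session-history heuristic: if the virtual user died in at
# least 6 of the past 10 gaming sessions, the predicted probability of a
# death in the next session is treated as high (above a preset threshold).
# The threshold value and names are illustrative assumptions.

PRESET_THRESHOLD = 0.5

def death_probability(session_outcomes):
    """session_outcomes: list of booleans, True = virtual user died that session."""
    return sum(session_outcomes) / len(session_outcomes)

def recommend_coaching(session_outcomes):
    """High probability -> recommend initiating the coaching session."""
    return death_probability(session_outcomes) > PRESET_THRESHOLD

assert recommend_coaching([True] * 6 + [False] * 4)      # died 6 of 10 -> recommend
assert not recommend_coaching([False] * 6 + [True] * 4)  # survived 6 of 10 -> skip
```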
An operation1008of the method1000follows the operation1006. The operation1008is executed by the processor system505. Upon determining that the probability of occurrence of a game event of a sequence type during the next gaming session is low, the processor system505does not recommend, in the operation1008, that the coaching session608be initiated for the next gaming session. On the other hand, upon determining that the probability of occurrence of a game event of a sequence type during the next gaming session is high, the processor system505recommends, in the operation1008, that the coaching session608be initiated for the next gaming session. For example, the processor system505generates a message and renders the message for display on the display device1. The message queries the user1whether the user1wishes to initiate the coaching session608. As another example, the processor system505generates audio data including the message to query the user1whether the user1wishes to initiate the coaching session608, and sends the audio data to the head phone803(FIG. 8C). The audio data is output as sound by the head phone803to the user1.
Upon viewing the message displayed on the display screen1or listening to the message, which is output as sound by the head phone803, the user1selects one or more buttons on the game controller1to generate an input signal indicating that the coaching session608be initiated. Upon receiving the input signal indicating that the coaching session608be initiated, the processor system505initiates the coaching session608. For example, the processor system505renders the coaching scene700(FIG. 7A) or the coaching scene750(FIG. 7B) or both the coaching scenes700and800(FIG. 8A) or both the coaching scenes700and850(FIG. 8B) for display on the display screen1. On the other hand, upon receiving an input signal indicating that the coaching session608not be initiated, the processor system505does not initiate the coaching session608.
FIG. 11Ais a diagram of an embodiment of a system1100to illustrate a communication via a router and modem1104and a computer network1102between the processor system505and multiple devices, which include a display device1106and a hand-held controller1108. Examples of the display device1106include the display device1(FIG. 1), an LCD display device, an LED display device, a plasma display device, and an HMD. Examples of the hand-held controller1108include a touch screen, a keypad, a mouse, the game controller1, and a keyboard. Each of the display device1106and the hand-held controller1108is used by the user1. Also, each of the users2-N uses a hand-held controller having the same structure and function as that of the hand-held controller1108and a display device having the same structure and function as that of the display device1106. The display device1106and the hand-held controller1108are examples of the client device602(FIG. 6).
The system1100further includes the router and modem1104, the computer network1102, and the processor system505. The system1100also includes a headphone1110and the display device1106. The display device1106includes a display screen1112, such as an LCD display screen, an LED display screen, or a plasma display screen. The display device1(FIG. 1) is an example of the display device1106. An example of the computer network1102includes the Internet or an intranet or a combination thereof. An example of the router and modem1104includes a gateway device. Another example of the router and modem1104includes a router device and a modem device.
The display screen1112is coupled to the router and modem1104via a wired connection. Examples of a wired connection, as used herein, include a transfer cable, which transfers data in a serial manner, or in a parallel manner, or by applying a universal serial bus (USB) protocol.
The hand-held controller1108includes controls1118, a digital signal processor system (DSPS)1120, and a communication device1122. The controls1118are coupled to the DSPS1120, which is coupled to the communication device1122. Examples of the controls1118include buttons and joysticks. Examples of the communication device1122include a communication circuit that enables communication using a wireless protocol, such as Wi-Fi™ or Bluetooth™, between the communication device1122and the router and modem1104.
The communication device1122is coupled to the headphone1110via a wired connection or a wireless connection. Examples of a wireless connection, as used herein, include a connection that applies a wireless protocol, such as a Wi-Fi™ or Bluetooth™ protocol. Also, the communication device1122is coupled to the router and modem1104via a wireless connection. Examples of a wireless connection include a Wi-Fi™ connection and a Bluetooth™ connection. The router and modem1104is coupled to the computer network1102, which is coupled to the processor system505.
During the play of the game, the processor system505executes the game program528to generate image frame data from one or more game states of the game and applies a network communication protocol, such as transmission control protocol over Internet protocol (TCP/IP), to the image frame data to generate one or more packets and sends the packets via the computer network1102to the router and modem1104. The modem of the router and modem1104applies the network communication protocol to the one or more packets received from the computer network1102to obtain or extract the image frame data, and provides the image frame data to the router of the router and modem1104. The router routes the image frame data via the wired connection between the router and the display screen1112to the display screen1112for display of one or more images of the game based on the image frame data received within the one or more packets.
During the display of one or more images of the game, the game recorder502records the game event data510and504, which is used to generate the coaching session608or train the model910or a combination thereof. Also, during execution of the game program528, the controls1118of the hand-held controller1108are selected or moved by the user1to generate an input signal, such as the input signal616, or623, or630, or638, or642, or an input signal indicating a selection of a segment of the timeline scrubber730, or any other input signal described herein, which is processed by the DSPS1120. The DSPS1120processes, such as measures or samples or filters or amplifies or a combination thereof, the input signal to output a processed input signal, which has the same information as that within the input signal. For example, the DSPS1120identifies a button of the game controller1selected by the user1. As another example, the DSPS1120identifies whether a joystick of the game controller1is moved or a button of the game controller1is selected by the user1. The processed input signal is sent from the DSPS1120to the communication device1122. The communication device1122applies the wireless protocol to the processed input signal to generate one or more wireless packets and sends the wireless packets to the router and modem1104. The wireless packets include the same information as that included within the processed input signal.
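The DSPS step above — turning a raw controller signal into a processed input signal that identifies which control was used — can be sketched as follows. The signal format and control identifiers are illustrative assumptions:

```python
# Sketch of the DSPS step: turn a raw controller signal into a processed
# input signal that names which control was used and preserves the same
# information. The dict layout is an illustrative assumption.

def process_input_signal(raw):
    """raw: dict with a 'control' id and a raw 'value' sample."""
    kind = "joystick" if raw["control"].startswith("stick") else "button"
    return {"control": raw["control"], "kind": kind, "value": raw["value"]}

processed = process_input_signal({"control": "button_x", "value": 1})
assert processed["kind"] == "button"
assert process_input_signal({"control": "stick_left", "value": 0.7})["kind"] == "joystick"
```

The processed signal would then be handed to the communication device for wireless packetization.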
The router of the router and modem1104receives the wireless packets from the communication device1122, and applies the wireless protocol to obtain or extract the processed input signal from the wireless packets. The router of the router and modem1104provides the processed input signal to the modem of the router and modem1104. The modem applies the network communication protocol to the processed input signal to generate one or more network packets. For example, the modem determines that the processed input signal is to be sent to the processor system505that is executing the game program528, and embeds a network address of the processor system505within the one or more network packets. The modem sends the one or more network packets via the computer network1102to the processor system505.
The processor system505applies the network communication protocol to the one or more network packets received from the router and modem1104to obtain or extract the information within the processed input signal, and processes the information in a manner explained above with reference toFIG. 6to initiate the coaching session608or to recommend initiation of the coaching session608(FIG. 6). The processor system505generates an output signal, such as the output signal618, or624, or634(FIG. 6), or any other output signal, described herein, or any other output signal including image frame data of a coaching scene, described herein, or an output signal including audio data, described herein, and applies the network communication protocol to the output signal to generate one or more network packets. The processor system505sends the one or more network packets via the computer network1102to the router and modem1104.
The modem of the router and modem1104applies the network communication protocol to the one or more network packets received via the computer network1102to obtain or extract the output signal. The router of the router and modem1104applies the wireless protocol to the output signal to generate one or more wireless packets and sends the wireless packets to the communication device1122of the hand-held controller1108. The communication device1122of the hand-held controller1108applies the wireless protocol to the one or more wireless packets received from the router and modem1104to obtain or extract the output signal and sends the output signal to the headphone1110for output of the audio data as sound to the user1. For example, the communication device1122applies a wired protocol, such as a universal serial bus (USB) protocol, to generate one or more packets having the audio data and sends the one or more packets via the wired connection to the headphone1110. As another example, the communication device1122applies the wireless protocol to the audio data to generate one or more wireless packets and sends the one or more wireless packets via the wireless connection to the headphone1110.
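The protocol layering traversed above — wrap a payload for one hop, unwrap it, then re-wrap it for the next hop — can be sketched with toy framing. The header strings below stand in for real TCP/IP, Wi-Fi™, or USB framing and are assumptions, not the actual packet formats:

```python
# Sketch of the hop-by-hop protocol layering between the processor system,
# the router and modem, the hand-held controller, and the headphone.
# Header strings are illustrative stand-ins for real protocol framing.

def wrap(payload, protocol):
    return {"protocol": protocol, "payload": payload}

def unwrap(packet, protocol):
    assert packet["protocol"] == protocol, "wrong protocol layer"
    return packet["payload"]

audio_data = b"Jump while shooting"
network_pkt = wrap(audio_data, "tcp/ip")           # processor system -> computer network
extracted = unwrap(network_pkt, "tcp/ip")          # modem extracts the output signal
wireless_pkt = wrap(extracted, "wifi")             # router -> hand-held controller
assert unwrap(wireless_pkt, "wifi") == audio_data  # the headphone gets the same bytes
```

Each hop changes only the framing; the payload carried end to end is unchanged.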
Also, the router of the router and modem1104sends the output signal to the display screen1112. Upon receiving the output signal, the display screen1112displays one or more image frames according to the image frame data of the output signal.
In one embodiment, the communication device1122communicates with the router and modem1104via a wired connection, such as a cable.
In one embodiment, the display screen1112is coupled to the router and modem1104via a communication device, such as a communication device that applies a wireless communication protocol. The communication device is a part of the display device1106. For example, the display screen1112is coupled to the communication device. The router of the router and modem1104applies the wireless protocol to the image frame data received via the computer network1102to generate one or more wireless packets and sends the one or more wireless packets to the communication device of the display device1106. The communication device applies the wireless protocol to the one or more wireless packets to extract or obtain the image frame data and sends the image frame data to the display screen1112for display of one or more images of a coaching scene, described herein.
In one embodiment, the display device1106and the hand-held controller1108are integrated within a mobile device, such as a smartphone or a tablet or a laptop.
FIG. 11Bis a diagram of an embodiment of a system1120to illustrate that the processor system505can be implemented within a game console1122. The system1120includes the display device1106, the hand-held controller1108, the head phone1110, the game console1122, the router and modem1104, the computer network1102, and a server system1124. An example of the game console1122is a video game console or a computer or a combination of a central processing unit (CPU) and a graphics processing unit (GPU). To illustrate, the game console1122is a Sony PlayStation™ or a Microsoft Xbox™. The game console1122includes the processor system505and a communication device1126, such as a Wi-Fi™ communication device or a Bluetooth™ communication device. As an example, a processor system as used herein, includes one or more CPUs and one or more GPUs, and the one or more CPUs are coupled to the one or more GPUs.
An example of the server system1124includes one or more servers within one or more data centers. To illustrate, a server can be a game console or a server blade. As another example, the server system1124includes one or more virtual machines. The communication device1126is coupled to the communication device of the display device1106via a wireless connection, such as a Wi-Fi™ connection or a Bluetooth™ connection. Moreover, the communication device1126is coupled to the communication device1122of the hand-held controller1108via a wireless connection. The communication device1126is coupled to the processor system505. The processor system505is coupled to the router and modem1104via a wired connection. The router and modem1104is coupled via the computer network1102to the server system1124.
The processor system505instead of or in conjunction with the server system1124executes the game for display of virtual scenes on the display screen1112of the display device1106. For example, in response to receiving login information that is provided by the user1via the game controller1, the processor system505sends a request to the server system1124via the computer network1102to determine whether the login information is valid. Upon receiving an indication from the server system1124via the computer network1102that the login information received from the game controller1is valid, the processor system505executes the game program528for play of the game by the user1via the game controller1and the game console1122. On the other hand, upon receiving an indication from the server system1124via the computer network1102that the login information received from the game controller1is invalid, the processor system505does not execute the game program528for play by the user1via the game controller1and the game console1122.
The communication device1126receives the wireless packets having the input signal, such as the input signal616, or623, or630, or638, or642(FIG. 6), or an input signal indicating a selection of a segment of the timeline scrubber730, or any other input signal described herein, from the hand-held controller1108, and applies the wireless protocol to the wireless packets to extract the input signal from the wireless packets, and provides the input signal to the processor system505. The processor system505generates an output signal, such as the output signal618, or624, or634(FIG. 6), or any other output signal, described herein, or any other output signal including image frame data of a coaching scene, described herein, or an output signal including audio data, described herein, based on the input signal in a manner described above. For example, the processor system505generates data for the coaching session608. The processor system505provides the output signal to the communication device1126. The communication device1126applies the wireless protocol to the output signal to generate one or more wireless packets and sends the wireless packets to the communication device1122of the hand-held controller1108or to the display device1106, or to both the display device1106and the hand-held controller1108.
In one embodiment, some of the functions described herein as being performed by the processor system505are performed by the processor system505of the game console1122and the remaining functions, described herein as being performed by the processor system505, are instead performed by the server system1124.
In an embodiment, the processor system505is coupled to the display device1106via a wired connection. The output signal is sent from the processor system via the wired connection to the display device1106for display of one or more images of a coaching scene, described herein, or a virtual scene, described herein, on the display device1106.
In one embodiment, the game is stored on the game console1122and is a non-networked game. In this embodiment, the game is a single player game or a multi-player game and the game console1122does not communicate to the server system1124to access any portion of the game from the server system1124.
FIG. 11Cis a diagram of an embodiment of a system1140to illustrate communication between a smart phone1142and the processor system505via the computer network1102without using the router and modem1104between the computer network1102and the smart phone1142. The system1140includes the smart phone1142, the head phone1110, a cellular network1144, the computer network1102, and the processor system505. The smart phone1142is an example of the client device602(FIG. 6).
The smart phone1142is coupled to the cellular network1144via a cellular wireless connection, such as a fourth-generation cellular wireless (4G) connection or a fifth-generation cellular wireless (5G) connection. The cellular network1144is coupled to the computer network1102, which is coupled to the processor system505.
The smart phone1142generates one or more packets by applying a cellular communication protocol, such as the 4G or the 5G protocol, to the input signal, such as the input signal616, or623, or630, or638, or642, or an input signal indicating a selection of a segment of the timeline scrubber730, or any other input signal described herein, and sends the one or more packets to the cellular network1144. The cellular network1144receives the one or more packets and applies the cellular communication protocol to obtain or extract the input signal, and applies the network communication protocol to the input signal to generate one or more network packets. The one or more network packets generated by the cellular network1144are sent via the computer network1102to the processor system505. The processor system505processes the one or more network packets received from the cellular network1144in a manner described above to generate an output signal, such as the output signal618, or624, or634, or any other output signal, described herein, or any other output signal including image frame data of a coaching scene, described herein, or an output signal including audio data, described herein, and sends one or more network packets including the output signal via the computer network1102to the cellular network1144.
The cellular network1144applies the network communication protocol to the one or more network packets received from the processor system505to extract or obtain the output signal, and applies the cellular communication protocol to the output signal to generate one or more packets. The cellular network1144sends the one or more packets including the output signal to the smart phone1142.
FIG. 12Ais a diagram of an embodiment of the headphone1110. The headphone1110includes a communication device1202, a digital-to-analog (D/A) converter1204, an audio amplifier1206, and a speaker1208. An example of the communication device1202is a communication circuit that applies the wired protocol or the wireless protocol.
The communication device1202is coupled to the communication device1122(FIG. 11A) of the hand-held controller1108or to the smart phone1142(FIG. 11C). The digital-to-analog converter1204is coupled to the communication device1202and the audio amplifier1206is coupled to the digital-to-analog converter1204. Also, the speaker1208is coupled to the audio amplifier1206.
The communication device1202receives one or more packets having the audio data805(FIG. 8C) from the communication device1122(FIG. 11A) or from the smart phone1142(FIG. 11C), and applies a protocol, such as the wired protocol or the wireless protocol, to extract or obtain the audio data805from the one or more packets. The communication device1202sends the audio data805to the digital-to-analog converter1204. The digital-to-analog converter1204converts the audio data805from a digital format to an analog format to output analog audio signals. The digital-to-analog converter1204sends the analog audio signals output based on the audio data805to the audio amplifier1206. The audio amplifier1206amplifies, such as increases an amplitude or a magnitude, of the analog audio signals to output amplified audio signals, which are electrical signals. The speaker1208converts electrical energy of the amplified audio signals into sound energy to output sounds to be heard by the user1(FIG. 1).
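The headphone pipeline above — digital audio data converted to analog levels and then amplified before driving the speaker — can be sketched numerically. The sample format, full-scale value, and gain are illustrative assumptions:

```python
# Sketch of the headphone pipeline of FIG. 12A: digital samples ->
# digital-to-analog conversion -> amplification. Constants are assumptions.

def d_to_a(samples, full_scale=32768, volts=1.0):
    """Map 16-bit integer samples to analog voltage levels."""
    return [s / full_scale * volts for s in samples]

def amplify(analog, gain=2.0):
    """Increase the amplitude of the analog signal, as the audio amplifier does."""
    return [v * gain for v in analog]

digital = [0, 16384, -16384]
amplified = amplify(d_to_a(digital))
assert amplified == [0.0, 1.0, -1.0]  # larger swing than the converter's output
```

The amplified electrical signal is what the speaker then converts into sound energy.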
In one embodiment, instead of the speaker1208, multiple speakers are used.
FIG. 12Bis a diagram of an embodiment of the display device1250for displaying an image of a coaching scene or a virtual scene, described herein, on the display screen1112. The display device1250is an example of the display device1(FIG. 1). The display device1250includes a communication device1252and the display screen1112. Examples of the display device1250include an LCD display device, an LED display device, and a plasma display device. Examples of the display screen1112include an LCD display screen, an LED display screen, and a plasma display screen. To illustrate, the display device1250is a display device of the smart phone1142or of the game controller1or of a tablet or of a computer. Examples of a computer include a desktop computer and a laptop computer. Examples of the communication device1252include a communication circuit that applies the wired or wireless protocol for communication of data. The communication device1252is coupled to the display screen1112.
Instead of or in addition to generating other forms of data, such as the audio data805(FIG. 8C), the processor system505generates image frame data of a coaching scene or a virtual scene, described herein. In the same manner in which the processor system505generates one or more packets having the audio data805, the processor system505or the communication device1126(FIG. 11B) generates one or more packets by applying a protocol, such as the network communication protocol, the wired protocol, or the wireless protocol, to the image frame data and sends the one or more packets to the display device1250. For example, with reference toFIG. 11A, the processor system505applies the network communication protocol to the image frame data to generate one or more network packets and sends the one or more network packets via the computer network1102to the router and modem1104(FIG. 11A). The router and modem1104processes the one or more network packets having the image frame data in the same manner in which the router and modem1104processes the one or more network packets having the audio data805to obtain the image frame data from the one or more network packets, applies the wireless protocol to the image frame data to generate one or more wireless packets, and sends the one or more wireless packets to the communication device1252. As another example, with reference toFIG. 11B, the communication device1126of the game console1122applies the wireless protocol to the image frame data to generate one or more wireless packets, and sends the wireless packets to the communication device1252of the display device1250. As another example, with reference toFIG. 11C, the cellular network1144receives one or more network packets having the image frame data via the computer network1102from the processor system505and applies the network communication protocol to extract the image frame data from the one or more network packets, and applies the cellular communication protocol to the image frame data to generate one or more packets. The cellular network1144sends the one or more packets having the image frame data to the smart phone1142.
Referring back toFIG. 12B, the communication device1252receives the one or more packets having the image frame data and applies a protocol, such as the cellular communication protocol, the wired protocol, or the wireless protocol, to extract or obtain the image frame data from the one or more packets, and sends the image frame data to the display screen1112. Upon receiving the image frame data, the display screen1112displays the coaching scene or the virtual scene, described herein.
FIG. 13is a flow diagram conceptually illustrating various operations which are performed for streaming a cloud video game to a client device, in accordance with implementations of the disclosure. Examples of the client device include a game controller, a smart phone, a game console, and a computer. A game server1302executes a video game and generates raw (uncompressed) video1304and audio1306. The game event data510or the game event data504or a combination thereof is an example of a recording of a combination of the video1304and audio1306. The game server1302is an example of the processor system505. The video1304and audio1306are captured and encoded for streaming purposes, as indicated at reference1308in the illustrated diagram. The encoding provides for compression of the video and audio streams to reduce bandwidth usage and optimize the gaming experience. Examples of encoding formats include H.265/MPEG-H, H.264/MPEG-4, H.263/MPEG-4, H.262/MPEG-2, WMV, VP6/7/8/9, etc.
Encoded audio1310and encoded video1312are further packetized into network packets, as indicated at reference numeral1314, for purposes of transmission over a computer network1320, which is an example of the computer network1102(FIG. 11A). In some embodiments, the network packet encoding process also employs a data encryption process, thereby providing enhanced data security. In the illustrated implementation, audio packets1316and video packets1318are generated for transport over the computer network1320.
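For illustration, the packetization indicated at reference numeral1314can be sketched as follows. This is a minimal Python sketch with a hypothetical five-byte header (stream id, sequence number, packet count); a production cloud-gaming stack would use an established transport protocol such as RTP rather than this invented framing.

```python
import struct

def packetize(stream_id: int, payload: bytes, mtu: int = 1200) -> list:
    """Split an encoded audio/video byte stream into network packets.

    Each packet carries a small header (stream id, sequence number,
    total packet count) so the receiver can reassemble the stream.
    The header layout is hypothetical, chosen only for illustration.
    """
    chunks = [payload[i:i + mtu] for i in range(0, len(payload), mtu)] or [b""]
    total = len(chunks)
    return [struct.pack(">BHH", stream_id, seq, total) + chunk
            for seq, chunk in enumerate(chunks)]

def depacketize(packets: list) -> bytes:
    """Reassemble the payload, tolerating out-of-order arrival by
    sorting on the sequence number in each packet header."""
    ordered = sorted(packets, key=lambda p: struct.unpack(">BHH", p[:5])[1])
    return b"".join(p[5:] for p in ordered)
```

Encryption, as noted above, would be applied to each packet body after this framing step.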
The game server1302additionally generates haptic feedback data1322, which is also packetized into network packets for network transmission. In the illustrated implementation, haptic feedback packets1324are generated for transport over the computer network1320.
The foregoing operations of generating the raw video and audio and the haptic feedback data are performed on the game server1302of a data center, and the operations of encoding the video and audio, and packetizing the encoded audio/video and haptic feedback data for transport are performed by the streaming engine of the data center. As indicated at reference1320, the audio, video, and haptic feedback packets are transported over the computer network. As indicated at reference1326, the audio packets1316, video packets1318, and haptic feedback packets1324, are disintegrated, e.g., parsed, etc., by the client device602(FIG. 6) to extract encoded audio1328, encoded video1330, and haptic feedback data1322at the client device602from the network packets. If data has been encrypted, then the data is also decrypted. The encoded audio1328and encoded video1330are then decoded by the client device, as indicated at reference1334, to generate client-side raw audio and video data for rendering on a display device1340of the client device602. The haptic feedback data1322is processed by the processor of the client device602to produce a haptic feedback effect at a controller device1342or other interface device, e.g., the HMD, etc., through which haptic effects can be rendered. The controller device1342is an example of the game controller1. One example of a haptic effect is a vibration or rumble of the controller device1342.
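The client-side parsing at reference1326(separating the mixed packet stream back into audio, video, and haptic feedback streams) can be sketched as a simple demultiplexer. The one-byte stream-id prefix here is an assumption carried over from the hypothetical framing above, not the patent's actual format.

```python
def demultiplex(packets: list) -> dict:
    """Sort a mixed packet stream into per-stream buffers keyed by a
    one-byte stream id prefix (0=audio, 1=video, 2=haptic).
    Hypothetical framing for illustration only."""
    streams = {0: [], 1: [], 2: []}
    for p in packets:
        streams[p[0]].append(p[1:])  # strip the id byte, keep the payload
    return {sid: b"".join(chunks) for sid, chunks in streams.items()}
```

Each reassembled buffer would then be handed to the corresponding decoder (audio, video) or to the haptics driver.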
It will be appreciated that a video game is responsive to user inputs, and thus a procedural flow similar to that described above, but in the reverse direction from client device to server, is performed for transmission and processing of user input. As shown, the controller device1342or another input device, e.g., the body part of the user1, etc., or a combination thereof generates input data1348. Any of the control input signals616,623,630,638, and642(FIG. 6) is an example of the input data1348. The input data1348is packetized at the client device for transport over the computer network1320to the data center. Input data packets1346are unpacked and reassembled by the game server1302to define the input data1348on the data center side. The input data1348is fed to the game server1302, which processes the input data1348to generate a game state of the game.
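The client-to-server leg above amounts to serializing a controller snapshot, sending it, and deserializing it at the game server. A minimal sketch, assuming a hypothetical wire format of a button bitmask plus two analog-stick axes:

```python
import struct

def encode_input(buttons: int, lx: float, ly: float) -> bytes:
    """Pack a controller snapshot (16-bit button bitmask plus left-stick
    x/y axes as 32-bit floats) for transport to the game server.
    The wire format is an illustrative assumption."""
    return struct.pack(">Hff", buttons, lx, ly)

def decode_input(data: bytes) -> dict:
    """Server-side: recover the snapshot used to advance the game state."""
    buttons, lx, ly = struct.unpack(">Hff", data)
    return {"buttons": buttons, "lx": lx, "ly": ly}
```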
During transport via the computer network1320of the audio packets1316, the video packets1318, and haptic feedback packets1324, in some embodiments, the transmission of data over the computer network1320is monitored to ensure a quality of service. For example, network conditions of the computer network1320are monitored as indicated by reference1350, including both upstream and downstream network bandwidth, and the game streaming is adjusted in response to changes in available bandwidth. That is, the encoding and decoding of network packets is controlled based on present network conditions, as indicated by reference1352.
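The bandwidth-driven adjustment at references1350and1352is, in essence, adaptive bitrate selection: the encoder is reconfigured to the highest rate the measured network conditions can sustain. A minimal sketch, with an illustrative bitrate ladder and headroom factor that are assumptions, not values from the patent:

```python
def select_bitrate(available_kbps: float,
                   ladder=(500, 1500, 3000, 6000, 12000)) -> int:
    """Choose the highest encoding bitrate (kbps) that fits within a
    safety margin of the measured bandwidth; fall back to the lowest
    rung when even that does not fit."""
    usable = available_kbps * 0.8  # keep 20% headroom for input/haptic traffic
    candidates = [r for r in ladder if r <= usable]
    return max(candidates) if candidates else ladder[0]
```

In a running system this function would be re-evaluated continuously as upstream and downstream measurements change, and the encoder at reference1308reconfigured accordingly.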
FIG. 14is a block diagram of an embodiment of a game console1400that is compatible for interfacing with a display device of the client device and is capable of communicating via the computer network1320with a game hosting system, such as the processor system505(FIG. 5). The game console1122(FIG. 11) is an example of the game console1400. The game console1400is located within a data center A or is located at a location at which the user1is located. In some embodiments, the game console1400is used to execute a game that is displayed on an HMD1405. The game console1400is provided with various peripheral devices connectable to the game console1400. The game console1400has a cell processor1428, a dynamic random access memory (XDRAM) unit1426, a Reality Synthesizer graphics processor unit1430with a dedicated video random access memory (VRAM) unit1432, and an input/output (I/O) bridge1434. The game console1400also has a Blu Ray® Disk read-only memory (BD-ROM) optical disk reader1440for reading from a disk1440aand a removable slot-in hard disk drive (HDD)1436, accessible through the I/O bridge1434. Optionally, the game console1400also includes a memory card reader1438for reading compact flash memory cards, memory Stick® memory cards and the like, which is similarly accessible through the I/O bridge1434. The I/O bridge1434also connects to USB 2.0 ports1424, a gigabit Ethernet port1422, an IEEE 802.11b/g wireless network (Wi-Fi™) port1420, and a Bluetooth® wireless link port1418capable of supporting Bluetooth connections.
In operation, the I/O bridge1434handles all wireless, USB and Ethernet data, including data from a game controller and from the HMD1405. For example, when the user1is playing the game generated by execution of a portion of a game code, such as the game program528(FIG. 11A), the I/O bridge1434receives input data or an input signal, described herein, from the game controllers1342or1403and/or from the HMD1405via a Bluetooth link and directs the input data to the cell processor1428, which updates a current state of the game accordingly. As an example, a camera within the HMD1405captures a gesture of the user1to generate an image representing the gesture. The game controller1342is an example of the game controller1, which is an example of the HHC.
The wireless, USB and Ethernet ports also provide connectivity for other peripheral devices in addition to the game controllers1342and1403and the HMD1405, such as, for example, a remote control1404, a keyboard1406, a mouse1408, a portable entertainment device1410, such as, e.g., a Sony Playstation Portable® entertainment device, etc., a video camera, such as, e.g., an EyeToy® video camera1412, etc., a microphone headset1414, and a microphone1415. The portable entertainment device1410is an example of a game controller. In some embodiments, such peripheral devices are connected to the game console1400wirelessly, for example, the portable entertainment device1410communicates via a Wi-Fi™ ad-hoc connection, whilst the microphone headset1414communicates via a Bluetooth link. The microphone headset1414is an example of the head phone803(FIG. 8C).
The provision of these interfaces means that the game console1400is also potentially compatible with other peripheral devices such as digital video recorders (DVRs), set-top boxes, digital cameras, portable media players, Voice over Internet protocol (IP) telephones, mobile telephones, printers and scanners.
In addition, a legacy memory card reader1416is connected to the game console1400via the USB port1424, enabling the reading of memory cards1448of a kind used by the game console1400. The game controllers1342and1403and the HMD1405are operable to communicate wirelessly with the game console1400via the Bluetooth link1418, or to be connected to the USB port1424, thereby also receiving power by which to charge batteries of the game controllers1342and1403and the HMD1405. In some embodiments, each of the game controllers1342and1403and the HMD1405includes a memory, a processor, a memory card reader, permanent memory, such as, e.g., flash memory, etc., light emitters such as, e.g., an illuminated spherical section, light emitting diodes (LEDs), or infrared lights, etc., microphone and speaker for ultrasound communications, an acoustic chamber, a digital camera, an internal clock, a recognizable shape, such as, e.g., a spherical section facing the game console1400, and wireless devices using protocols, such as, e.g., Bluetooth, Wi-Fi, etc.
The game controller1342is a controller designed to be used with two hands by the user1, and game controller1403is a single-hand controller with an attachment. The HMD1405is designed to fit on top of a head and/or in front of eyes of the user1. In addition to one or more analog joysticks and conventional control buttons, each game controller1342and1403is susceptible to three-dimensional location determination. Similarly, the HMD1405is susceptible to three-dimensional location determination. Consequently, in some embodiments, gestures and movements by the user1of the game controller1342and1403and of the HMD1405are translated as inputs to a game in addition to or instead of conventional button or joystick commands. Optionally, other wirelessly enabled peripheral devices, such as, e.g., the Playstation™ Portable device, etc., are used as a controller. In the case of the Playstation™ Portable device, additional game or control information, e.g., control instructions or number of lives, etc., is provided on a display screen of the device. In some embodiments, other alternative or supplementary control devices are used, such as, e.g., a dance mat (not shown), a light gun (not shown), a steering wheel and pedals (not shown), bespoke controllers, etc. Examples of bespoke controllers include a single or several large buttons for a rapid-response quiz game (also not shown).
The remote control1404is also operable to communicate wirelessly with the game console1400via the Bluetooth link1418. The remote control1404includes controls suitable for the operation of the Blu Ray Disk BD-ROM reader1440and for navigation of disk content.
The Blu Ray™ Disk BD-ROM reader1440is operable to read CD-ROMs compatible with the game console1400, in addition to conventional pre-recorded and recordable CDs, and so-called Super Audio CDs. The Blu Ray™ Disk BD-ROM reader1440is also operable to read digital video disk-ROMs (DVD-ROMs) compatible with the game console1400, in addition to conventional pre-recorded and recordable DVDs. The Blu Ray™ Disk BD-ROM reader1440is further operable to read BD-ROMs compatible with the game console1400, as well as conventional pre-recorded and recordable Blu-Ray Disks.
The game console1400is operable to supply audio and video, either generated or decoded via the Reality Synthesizer graphics unit1430, through audio connectors1450and video connectors1452to a display and sound output device1442, such as, e.g., a monitor or television set, etc., having a display screen1444and one or more loudspeakers1446, or to supply the audio and video via the Bluetooth® wireless link port1418to the display device of the HMD1405. The audio connectors1450, in various embodiments, include conventional analogue and digital outputs whilst the video connectors1452variously include component video, S-video, composite video, and one or more High Definition Multimedia Interface (HDMI) outputs. Consequently, video output may be in formats such as phase alternating line (PAL) or National Television System Committee (NTSC), or in 720p, 1080i or 1080p high definition. Audio processing, e.g., generation, decoding, etc., is performed by the cell processor1428. An operating system of the game console1400supports Dolby® 5.1 surround sound, Dolby® Theatre Surround (DTS), and the decoding of 7.1 surround sound from Blu-Ray® disks. The display and sound output device1442is an example of the display device1(FIG. 1).
In some embodiments, a video camera, e.g., the video camera1412, etc., comprises a single charge coupled device (CCD), an LED indicator, and hardware-based real-time data compression and encoding apparatus so that compressed video data is transmitted in an appropriate format such as an intra-image based motion picture expert group (MPEG) standard for decoding by the game console1400. An LED indicator of the video camera1412is arranged to illuminate in response to appropriate control data from the game console1400, for example, to signify adverse lighting conditions, etc. Some embodiments of the video camera1412variously connect to the game console1400via a USB, Bluetooth or Wi-Fi communication port. Various embodiments of a video camera include one or more associated microphones and also are capable of transmitting audio data. In several embodiments of a video camera, the CCD has a resolution suitable for high-definition video capture. In use, images captured by the video camera are incorporated within a game or interpreted as game control inputs. In another embodiment, a video camera is an infrared camera suitable for detecting infrared light.
In various embodiments, for successful data communication to occur with a peripheral device, such as, for example, a video camera or remote control via one of the communication ports of the game console1400, an appropriate piece of software, such as, a device driver, etc., is provided.
In some embodiments, the aforementioned system devices, including the game console1400, the HHC, and the HMD1405enable the HMD1405to display and capture video of an interactive session of a game. The system devices initiate an interactive session of a game, the interactive session defining interactivity between the user1and other users and the game. The system devices further determine an initial position and orientation of the HHC, and/or the HMD1405operated by the user1. The game console1400determines a current state of a game based on the interactivity between the user1and the game. The system devices track a position and orientation of the HHC and/or the HMD1405during an interactive session of the user1with a game. The system devices generate a spectator video stream of the interactive session based on a current state of a game and the tracked position and orientation of the HHC and/or the HMD1405. In some embodiments, the HHC renders the spectator video stream on a display screen of the HHC. In various embodiments, the HMD1405renders the spectator video stream on a display screen of the HMD1405.
With reference toFIG. 15, a diagram illustrating components of an HMD1502is shown. The HMD1502is an example of the HMD1405(FIG. 14). The HMD1502includes a processor1500for executing program instructions. A memory device1502is provided for storage purposes. Examples of the memory device1502include a volatile memory, a non-volatile memory, or a combination thereof. A display device1504is included which provides a visual interface, e.g., display of image frames generated from save data, etc., that the user1(FIG. 1) views. A battery1506is provided as a power source for the HMD1502. A motion detection module1508includes any of various kinds of motion sensitive hardware, such as a magnetometer1510, an accelerometer1512, and a gyroscope1514.
An accelerometer is a device for measuring acceleration and gravity induced reaction forces. Single and multiple axis models are available to detect magnitude and direction of the acceleration in different directions. The accelerometer is used to sense inclination, vibration, and shock. In one embodiment, three accelerometers1512are used to provide the direction of gravity, which gives an absolute reference for two angles, e.g., world-space pitch and world-space roll, etc.
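The gravity-based absolute reference for pitch and roll described above can be sketched directly from a static accelerometer reading. The axis convention (x forward, y right, z down, accelerations in units of g) is an assumption for illustration:

```python
import math

def pitch_roll_from_gravity(ax: float, ay: float, az: float):
    """Estimate world-space pitch and roll (radians) from a static
    accelerometer measurement of the gravity vector. Valid only when
    the device is not otherwise accelerating."""
    pitch = math.atan2(-ax, math.hypot(ay, az))
    roll = math.atan2(ay, az)
    return pitch, roll
```

Note that gravity gives no information about yaw, which is why the magnetometer described next is needed as the third absolute reference.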
A magnetometer measures a strength and a direction of a magnetic field in a vicinity of the HMD1502. In some embodiments, three magnetometers1510are used within the HMD1502, ensuring an absolute reference for the world-space yaw angle. In various embodiments, the magnetometer is designed to span the earth's magnetic field, which is ±80 microtesla. Magnetometers are affected by metal, and provide a yaw measurement that is monotonic with actual yaw. In some embodiments, a magnetic field is warped due to metal in the real-world environment, which causes a warp in the yaw measurement. In various embodiments, this warp is calibrated using information from other sensors, e.g., the gyroscope1514, a camera1516, etc. In one embodiment, the accelerometer1512is used together with magnetometer1510to obtain the inclination and azimuth of the HMD1502.
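The combined use of the accelerometer1512and magnetometer1510to obtain azimuth is commonly implemented as tilt compensation: the magnetometer vector is rotated back into the horizontal plane using the pitch and roll from the accelerometer, and the heading is taken from the horizontal components. A standard formulation, sketched under an assumed axis convention:

```python
import math

def tilt_compensated_yaw(mx: float, my: float, mz: float,
                         pitch: float, roll: float) -> float:
    """Return yaw (radians) from a magnetometer reading, de-rotated by
    the pitch/roll estimated from the accelerometer. The axis
    convention matches the accelerometer sketch and is an assumption."""
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    return math.atan2(-yh, xh)
```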
A gyroscope is a device for measuring or maintaining orientation, based on the principles of angular momentum. In one embodiment, instead of the gyroscope1514, three gyroscopes provide information about movement across the respective axis (x, y and z) based on inertial sensing. The gyroscopes help in detecting fast rotations. However, the gyroscopes, in some embodiments, drift over time without the existence of an absolute reference. This triggers resetting the gyroscopes periodically, which can be done using other available information, such as positional/orientation determination based on visual tracking of an object, accelerometer, magnetometer, etc.
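One common way to realize the periodic correction described above is a complementary filter: the fast but drifting gyroscope integral dominates short-term motion, while the slow but absolute reference (accelerometer or magnetometer angle) continuously pulls the estimate back. A minimal sketch; the blend factor 0.98 is an illustrative assumption:

```python
def complementary_filter(angle: float, gyro_rate: float,
                         abs_angle: float, dt: float,
                         alpha: float = 0.98) -> float:
    """One filter step: integrate the gyro rate over dt, then blend
    toward the absolute reference angle to cancel long-term drift."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * abs_angle
```

Called once per sensor sample, the gyro term tracks fast rotations while the (1 - alpha) term slowly corrects accumulated drift, playing the role of the periodic reset.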
The camera1516is provided for capturing images and image streams of a real-world environment, e.g., room, cabin, natural environment, etc., surrounding any of the users1-3. In various embodiments, more than one camera is included in the HMD1502, including a camera that is rear-facing, e.g., directed away from the user1when the user1is viewing the display of the HMD1502, etc., and a camera that is front-facing, e.g., directed towards the user1when the user1is viewing the display of the HMD1502, etc. Additionally, in several embodiments, a depth camera1518is included in the HMD1502for sensing depth information of objects in the real-world environment.
The HMD1502includes speakers1520for providing audio output. Also, a microphone1522is included, in some embodiments, for capturing audio from the real-world environment, including sounds from an ambient environment, and speech made by the user1, etc. The HMD1502includes a tactile feedback module1524, e.g., a vibration device, etc., for providing tactile feedback to the user1. In one embodiment, the tactile feedback module1524is capable of causing movement and/or vibration of the HMD1502to provide tactile feedback to the user1.
LEDs1526are provided as visual indicators of statuses of the HMD1502. For example, an LED may indicate battery level, power on, etc. A card reader1528is provided to enable the HMD1502to read and write information to and from a memory card. A USB interface1530is included as one example of an interface for enabling connection of peripheral devices, or connection to other devices, such as other portable devices, computers, etc. In various embodiments of the HMD1502, any of various kinds of interfaces may be included to enable greater connectivity of the HMD1502.
A Wi-Fi™ module1532is included for enabling connection to the Internet via wireless networking technologies. Also, the HMD1502includes a Bluetooth™ module1534for enabling wireless connection to other devices. A communications link1536is also included, in some embodiments, for connection to other devices. In one embodiment, the communications link1536utilizes infrared transmission for wireless communication. In other embodiments, the communications link1536utilizes any of various wireless or wired transmission protocols for communication with other devices.
Input buttons/sensors1538are included to provide an input interface for the user1(FIG. 1). Any of various kinds of input interfaces are included, such as buttons, touchpad, joystick, trackball, etc. An ultra-sonic communication module1540is included, in various embodiments, in the HMD1502for facilitating communication with other devices via ultra-sonic technologies.
Bio-sensors1542are included to enable detection of physiological data from the user1. In one embodiment, the bio-sensors1542include one or more dry electrodes for detecting bio-electric signals of the user1through the user1's skin.
The foregoing components of the HMD1502have been described as merely exemplary components that may be included in the HMD1502. In various embodiments, the HMD1502includes or does not include some of the various aforementioned components.
FIG. 16illustrates an embodiment of an Information Service Provider (INSP) architecture. The INSP1602delivers a multitude of information services to users, such as the user1, who are geographically dispersed and connected via a computer network1606, e.g., a LAN, a WAN, or a combination thereof, etc. The computer network1102(FIG. 11A) is an example of the computer network1606. An example of the WAN includes the Internet and an example of the LAN includes an Intranet. The user1operates a client device1620-1, another user2operates another client device1620-2, and yet another user3operates yet another client device1620-3. The client device1620-1is an example of the client device602(FIG. 6).
In some embodiments, each client device1620-1,1620-2, and1620-3includes a central processing unit (CPU), a display, and an input/output (I/O) interface. Examples of each client device1620-1,1620-2, and1620-3include a personal computer (PC), a mobile phone, a netbook, a tablet, a gaming system, a personal digital assistant (PDA), the game console1400and a display device, the HMD1502(FIG. 15), the game console1400and the HMD1502, a desktop computer, a laptop computer, and a smart television, etc. In some embodiments, the INSP1602recognizes a type of a client device and adjusts a communication method employed.
In some embodiments, an INSP delivers one type of service, such as stock price updates, or a variety of services such as broadcast media, news, sports, gaming, etc. Additionally, the services offered by each INSP are dynamic, that is, services can be added or taken away at any point in time. Thus, an INSP providing a particular type of service to a particular individual can change over time. For example, the client device1620-1is served by an INSP in near proximity to the client device1620-1while the client device1620-1is in a home town of the user1, and the client device1620-1is served by a different INSP when the user1travels to a different city. The home-town INSP will transfer requested information and data to the new INSP, such that the information “follows” the client device1620-1to the new city, making the data closer to the client device1620-1and easier to access. In various embodiments, a master-server relationship is established between a master INSP, which manages the information for the client device1620-1, and a server INSP that interfaces directly with the client device1620-1under control from the master INSP. In some embodiments, data is transferred from one INSP to another INSP as the client device1620-1moves around the world so that the INSP in a better position to service the client device1620-1is the one that delivers these services.
The INSP1602includes an Application Service Provider (ASP)1608, which provides computer-based services to customers over the computer network1606. Software offered using an ASP model is also sometimes called on-demand software or software as a service (SaaS). A simple form of providing access to a computer-based service, e.g., customer relationship management, etc., is by using a standard protocol, e.g., a hypertext transfer protocol (HTTP), etc. The application software resides on a vendor's server and is accessed by each client device1620-1,1620-2, and1620-3through a web browser using a hypertext markup language (HTML), etc., by a special purpose client software provided by the vendor, and/or other remote interface, e.g., a thin client, etc.
Services delivered over a wide geographical area often use cloud computing. Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the computer network1606. The users1-3do not need to be experts in the technology infrastructure in the “cloud” that supports them. Cloud computing is divided, in some embodiments, in different services, such as Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). Cloud computing services often provide common business applications online that are accessed from a web browser, while the software and data are stored on the servers. The term cloud is used as a metaphor for the computer network1606, e.g., using servers, storage and logic, etc., based on how the computer network1606is depicted in computer network diagrams and is an abstraction for the complex infrastructure it conceals.
Further, the INSP1602includes a game processing provider (GPP)1610, also sometime referred to herein as a game processing server, which is used by the client devices1620-1,1620-2, and1620-3to play single and multiplayer video games. Most video games played over the computer network1606operate via a connection to a game server. Typically, games use a dedicated server application that collects data from the client devices1620-1,1620-2, and1620-3and distributes it to other clients that are operated by other users. This is more efficient and effective than a peer-to-peer arrangement, but a separate server is used to host the server application. In some embodiments, the GPP1610establishes communication between the client devices1620-1,1620-2, and1620-3, which exchange information without further relying on the centralized GPP1610.
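The collect-and-distribute pattern of the dedicated server application described above can be sketched as follows. The class name and in-memory structure are illustrative assumptions, not the patent's design; a real GPP would add networking, authority checks, and persistence.

```python
class DedicatedGameServer:
    """Minimal sketch of a dedicated server application: each client's
    update is applied to the authoritative game state and queued for
    every other connected client (never echoed back to the sender)."""

    def __init__(self):
        self.state = {}     # authoritative game state
        self.outboxes = {}  # per-client outgoing message queues

    def connect(self, client_id: str):
        self.outboxes[client_id] = []

    def receive(self, client_id: str, update: dict):
        self.state.update(update)              # apply to authoritative state
        for other, queue in self.outboxes.items():
            if other != client_id:             # distribute to other clients only
                queue.append(update)
```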
Dedicated GPPs are servers which run independently of a client. Such servers are usually run on dedicated hardware located in data centers, providing more bandwidth and dedicated processing power. Dedicated servers are a method of hosting game servers for most PC-based multiplayer games. Massively multiplayer online games run on dedicated servers usually hosted by the software company that owns the game title, allowing them to control and update content.
A broadcast processing server (BPS)1612, sometimes referred to herein as a broadcast processing provider, distributes audio or video signals to an audience. Broadcasting to a very narrow range of audience is sometimes called narrowcasting. A final leg of broadcast distribution is how a signal gets to the client devices1620-1,1620-2, and1620-3, and the signal, in some embodiments, is distributed over the air as with a radio station or a television station to an antenna and receiver, or through a cable television or cable radio or “wireless cable” via the station. The computer network1606also brings, in various embodiments, either radio or television signals to the client devices1620-1,1620-2, and1620-3, especially with multicasting allowing the signals and bandwidth to be shared. Historically, broadcasts are delimited, in several embodiments, by a geographic region, e.g., national broadcasts, regional broadcasts, etc. However, with the proliferation of high-speed Internet, broadcasts are not defined by geographies as content can reach almost any country in the world.
A storage service provider (SSP)1614provides computer storage space and related management services. The SSP1614also offers periodic backup and archiving. By offering storage as a service, the client devices1620-1,1620-2, and1620-3use more storage compared to when storage is not used as a service. Another major advantage is that the SSP1614includes backup services and the client devices1620-1,1620-2, and1620-3will not lose data if their hard drives fail. Further, a plurality of SSPs, in some embodiments, have total or partial copies of the data received from the client devices1620-1,1620-2, and1620-3, allowing the client devices1620-1,1620-2, and1620-3to access data in an efficient way independently of where the client devices1620-1,1620-2, and1620-3are located or of types of the clients. For example, the user1accesses personal files via a home computer, as well as via a mobile phone while the user1is on the move.
A communications provider1616provides connectivity to the client devices1620-1,1620-2, and1620-3. One kind of the communications provider1616is an Internet service provider (ISP) which offers access to the computer network1606. The ISP connects the client devices1620-1,1620-2, and1620-3using a data transmission technology appropriate for delivering Internet Protocol datagrams, such as dial-up, digital subscriber line (DSL), cable modem, fiber, wireless or dedicated high-speed interconnects. The communications provider1616also provides, in some embodiments, messaging services, such as e-mail, instant messaging, and short message service (SMS) texting. Another type of a communications provider is a network service provider (NSP), which sells bandwidth or network access by providing direct backbone access to the computer network1606. Examples of network service providers include telecommunications companies, data carriers, wireless communications providers, Internet service providers, cable television operators offering high-speed Internet access, etc.
A data exchange1618interconnects the several modules inside the INSP1602and connects these modules to the client devices1620-1,1620-2, and1620-3via the computer network1606. The data exchange1618covers, in various embodiments, a small area where all the modules of the INSP1602are in close proximity, or covers a large geographic area when the different modules are geographically dispersed. For example, the data exchange1618includes a fast Gigabit Ethernet within a cabinet of a data center, or an intercontinental virtual LAN.
In some embodiments, communication between the server system and the client devices1620-1through1620-3may be facilitated using wireless technologies. Such technologies may include, for example, 5G wireless communication technologies. 5G is the fifth generation of cellular network technology. 5G networks are digital cellular networks, in which the service area covered by providers is divided into small geographical areas called cells. Analog signals representing sounds and images are digitized in the telephone, converted by an analog-to-digital converter and transmitted as a stream of bits. All the 5G wireless devices in a cell communicate by radio waves with a local antenna array and low power automated transceiver (transmitter and receiver) in the cell, over frequency channels assigned by the transceiver from a pool of frequencies that are reused in other cells. The local antennas are connected with the telephone network and the Internet by a high bandwidth optical fiber or wireless backhaul connection. As in other cell networks, a mobile device crossing from one cell to another is automatically transferred to the new cell. It should be understood that 5G networks are just an example type of communication network, and embodiments of the disclosure may utilize earlier generation wireless or wired communication, as well as later generation wired or wireless technologies that come after 5G.
It should be noted that in various embodiments, one or more features of some embodiments described herein are combined with one or more features of one or more of the remaining embodiments described herein.
Embodiments described in the present disclosure may be practiced with various computer system configurations including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers and the like. In one implementation, the embodiments described in the present disclosure are practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a wire-based or wireless network.
With the above embodiments in mind, it should be understood that, in one implementation, the embodiments described in the present disclosure employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Any of the operations described herein that form part of the embodiments described in the present disclosure are useful machine operations. Some embodiments described in the present disclosure also relate to a device or an apparatus for performing these operations. The apparatus is specially constructed for the required purpose, or the apparatus is a general-purpose computer selectively activated or configured by a computer program stored in the computer. In particular, in one embodiment, various general-purpose machines are used with computer programs written in accordance with the teachings herein, or it may be more convenient to construct a more specialized apparatus to perform the required operations.
In an implementation, some embodiments described in the present disclosure are embodied as computer-readable code on a computer-readable medium. The computer-readable medium is any data storage device that stores data, which is thereafter read by a computer system. Examples of the computer-readable medium include a hard drive, a network-attached storage (NAS), a ROM, a RAM, a compact disc ROM (CD-ROM), a CD-recordable (CD-R), a CD-rewritable (CD-RW), a magnetic tape, an optical data storage device, a non-optical data storage device, etc. As an example, a computer-readable medium includes computer-readable tangible medium distributed over a network-coupled computer system so that the computer-readable code is stored and executed in a distributed fashion.
Moreover, although some of the above-described embodiments are described with respect to a gaming environment, in some embodiments, instead of a game, other environments, e.g., a video conferencing environment, etc., are used.
Although the method operations were described in a specific order, it should be understood that other housekeeping operations may be performed in between operations, or operations may be adjusted so that they occur at slightly different times, or may be distributed in a system which allows the occurrence of the processing operations at various intervals associated with the processing, as long as the processing of the overlay operations is performed in the desired way.
Although the foregoing embodiments described in the present disclosure have been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications can be practiced within the scope of the appended claims. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the embodiments are not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.
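The claimed pipeline, tagging a below-threshold gameplay event, generating a recording for a window of time around it, processing game telemetry to recover the progression of interactive actions, and generating overlay hints for playback, can be illustrated with a minimal sketch. The patent discloses no source code; every name, data shape, and threshold value below (e.g., `SKILL_THRESHOLD`, `TelemetrySample`, the window lengths) is an illustrative assumption added for exposition only.

```python
from dataclasses import dataclass
from typing import List, Optional

# Assumed parameters (not specified in the patent): a normalized skill
# score threshold and the recording window kept around the tagged event.
SKILL_THRESHOLD = 0.5
WINDOW_BEFORE = 5.0  # seconds of recording kept before the event
WINDOW_AFTER = 2.0   # seconds of recording kept after the event

@dataclass
class TelemetrySample:
    t: float            # timestamp in seconds
    action: str         # interactive action, e.g. "aim", "strafe"
    skill_score: float  # hypothetical per-action skill estimate in [0, 1]

@dataclass
class CoachingOverlay:
    frame_times: List[float]  # frames of the recording that get overlay content
    hints: List[str]          # hints for raising skill above the threshold

def tag_gameplay_event(samples: List[TelemetrySample]) -> Optional[TelemetrySample]:
    """Identify a gameplay event tagged as falling below the skill threshold
    (here: the lowest-scoring below-threshold sample)."""
    below = [s for s in samples if s.skill_score < SKILL_THRESHOLD]
    return min(below, key=lambda s: s.skill_score) if below else None

def record_window(samples: List[TelemetrySample],
                  event: TelemetrySample) -> List[TelemetrySample]:
    """Generate the recording for a window of time around the tagged event."""
    lo, hi = event.t - WINDOW_BEFORE, event.t + WINDOW_AFTER
    return [s for s in samples if lo <= s.t <= hi]

def build_overlay(window: List[TelemetrySample],
                  event: TelemetrySample) -> CoachingOverlay:
    """Process telemetry to identify the progression of interactive actions
    before the event, and emit overlay hints for the self-coaching playback."""
    progression = [s for s in window if s.t < event.t]
    hints = [f"At t={s.t:.1f}s, '{s.action}' scored {s.skill_score:.2f}; "
             f"aim for {SKILL_THRESHOLD:.2f}+"
             for s in progression if s.skill_score < SKILL_THRESHOLD]
    return CoachingOverlay(frame_times=[s.t for s in progression], hints=hints)

# Usage with fabricated telemetry: the "shoot" action is the tagged event,
# and the earlier below-threshold "strafe" action yields one overlay hint.
samples = [
    TelemetrySample(0.0, "aim", 0.90),
    TelemetrySample(2.0, "strafe", 0.35),
    TelemetrySample(4.0, "shoot", 0.10),
    TelemetrySample(5.0, "reload", 0.80),
]
event = tag_gameplay_event(samples)
overlay = build_overlay(record_window(samples, event), event)
```

The sketch mirrors the structure of claim 1 only: event identification, windowed recording, telemetry processing, then overlay generation; a real implementation would operate on image frames and game state rather than scalar scores.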
Claims
- A method for processing a self-coaching interface, comprising: identifying a gameplay event during gameplay by a user, wherein the gameplay event is tagged as falling below a skill threshold while a game is being played by the user; generating a recording for a window of time for the gameplay event; processing game telemetry for the recording of the gameplay event, wherein the game telemetry is used to identify a progression of interactive actions before the gameplay event for the window of time; and generating overlay content in the self-coaching interface, wherein the overlay content is applied to one or more image frames of the recording when viewed via the self-coaching interface, the overlay content appearing in the one or more image frames during a playback of the recording, wherein the overlay content provides hints for increasing a skill of the user to be above the skill threshold for the tagged gameplay event.
- The method of claim 1, wherein the window of time includes a time period before the gameplay event, a time of the gameplay event, and a time period after the gameplay event.
- The method of claim 1, wherein the window of time changes with the gameplay event and other gameplay events occurring before the gameplay event.
- The method of claim 1, wherein the game telemetry is processed to identify a reason for the skill of the user to fall below the skill threshold.
- The method of claim 4, wherein the game telemetry includes one or more game states that are processed to identify a virtual object that is not in the recording for the time window.
- The method of claim 1, wherein the progression of interactive actions includes a movement of a virtual object that is not in the recording for the time window.
- The method of claim 6, wherein the overlay content includes the virtual object.
- The method of claim 6, wherein the overlay content includes a frame that is placed around the virtual object.
- The method of claim 1, further comprising: generating a timeline scrubber; receiving a selection of a segment of the timeline scrubber; and modifying the self-coaching interface and the overlay content based on the selection of the segment.
- The method of claim 1, further comprising generating another self-coaching interface for display with the self-coaching interface, wherein the other self-coaching interface is associated with game play of another user and is of the same type as the gameplay event.
- The method of claim 1, further comprising generating another self-coaching interface for display with the self-coaching interface, wherein the other self-coaching interface is associated with a different gaming session of game play of the user and is of the same type as the gameplay event.
- The method of claim 1, further comprising training an artificial intelligence model to determine whether to initiate a coaching session in which the overlay content is overlaid on the self-coaching interface.
- The method of claim 1, wherein the self-coaching interface excludes one or more virtual objects that are displayed in a virtual scene during occurrence of the gameplay event.
- The method of claim 1, wherein the self-coaching interface includes one or more virtual objects that are not displayed in a virtual scene during occurrence of the gameplay event.
- A server system for processing a self-coaching interface, comprising: a processor configured to: identify a gameplay event during gameplay by a user, wherein the gameplay event is tagged as falling below a skill threshold while a game is being played by the user; generate a recording for a window of time for the gameplay event; process game telemetry for the recording of the gameplay event, wherein the game telemetry is used to identify a progression of interactive actions before the gameplay event for the window of time; and generate overlay content in the self-coaching interface, wherein the overlay content is applied to one or more image frames of the recording when viewed via the self-coaching interface, the overlay content appearing in the one or more image frames during a playback of the recording, wherein the overlay content provides hints for increasing a skill of the user to be above the skill threshold for the tagged gameplay event; and a memory device coupled to the processor for storing the recording of the gameplay event.
- The server system of claim 15, wherein the processor is configured to generate another self-coaching interface for display with the self-coaching interface, wherein the other self-coaching interface is associated with game play of another user and is related to the gameplay event.
- The server system of claim 15, wherein the processor is configured to generate another self-coaching interface for display with the self-coaching interface, wherein the other self-coaching interface is associated with a different gaming session of game play of the user and with the gameplay event.
- The server system of claim 15, wherein the window of time changes with the gameplay event and other gameplay events occurring before the gameplay event.
- A system for processing a self-coaching interface, comprising: a client device configured to be used by a user to facilitate generation of a gameplay event while playing a game; a server coupled to the client device via a computer network, wherein the server is configured to: identify the gameplay event, wherein the gameplay event is tagged as falling below a skill threshold while a game is being played by the user; generate a recording for a window of time for the gameplay event; process game telemetry for the recording of the gameplay event, wherein the game telemetry is used to identify a progression of interactive actions before the gameplay event for the window of time; and generate overlay content in the self-coaching interface, wherein the overlay content is applied to one or more image frames of the recording when viewed via the self-coaching interface, the overlay content appearing in the one or more image frames during a playback of the recording, wherein the overlay content provides hints for increasing a skill of the user to be above the skill threshold for the tagged gameplay event.
- The system of claim 19, wherein the server is configured to generate another self-coaching interface for display with the self-coaching interface, wherein the other self-coaching interface is associated with game play of another user and is related to the gameplay event.