U.S. Pat. No. 11,701,590

Player-Tracking Video Game

Assignee: Meta Platforms Technologies, LLC

Issue Date: December 15, 2021

Illustrative Figure

Abstract

A rhythm-based video game (“game”) is disclosed. In the game, a player slashes blocks representing musical beats using a pair of energy blades resembling a lightsaber. A gaming console renders multiple digital objects, e.g., digital blocks, digital mines and digital obstacles, that are approaching a player in a virtual space. The gaming console also renders a digital representation of an instrument, e.g., a lightsaber (“digital saber”), using which the player slashes, cuts or otherwise interacts with the digital blocks to cause a digital collision between the digital saber and the digital blocks. The player can score by slashing the digital blocks, not hitting the digital mines and avoiding the digital obstacles. The game presents the player with a stream of approaching digital objects in synchronization with music, e.g., a song's beats, being played in the game. The pace at which the digital blocks approach the player increases with the beats.

Description


DETAILED DESCRIPTION

The disclosure is related to a video game (“game”). In the game a player interacts with digital objects that approach the user in a 3D corridor. Interaction occurs via digital collision between a digital element controlled by the player and the digital objects. Control of the digital element is based on body tracking. In some embodiments, body tracking is performed via a worn or held peripheral that tracks its own movement relative to some other reference point. In some embodiments, a depth camera or room-mapping camera (e.g., PlayStation Camera, Microsoft Kinect, Leap Motion, or equivalents) captures video of the player and uses computer vision techniques to identify body positions of the user. The game presents the player with a stream of approaching digital objects and the user causes the digital element to collide with the digital objects based on use of body tracking. Digital collisions with different types of digital objects and from different angles cause a variety of game actions to occur. In some embodiments, the game further tracks the motion of the user's body and shifts a player location in game corresponding to body movements. Movement of the player location enables the player to avoid digital obstacles.

An embodiment of the disclosure is related to a rhythm-based video game. In the rhythm-based video game a player slashes blocks representing musical beats using one or a pair of energy blades resembling a lightsaber (the digital element). A gaming console renders multiple digital objects, e.g., a digital representation of a block (“digital block” or “block”), that are moving in a specified direction, e.g., in a direction towards a user or player. The gaming console also renders a digital representation of an instrument, e.g., a digital representation of a sword resembling a lightsaber (“digital saber”), using which the player slashes, cuts or otherwise interacts with the digital blocks to cause a digital collision between the digital saber and the digital blocks. The game presents the player with a stream of approaching digital blocks in synchronization with beats of music, e.g., a song's beats and notes, being played in the game. As the beat picks up in the music, the pace at which the digital blocks approach the player can increase.
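The patent does not prescribe how block pace is derived from the beat. A minimal Python sketch of one possible tempo-to-speed mapping (the base speed, reference tempo, and linear scaling are all assumptions for illustration):

```python
# Hypothetical beat-synchronized pacing: blocks approach faster as the
# music's tempo rises. Constants are illustrative, not from the patent.

BASE_SPEED = 5.0       # approach speed (units/second) at the reference tempo
REFERENCE_BPM = 120.0  # tempo at which blocks move at BASE_SPEED

def block_speed(current_bpm: float) -> float:
    """Scale the approach speed of digital blocks with the song's tempo."""
    return BASE_SPEED * (current_bpm / REFERENCE_BPM)
```

A faster section of the song thus directly yields faster-approaching blocks, matching the described synchronization between beat and pace.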

A game action occurs in response to the digital collision. The game action can be any of an increase or decrease in score of the player, an increase or decrease in energy of the player, a gain or loss of life of the player in the game, an increase or decrease in a rate at which the score changes, an increase or decrease in the pace at which the blocks move towards the player, etc. The game can end based on multiple factors, such as after a specified time period, when the player runs out of energy or lives, or when the player issues a command to stop the game. The video game can be implemented as a two-dimensional (2D) video game, a three-dimensional (3D) video game, a virtual reality (VR) game, or an augmented reality (AR) game. In some embodiments, the gaming console is configured to implement the video game as a VR game.

FIG. 1 is a block diagram of an environment 100 in which the rhythm-based video game can be implemented. The environment 100 includes a gaming console 105 which executes a rhythm-based video game, such as the one described above. In some embodiments, the gaming console 105 can be a computing device having a processor and memory, and the processor executes instructions stored in the memory to present the game to a player 110 on a display device 115. The display device 115 supports 2D and/or 3D rendering of the game. In some embodiments, the player 110 may have to wear 3D glasses (not illustrated) to experience the game in 3D. The gaming console 105 supports a VR implementation of the game. In the VR implementation, an apparatus such as a headset 120 may have to be used by the player 110 to experience the game in VR.

The headset 120 is a head-mounted device, which is used to track the orientation or position of a body or head of the player 110. The headset 120 has one or more display devices that present the game in VR. The headset 120 can also have one or more sensors that are used to determine and transmit co-ordinates of the position of the player 110 to the gaming console 105. Examples of such sensors include gyroscopes, accelerometers, structured light systems, depth sensing cameras, magnetic position sensors, and eye tracking sensors. Sensors can be located in one or more locations, e.g., integrated with the headset 120, worn by the player 110 anywhere on the body, integrated with a motion controller 125, or part of other equipment worn by the player 110. The gaming console 105 establishes the position of the player 110 in a 3D virtual space by translating the co-ordinates received from the headset 120 to co-ordinates in the 3D virtual space. The co-ordinates received from the headset 120 can also help in determining different positions or actions of the player 110, e.g., whether the player 110 is sitting, standing, ducking, jumping, moving, etc. The headset 120 may include a microphone to receive any audio input from the player 110 or the surroundings of the player 110. The headset 120 may include one or more speakers that output audio to the player 110, such as the song being played in the game. The headset 120 can communicate with the gaming console 105 wirelessly or using wired means.
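The translation of tracked co-ordinates into the virtual space, and the inference of actions such as ducking, are not detailed in the text. A simplified Python sketch (real systems use full 6-DOF pose matrices; the per-axis scale/offset mapping and the 70% head-height heuristic here are assumptions):

```python
# Illustrative mapping of real-world sensor co-ordinates into the game's
# 3D virtual space, plus a coarse posture classifier. The calibration
# model (uniform scale + origin offset) is a simplifying assumption.

def to_virtual_space(sensor_xyz, scale=1.0, origin=(0.0, 0.0, 0.0)):
    """Translate tracked real-world co-ordinates into virtual-space co-ordinates."""
    return tuple(scale * c + o for c, o in zip(sensor_xyz, origin))

def classify_posture(head_height: float, standing_height: float) -> str:
    """Infer whether the player is ducking from tracked head height."""
    return "ducking" if head_height < 0.7 * standing_height else "standing"
```

The same mapping would apply to co-ordinates reported by a motion controller, letting the console place the digital saber relative to the player's digital position.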

The environment 100 also includes a hand-held or hand-worn apparatus such as a motion controller 125, which is used to track an orientation, position and movement of the hand of the player 110. The motion controller 125 includes one or more sensors, e.g., such as the ones mentioned above, that track the orientation, position and motion of the hand of the player 110 (examples include an Oculus Touch, a Samsung Gear controller, a PlayStation Move, or a Nintendo Switch controller). In some embodiments, the motion controller includes a magnetic position sensor that senses a position of the motion controller 125 in relation to a non-hand-held peripheral, e.g., headset 120. The motion controller 125 transmits the co-ordinates of the hand and/or the movement of the hand to the gaming console 105, which then translates the movement to the 3D virtual space. The motion controller 125 can also include a haptic feedback mechanism that provides haptic feedback, e.g., when the player 110 slashes the digital block. In some embodiments, the environment 100 may include more than one motion controller 125, e.g., a pair of motion controllers. The player 110 can hold one motion controller in one hand and the other in the other hand. In some embodiments, a single motion controller 125 is held in both hands. In a two-player game one player can hold one motion controller and the other player can hold the other motion controller. The motion controller 125 can be of any shape, size or dimension that is suitable to be held in the hand of a player. The motion controller 125 can communicate with the gaming console 105 wirelessly or using wired means. The motion controller 125 can also communicate with other devices, such as headset 120, wirelessly or using wired means.

In the rhythm-based video game, the gaming console 105 establishes a 3D virtual space, such as the 3D virtual space 205 of FIG. 2. The 3D virtual space 205 includes a position of the player 110, e.g., which is determined based on the co-ordinates of the position of the player 110 obtained from the headset 120. The 3D virtual space 205 includes a proximate end 210 that is proximate to the player 110 and a distal end 215 opposite to the proximate end 210. A portion of the proximate end 210 corresponds to the position of the player 110. In some embodiments, the player may calibrate the controller or the virtual space 205 to themselves. Calibration may be performed by the hardware, or manually by the player.

The gaming console 105 renders multiple digital objects, e.g., a digital block 220, that are approaching the player 110 from the distal end 215. In some embodiments, the digital block 220 may appear bigger in size as it approaches the proximate end 210. The gaming console 105 also renders a digital representation of an instrument, e.g., a digital saber, using which the player 110 can slash, cut or otherwise interact with the digital block 220 to cause a game action to occur in the game. The game presents the player 110 with a stream of the digital blocks in synchronization with beats of music, e.g., a song's beats and notes, being played in the game. As the beat picks up in the music, the pace at which the digital blocks approach the player 110 can increase.

In the VR implementation, the motion controller 125 can be a VR based motion controller, which is represented as a digital saber in the 3D virtual space 205. The player 110 uses a pair of VR motion controllers to wield a pair of digital lightsabers, e.g., a first digital saber 230 and a second digital saber 235, in the 3D virtual space 205 to slash the digital blocks. The digital blocks can be of various types, e.g., a first type and a second type, which the player 110 may interact with using the two different digital sabers. A specific type of digital blocks should be interacted with using a specified digital saber. In some embodiments, the first type of digital blocks can be of a first color and may have to be interacted with using a digital saber of the corresponding color, and the second type of digital blocks can be of a second color and may have to be interacted with using a digital saber of the corresponding color. For example, each digital block is colored red or blue to indicate whether the red or blue digital saber should be used to slash it.

In some embodiments, each of the digital blocks is marked with a direction indicator 225, which indicates the direction to slash through the digital block. For example, a direction indicator 225 such as an arrow can indicate one of eight possible directions to slash through the digital block 220. In another example, a direction indicator 225 such as a dot can indicate that the player 110 may slash through the digital block 220 in any direction. When a digital block is slashed by a digital saber, the digital block is destroyed, and a score is awarded based on one or more factors, e.g., timing accuracy and physical positioning of the cut.
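Checking a slash against a direction indicator reduces to comparing the swing vector with the indicated direction. A Python sketch of this test (the 45-degree tolerance and the direction names are assumptions; the eight arrows and the "any direction" dot follow the text):

```python
import math

# Hypothetical direction-indicator check: an arrow accepts only swings
# near its direction; a dot accepts any direction. Tolerance is assumed.

DIRECTIONS = {  # unit vectors for the eight arrow indicators (screen plane)
    "up": (0.0, 1.0), "down": (0.0, -1.0),
    "left": (-1.0, 0.0), "right": (1.0, 0.0),
    "up_left": (-0.7071, 0.7071), "up_right": (0.7071, 0.7071),
    "down_left": (-0.7071, -0.7071), "down_right": (0.7071, -0.7071),
}

def slash_matches(indicator: str, swing: tuple) -> bool:
    """Return True if the swing direction satisfies the block's indicator."""
    if indicator == "dot":  # dot: any slash direction counts
        return True
    ix, iy = DIRECTIONS[indicator]
    norm = math.hypot(*swing)
    if norm == 0:
        return False
    sx, sy = swing[0] / norm, swing[1] / norm
    # Accept swings within ~45 degrees of the indicated direction (assumed).
    return sx * ix + sy * iy > math.cos(math.radians(45))
```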

The game can also present digital objects other than digital blocks, which the player 110 should not hit. For example, the game can present a digital object such as a “digital mine” 240 that the player 110 should not hit. In another example, the game can present obstacles such as a digital representation of an oncoming wall (“digital obstacle”) 245 that the player 110 or the head of the player 110 should avoid. The player 110 can avoid the digital obstacle 245 approaching the proximate end 210 by moving out of the path of the digital obstacle 245, which can be done by stepping to the right or left of the digital obstacle 245 or by ducking below the digital obstacle 245. For example, if the player 110 steps to the right (in the real world) of the oncoming digital obstacle 245, the gaming console 105 shifts at least a portion of the 3D virtual space 205 to the right of the player 110 so that the digital obstacle ends up passing through the left of the player 110 at the proximate end 210.

The sensors worn by the player 110, e.g., in the headset 120, motion controller 125 or elsewhere, can transmit the co-ordinates of the player 110, a portion of the body of the player 110, such as a head of the player 110, movements of the player 110, or movements of a portion of the body of the player 110 to the gaming console 105. The gaming console 105 translates the received co-ordinates to co-ordinates in the 3D virtual space 205 and determines the action of the player 110, e.g., whether there was a digital collision between a digital block and a digital saber, whether there was a digital collision between a digital mine and a digital saber, whether there was a digital collision between a digital obstacle and any portion of the body of the player 110 (which corresponds to a portion of the 3D virtual space in the proximate end 210), whether the player 110 moved out of the path of the approaching digital obstacles, etc., which result in a specified game action.

A game action can occur in response to a digital collision between a digital saber and a digital object. The game action can be any of an increase or decrease in score of the player, an increase or decrease in energy of the player 110, a gain or loss of life of the player 110 in the game, an increase or decrease in a rate at which the score changes, an increase or decrease in the pace at which the digital objects are created or move towards the player 110, etc. Different types of game actions can occur in response to different events. For example, a score of the player 110 can increase in response to a digital collision between a digital block of a specified color and the digital saber of the specified color. In another example, a score of the player 110 can decrease in response to a digital collision between a digital block of a specified color and a digital saber of a color other than the specified color. In another example, a score of the player 110 may increase, or increase by an amount above a threshold, in response to a digital collision between a digital block and a digital saber in which a contact angle of the digital saber with the digital block is consistent with the direction indicated on the digital block.

In another example, a score of the player 110 may not increase, or may increase by an amount below a threshold, in response to a digital collision between a digital block and a digital saber in which a contact angle of the digital saber with the digital block is not consistent with the direction indicated on the digital block. In another example, a score of the player 110 may not increase, or may increase by an amount below a threshold, in response to a digital collision between a digital block and a digital saber whose collision impact is below a specified threshold.

In some embodiments, the collision impact can be measured as a function of how hard, fast or strong the player 110 swings the motion controller 125 to slash the digital block 220. In another example, a score, energy or life of the player 110 can decrease in response to a digital collision between a digital mine and a digital saber. In another example, a score, energy or life of the player 110 can decrease in response to a digital collision between a digital obstacle and a digital saber or the player 110. In some embodiments, a game action can also occur if there is no digital collision for a specified duration. For example, if the player 110 does not slash through any of the digital blocks for a duration exceeding a specified threshold, a score, energy or life of the player 110 can decrease, or the rate at which the score, energy or life increases can be decreased. The game can be configured to calculate the score, energy, or a life of the player 110 using various factors, including the above specified factors.
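The text leaves the impact function open ("how hard, fast or strong" the swing is). One minimal interpretation in Python, where impact is the controller's swing speed and a weak swing earns reduced points (the threshold and point values are assumptions):

```python
# Illustrative collision-impact scoring. The patent only states that impact
# is a function of swing strength; this speed threshold model is assumed.

MIN_IMPACT_SPEED = 2.0  # m/s; below this, the cut earns reduced points

def collision_impact(controller_speed: float) -> float:
    """Impact magnitude modeled simply as the swing speed of the controller."""
    return max(0.0, controller_speed)

def score_for_cut(controller_speed: float, base_points: int = 100) -> int:
    """Award full points only when the collision impact clears the threshold."""
    if collision_impact(controller_speed) < MIN_IMPACT_SPEED:
        return base_points // 2  # weak swing: amount below the full award
    return base_points
```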

The game can end based on one or more factors, such as after a specified time period, when the player runs out of energy or lives, when the player 110 has completed one or more levels of the game, when the music stops playing, or when the player issues a command to stop the game.

As mentioned above, the game can be implemented as a 2D game, a 3D game, a VR game, or an AR game. The entities of the environment 100 can be adjusted, adapted or configured for a specific implementation. For example, while the environment 100 is described as providing the VR gaming experience through the headset 120, in some embodiments, the VR gaming experience can also be created through specially designed rooms with multiple large screens.

In some embodiments, some of the entities of the environment 100 may have to be calibrated before being able to play the game to obtain the co-ordinates of the position of the player 110. For example, a setup process in the game may ask the player 110 to turn the headset 120 and perform some gestures, such as moving from left to right, right to left, ducking, jumping, or talking. In another example, the setup process in the game may ask the player 110 to move the motion controller in each hand to perform a gesture, such as raising the motion controller or swinging the hand with the motion controller to hit a digital object displayed on the display device 115 or the headset 120. Once the calibration is complete, the gaming console 105 establishes the 3D virtual space 205, after which the player 110 can proceed to play the game.

In some embodiments, the user is able to customize their game experience. Examples include changing the graphical representation of the digital element (the digital sabers) 230/235. The digital sabers 230/235 may change color or change in graphical design through the use of various “skins.” The sabers 230/235 may also change in shape or character, causing the manner in which the user causes digital collisions to shift. In some embodiments, a player avatar is displayed to the user. The player avatar is customizable using skins and different digital models. In some embodiments, the user is able to generate gameplay by attaching a “beat map” to an audio file. A beat map includes data describing each digital object 220, 240, 245 that is generated in the 3D virtual space 205, at what point in the audio file the objects 220, 240, 245 are generated, the speed of the objects 220, 240, 245, the type/color of each object 220, 240, 245, the directionality 225 of each object 220, 240, 245, and a position and vector in the 3D virtual space 205 of each object 220, 240, 245. Given a beat map and a corresponding audio file, any song can be played in the game. A digital distribution system may also provide packs or groups of beat maps and audio files to play with the game.
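The fields the text enumerates for a beat map suggest a simple record per spawned object. A Python sketch of one possible schema (the field names and types are assumptions, not the patent's format):

```python
# One possible shape for a beat map entry, following the enumerated fields:
# spawn time in the audio file, object type/color, direction indicator,
# position, travel vector, and speed. Schema is illustrative only.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class BeatMapEvent:
    time_s: float                        # point in the audio file when spawned
    object_type: str                     # "block", "mine", or "obstacle"
    color: Optional[str]                 # block color; None for mines/walls
    direction: Optional[str]             # e.g., "down" or "dot"; None if n/a
    position: Tuple[float, float, float] # spawn position in the virtual space
    vector: Tuple[float, float, float]   # travel direction down the corridor
    speed: float                         # approach speed

def events_before(beat_map: List[BeatMapEvent], t: float) -> List[BeatMapEvent]:
    """Return the events that should have spawned by playback time t."""
    return [e for e in beat_map if e.time_s <= t]
```

Pairing such a list with any audio file is what lets any song be played in the game, as the paragraph describes.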

FIG. 3 is a flow diagram of a process 300 for presenting the game to a player, consistent with various embodiments. In some embodiments, the process 300 can be implemented in the environment 100 of FIG. 1. The player 110 can complete the setup process to calibrate the motion controller and the headset 120. In some embodiments, the player 110 may need to complete the calibration only once per session, e.g., when the gaming console 105 is powered on and prior to playing a first game, and need not calibrate the devices again unless the gaming console 105 is powered off and powered on again. In some embodiments, the devices may have to be calibrated when the player changes. In some embodiments, the devices may have to be calibrated when there are environmental changes around the player 110, e.g., a change in intensity of light in the room where the player 110 plays the game, a change in distance between the player 110 and the gaming console 105, or a change in the position of the furniture in the room where the game is played. After the devices are calibrated, the gaming console 105 will have the necessary co-ordinates to establish the 3D virtual space in which the game is played.

At block 305, the gaming console 105 renders the 3D virtual space 205 in which the game is to be played. The 3D virtual space 205 includes a digital position of the player 110.

At block 310, the gaming console 105 renders a digital element, e.g., a digital saber, in the 3D virtual space 205 relative to the digital position of the player 110. The co-ordinates and the orientation of the digital saber relative to the digital position correspond to an orientation and the co-ordinates of the physical hand of the player 110 relative to the physical body of the player 110. The gaming console 105 obtains the orientation and the co-ordinates of the physical hand of the player 110 relative to the physical body of the player 110 using the motion controller 125 held in the hand of the player 110 and the headset 120 worn by the player 110.

At block 315, the gaming console 105 renders multiple digital objects that approach the digital position of the player 110 from a distance in the 3D virtual space. The digital objects can include digital blocks that the player 110 should slash using the digital element. In some embodiments, the digital objects can include digital mines that the player 110 should not hit, and digital obstacles that the player 110 should avoid.

At block 320, the gaming console 105 causes a game action to occur in response to a digital collision between the digital element and one or more of the digital objects. The different types of game actions that can occur are described at least with reference to FIG. 1.

FIG. 4 is a flow diagram of a process 400 for presenting the game to a player, consistent with various embodiments. In some embodiments, the process 400 can be implemented in the environment 100 of FIG. 1. The process 400 assumes that the headset 120 and the motion controllers are calibrated. At block 405, the gaming console 105 renders a 3D virtual space in which the game is to be played. The 3D virtual space 205 includes a proximate end 210 that is proximate to the player 110 and a distal end 215 opposite to the proximate end 210. In some embodiments, a portion of the proximate end 210 corresponds to the digital position of the player 110.

At block 410, the gaming console 105 renders a digital representation of an instrument to be used by the player 110, e.g., the first digital saber 230, to play the game. The co-ordinates and the orientation of the first digital saber 230 relative to the digital position of the player 110 correspond to an orientation and the co-ordinates of the physical hand of the player 110 relative to the physical body of the player 110.

At block 415, the gaming console 105 instantiates multiple digital blocks in the 3D virtual space 205.

At block 420, the gaming console 105 associates each of the digital blocks with a direction indicator. In some embodiments, the direction indicator indicates the direction in which the digital block is to be slashed by the player 110.

At block 425, the gaming console 105 plays an audio file having a musical beat, e.g., a song.

At block 430, the gaming console 105 causes the multiple digital objects to travel from the distal end 215 to the proximate end 210 in the 3D virtual space 205 in synchronization with the musical beats. For example, the rate at which the digital blocks are created or the pace at which the digital blocks approach the proximate end 210 depends on the musical beats. As the beat picks up in the song, the pace at which the digital blocks approach the player can increase.
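One way to synchronize travel with the beats is to spawn each block early enough at the distal end that it arrives at the proximate end exactly on its beat. A Python sketch under that assumption (constant tempo and corridor length are simplifications):

```python
# Hypothetical beat-synchronized spawning: subtract the corridor travel time
# from each beat time so blocks reach the player on the beat. Illustrative.

def beat_times(bpm: float, n_beats: int, offset_s: float = 0.0):
    """Times (seconds) of the first n beats of a song at a constant tempo."""
    return [offset_s + i * 60.0 / bpm for i in range(n_beats)]

def spawn_times(bpm: float, n_beats: int, corridor_length: float,
                block_speed: float):
    """Spawn each block early enough that it arrives at the player on its beat."""
    travel = corridor_length / block_speed  # seconds from distal to proximate end
    return [t - travel for t in beat_times(bpm, n_beats)]
```

A negative spawn time simply means the block would need to spawn before playback starts, which in practice is handled by an audio lead-in.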

At block 435, the gaming console 105 causes a game action to occur based on an interaction between the first digital saber 230 and one or more of the digital blocks. The different types of game actions that can occur are described at least with reference to FIG. 1.

FIG. 5 is a flow diagram of a process 500 for presenting the game to a player, consistent with various embodiments. In some embodiments, the process 500 can be implemented in the environment 100 of FIG. 1. The process 500 assumes that the headset 120 and the motion controllers are calibrated. At block 505, the gaming console 105 renders a 3D virtual space in which the game is to be played. The 3D virtual space 205 includes a proximate end 210 that is proximate to the player 110 and a distal end 215 opposite to the proximate end 210. In some embodiments, a portion of the proximate end 210 corresponds to the digital position of the player 110.

At block 510, the gaming console 105 renders a digital representation of instruments to be used by the player 110, e.g., the first digital saber 230 and the second digital saber 235, to play the game. For example, the first digital saber 230 can correspond to the motion controller held by the player 110 in the left hand and the second digital saber 235 can correspond to the motion controller held by the player 110 in the right hand. The co-ordinates and the orientation of the digital sabers relative to the digital position of the player 110 correspond to an orientation and the co-ordinates of the physical hands of the player 110 relative to the physical body of the player 110. The digital sabers can have different characteristics. For example, the first digital saber 230 can be a red colored saber and the second digital saber 235 can be a blue colored saber.

At block 515, the gaming console 105 renders multiple digital objects traveling in the 3D virtual space 205 from the distal end 215 to the proximate end 210. The digital blocks can include two different sets of blocks. In some embodiments, the first set of digital blocks can be of a first color and the second set of digital blocks can be of a second color. For example, each digital block is colored red or blue to indicate whether the red or blue digital saber should be used to slash it.

At determination block 520, the gaming console determines whether there was an interaction, e.g., digital collision, between the red digital blocks and the red saber 230 or between the blue digital blocks and the blue saber 235.

If the gaming console 105 determines that at least one of the conditions in block 520 is true, at block 525, the gaming console 105 causes a first type of game action, and the control is transferred to block 515. For example, the first type of game action can be to increase a score of the player 110 in response to a digital collision between a digital block and the digital saber of the same color.

At determination block 530, the gaming console determines whether there was an interaction, e.g., digital collision, between the red digital blocks and the blue saber 235 or between the blue digital blocks and the red saber 230.

If the gaming console 105 determines that at least one of the conditions in block 530 is true, at block 535, the gaming console 105 causes a second type of game action, and the control is transferred to block 515. For example, the second type of game action can be not to increase the score, or to decrease the score of the player 110, in response to a digital collision between a digital block and a digital saber of different colors.
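The determinations of blocks 520-535 amount to a single color-match rule. A Python sketch of that rule (the specific point values, and the choice to subtract points on a cross-color cut, are assumptions; the text only requires distinct first and second game actions):

```python
# Illustrative resolution of blocks 520-535: a same-color collision raises
# the score (first game action); a cross-color collision does not, and here
# subtracts points (second game action). Point values are assumed.

def resolve_collision(block_color: str, saber_color: str, score: int) -> int:
    """Apply the first or second type of game action for a color collision."""
    if block_color == saber_color:   # block 520 true -> block 525
        return score + 100
    return max(0, score - 50)        # block 530 true -> block 535
```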

FIG. 6 is a flow diagram of a process 600 for presenting the game to a player, consistent with various embodiments. In some embodiments, the process 600 can be implemented in the environment 100 of FIG. 1. At block 605, the gaming console 105 obtains an orientation and co-ordinates of the player 110, and of a hand and head of the player 110, at least from the headset 120 and the motion controller 125.

At block 610, the gaming console 105 renders a 3D virtual space in which the game is to be played. The 3D virtual space 205 is generated based on the co-ordinates received from the headset 120 and the motion controller 125 associated with the player 110. The 3D virtual space 205 includes a proximate end 210 that is proximate to the player 110 and a distal end 215 opposite to the proximate end 210. In some embodiments, a portion of the proximate end 210 corresponds to the digital position of the player 110.

At block 615, the gaming console 105 renders a digital representation of instruments to be used by the player 110, e.g., the first digital saber 230 and the second digital saber 235, at the proximate end 210. Each digital saber is responsive to the movements of the hand of the player 110 holding the corresponding motion controller 125. The co-ordinates and the orientation of the digital sabers relative to the digital position of the player 110 correspond to an orientation and the co-ordinates of the physical hands of the player 110 relative to the physical body of the player 110.

At block 620, the gaming console 105 renders multiple digital objects traveling in the 3D virtual space 205 from the distal end 215 to the proximate end 210. The digital objects can be of different types. For example, a first type can include digital blocks that the player 110 should slash using the digital saber, a second type can include digital mines that the player 110 should not hit, and a third type can include digital obstacles that the player 110 should avoid.

At block 625, the gaming console 105 causes interaction with the digital objects based on the movement of the hand, head and/or body of the player 110.

At determination block 630, the gaming console 105 determines whether there was an interaction, e.g., digital collision, between a digital block and the digital saber. If yes, at block 635, the gaming console 105 causes a first type of game action. For example, the first type of game action can be to increase a score of the player 110 in response to a digital collision between a digital block and the digital saber. The control is transferred to block 625.

At determination block 640, the gaming console 105 determines whether there was an interaction, e.g., digital collision, between the digital saber and a digital mine. If yes, at block 645, the gaming console 105 causes a second type of game action. For example, the second type of game action can be not to increase the score or to decrease the score/energy/life of the player 110 in response to the digital collision between a digital mine and the digital saber. The control is transferred to block 625.

At block 650, the gaming console 105 causes the digital obstacle to change the direction of travel based on actual movements of the player 110 or movements of the head of the player 110.

At determination block 655, the gaming console 105 determines whether the digital obstacle passes through the digital position of the player 110 at the proximate end 210. If yes, at block 660, the gaming console 105 causes a third type of game action. For example, the third type of game action can be to decrease the score/energy/life of the player 110. The control is transferred to block 625.
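The three branches of the flow above (determination blocks 630, 640, and 655) can be sketched as a dispatch on object type; only the branch structure follows the described flow, while the point and penalty values are illustrative assumptions.

```python
def resolve_interaction(object_type, state):
    """Apply the game action for an interaction with a digital object.

    state: a dict with 'score' and 'life' keys (a hypothetical model of the
    player's score/energy/life; concrete values below are not from the patent).
    """
    if object_type == "block":       # block 635: first type of game action
        state["score"] += 100        # e.g., increase the player's score
    elif object_type == "mine":      # block 645: second type of game action
        state["life"] -= 10          # e.g., decrease score/energy/life
    elif object_type == "obstacle":  # block 660: third type of game action
        state["life"] -= 25          # obstacle passed through the player
    return state
```

After each branch, control returns to the interaction step (block 625), which in a game loop corresponds to simply processing the next detected collision.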

FIG. 7 illustrates variable game actions based on angle and position of incidence between a digital element 730 and a digital object 720. The pictured digital object 720 includes a direction indicator 725. In the illustrated example, the digital element 730 is elongated and extends from a control point at the user's hand (as a sword would).

The direction indicator 725 indicates the game's ideal collision direction between the digital element 730 and the digital object 720. Causing a collision using a swing direction 735 that corresponds to the direction indicator 725 results in a positive game action (e.g., rewarding of points), whereas causing a collision at a different direction causes a different game action (e.g., issuing a fault, ending a combo streak, rewarding fewer points than the positive game action, subtracting points).
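One common way to test whether a swing "corresponds" to the direction indicator is to compare the angle between the swing vector and the indicated direction; a minimal 2D sketch, where the 60-degree tolerance is an illustrative assumption rather than a value from the patent:

```python
import math

def swing_matches_indicator(swing, indicator, tolerance_deg=60.0):
    """Return True when the swing direction is within tolerance_deg of the
    direction the indicator points. Both arguments are 2D vectors; the
    tolerance is a hypothetical tuning value."""
    dot = swing[0] * indicator[0] + swing[1] * indicator[1]
    norms = math.hypot(*swing) * math.hypot(*indicator)
    # Clamp to guard against floating-point drift before acos.
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norms))))
    return angle <= tolerance_deg
```

A downward indicator `(0, -1)` is then matched by a downward or diagonal swing but not by a purely horizontal one, which would instead trigger the different game action.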

In some embodiments, the angle of incidence and/or a digital element rotation 740 may affect the type of game action occurring from a collision. Incidence angles 745, 750 and 755 illustrate regions where an incoming digital element 730 might collide with the digital object 720. In some embodiments, the game action occurring from a collision is more positive for the user when the collision has an incidence angle 745, 750, 755 closest to on-center (e.g., straight through the center "mass" of the digital object 720). For example, the collision may be worth more points depending on where a user strikes the digital object 720 with the digital element 730.

The incidence angles 745, 750, 755 may be used as a region (e.g., plus or minus 0-30 degrees from center) or as an absolute measurement (e.g., exactly 37 degrees right or left of center), where a collision at 0 degrees from center is worth the most points. In some embodiments, instead of, or in addition to, incidence angles, entry and exit surfaces are used. Where a collision begins and ends on opposite sides of a digital object 720, the digital element 730 took a relatively straight path through the digital object 720. Taking a straight path through the digital object 720 may provide users with a more positive game action (e.g., more points) than a collision that enters and exits through adjacent sides of the digital object 720. Illustrative of the relatively straight path described above is a collision "cut" that begins on surface 760 and exits through surface 765. A collision that does not take a straight path begins at surface 760 and exits through surface 770, or, alternatively, begins at surface 770 and exits through surface 765. The non-straight path collisions may render less positive game actions than straight path collisions.
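Both scoring ideas above — rewarding cuts closest to on-center, and rewarding cuts that enter and exit through opposite surfaces — can be sketched as follows. The linear point falloff, the maximum point value, and the surface names are illustrative assumptions.

```python
# Hypothetical surface naming for a block-shaped digital object.
OPPOSITE_SURFACES = {"top": "bottom", "bottom": "top",
                     "left": "right", "right": "left"}

def incidence_points(off_center_deg, max_points=100):
    """More points the closer the cut is to on-center: full points at
    0 degrees from center, falling off linearly to zero at 90 degrees.
    The linear falloff shape is a hypothetical choice."""
    off = min(abs(off_center_deg), 90.0)
    return round(max_points * (1.0 - off / 90.0))

def is_straight_path(entry_surface, exit_surface):
    """A cut entering and exiting on opposite sides took a relatively
    straight path through the digital object; adjacent sides did not."""
    return OPPOSITE_SURFACES.get(entry_surface) == exit_surface
```

Under this sketch, a cut entering the top and exiting the bottom is a straight path, while top-to-left is not and would yield the less positive game action.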

In some embodiments, the digital element rotation 740 further influences game actions. The digital element rotation 740 is controlled by the user's motion controller or hand gestures. Where the user rotates within the applicable control scheme, the digital element 730 rotates in game in a corresponding fashion. In some embodiments, digital elements 730 have varied surfaces. Examples of varied surfaces include the bladed edge of a sword and the flat of a sword. In this example, where a user strikes the digital object 720 with a bladed edge (e.g., a cut), a different game action results than if the user strikes the digital object 720 with the flat side (e.g., a slap). In some embodiments, a cut renders a more positive game action than a slap.
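Distinguishing a cut from a slap can be done by comparing the element's motion direction against the normal of its flat side: motion along the normal means the flat leads (a slap), motion perpendicular to it means the edge leads (a cut). The threshold angle and vector model below are illustrative assumptions.

```python
import math

def classify_strike(motion, flat_normal, cut_threshold_deg=60.0):
    """Classify a strike as a 'cut' (edge leads) or a 'slap' (flat leads).

    motion: the digital element's velocity direction at impact.
    flat_normal: the outward normal of the element's flat side.
    Both are 3D vectors; names and threshold are hypothetical.
    """
    dot = abs(sum(a * b for a, b in zip(motion, flat_normal)))
    norms = math.hypot(*motion) * math.hypot(*flat_normal)
    angle = math.degrees(math.acos(max(0.0, min(1.0, dot / norms))))
    # Motion nearly perpendicular to the flat's normal => the edge leads.
    return "cut" if angle >= cut_threshold_deg else "slap"
```

A cut would then route to the more positive game action and a slap to the less positive one, per the embodiment above.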

FIGS. 8-16 illustrate screenshots of various graphical user interfaces (GUIs) of the game, which are generated by the gaming console 105 of FIG. 1.

FIG. 8 is a screenshot of a player playing the game, consistent with various embodiments. In FIG. 8, the player is holding a pair of motion controllers in the hands, which are depicted as digital sabers in the 3D virtual space of the game. FIG. 8 also illustrates the player slashing a pair of digital blocks.

FIG. 9 is a screenshot of a player view of the 3D virtual space generated by the gaming console, consistent with various embodiments.

FIG. 10 is a screenshot of a graphical user interface (GUI) with instructions to the player, consistent with various embodiments. In FIG. 10, the GUI instructs the player to move their hands so that the digital sabers cut the digital blocks in the indicated direction.

FIG. 11 is a screenshot of a GUI with instructions to the player, consistent with various embodiments. In FIG. 11, the GUI indicates to the player that some of the digital objects presented in the 3D virtual space are meant to be avoided, e.g., by not cutting or slashing the objects.

FIG. 12A is a screenshot of a GUI with instructions to the player, consistent with various embodiments. In FIG. 12A, the GUI indicates to the player that some of the digital objects presented in the 3D virtual space are to be avoided by moving the player's location, as opposed to avoiding them using the digital sabers. For example, the player can duck, crouch or move to avoid some of the digital objects.

FIG. 12B is a screenshot of a GUI with instructions to the player, consistent with various embodiments. In FIG. 12B, the GUI indicates to the player that some of the digital objects presented in the 3D virtual space are to be avoided by ducking or crouching.

FIG. 13 is a screenshot of a GUI with instructions to the player, consistent with various embodiments. In FIG. 13, the GUI indicates to the player that the player can gain points by cutting the correct digital objects and in the correct manner.

FIG. 14 is a screenshot of a GUI in which the player can select various options, consistent with various embodiments. In FIG. 14, the GUI presents the player with various songs to select from and a difficulty level of the game. Note that the digital objects are presented in the 3D virtual space based on the beats in the music.
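Presenting digital objects based on the beats implies spawning each object early enough to travel the corridor and arrive at the player on its beat; a minimal scheduling sketch, with hypothetical names:

```python
def spawn_schedule(beat_times, travel_time):
    """For each beat timestamp (seconds into the song), return when the
    corresponding digital object must be spawned at the distal end so that
    it reaches the player's proximate end exactly on the beat.

    travel_time: how long an object takes to traverse the corridor; this
    single-constant model is an illustrative assumption.
    """
    return [max(0.0, beat - travel_time) for beat in beat_times]
```

With a 1.5-second corridor, beats at 2.0 s and 3.0 s require spawns at 0.5 s and 1.5 s; shortening the travel time is one way a game can effectively raise the pace at higher difficulty.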

FIG. 15 is a screenshot of a GUI showing multiple digital objects approaching the player in the 3D virtual space, consistent with various embodiments.

FIG. 16 is a screenshot of a GUI showing multiple digital objects approaching the player in the 3D virtual space, consistent with various embodiments. In FIG. 16, the game presents digital blocks and digital obstacles simultaneously. The player has to slash the digital blocks but avoid the digital obstacles.

FIG. 17 is a block diagram of a computer system as may be used to implement features of some embodiments of the disclosed technology. The computing system 1700 may be used to implement any of the entities, components, modules, interfaces, or services depicted in the foregoing figures (and in this specification). The computing system 1700 may include one or more central processing units ("processors") 1705, memory 1710, input/output devices 1725 (e.g., keyboard and pointing devices, display devices), storage devices 1720 (e.g., disk drives), and network adapters 1730 (e.g., network interfaces) that are connected to an interconnect 1715. The interconnect 1715 is illustrated as an abstraction that represents any one or more separate physical buses, point-to-point connections, or both, connected by appropriate bridges, adapters, or controllers. The interconnect 1715, therefore, may include, for example, a system bus, a Peripheral Component Interconnect (PCI) bus or PCI-Express bus, a HyperTransport or industry standard architecture (ISA) bus, a small computer system interface (SCSI) bus, a universal serial bus (USB), IIC (I2C) bus, or an Institute of Electrical and Electronics Engineers (IEEE) standard 1394 bus, also called "Firewire".

The memory 1710 and storage devices 1720 are computer-readable storage media that may store instructions that implement at least portions of the described technology. In addition, the data structures and message structures may be stored or transmitted via a data transmission medium, such as a signal on a communications link. Various communications links may be used, such as the Internet, a local area network, a wide area network, or a point-to-point dial-up connection. Thus, computer-readable media can include computer-readable storage media (e.g., "non-transitory" media) and computer-readable transmission media.

The instructions stored in memory 1710 can be implemented as software and/or firmware to program the processor(s) 1705 to carry out actions described above. In some embodiments, such software or firmware may be initially provided to the computing system 1700 by downloading it from a remote system (e.g., via network adapter 1730).

The technology introduced herein can be implemented by, for example, programmable circuitry (e.g., one or more microprocessors) programmed with software and/or firmware, or entirely in special-purpose hardwired (non-programmable) circuitry, or in a combination of such forms. Special-purpose hardwired circuitry may be in the form of, for example, one or more ASICs, PLDs, FPGAs, etc.

Although the invention is described herein with reference to the preferred embodiment, one skilled in the art will readily appreciate that other applications may be substituted for those set forth herein without departing from the spirit and scope of the present invention. Accordingly, the invention should only be limited by the Claims included below.

Claims

  1. A computer-readable storage medium storing instructions that, when executed by a computing system, cause the computing system to perform a process comprising: instantiating a virtual space that includes a user location of a user; rendering a plurality of digital targets that approach the user location, wherein one or more digital targets, of the plurality of digital targets, display an indication of a predetermined angle or orientation for effecting a first action in response to contacting the one or more digital targets; and detecting contacts between the one or more digital targets and one or more digital interface elements controlled by the user, wherein control of the one or more digital interface elements is based on a position and orientation of a respective hand of the user, wherein detected contacts that A) are between the one or more digital targets and a first digital interface element, of the one or more digital interface elements, and B) that occur at the predetermined angle or orientation, that the one or more targets display by the indication of the predetermined angle or orientation, effect the first action.
  2. The computer-readable storage medium of claim 1, wherein detected contacts that A) are between the one or more digital targets and the first digital interface element, of the one or more digital interface elements, and B) that do not occur at the predetermined contact angle or orientation that the one or more targets display by the indication of the predetermined angle or orientation, effect a second action different from the first action.
  3. The computer-readable storage medium of claim 1, wherein the predetermined angle or orientation is based on a relative orientation of the first digital interface element as compared to a respective digital target of the one or more digital targets.
  4. The computer-readable storage medium of claim 1, wherein the one or more digital interface elements further include a second digital interface element; wherein a first subset of digital targets, of the one or more digital targets, are linked with the first digital interface element via graphical indicators and a second subset of digital targets, of the one or more digital targets, are linked with the second digital interface element via graphical indicators; and wherein detected contacts that effect the first action further require that C) a colliding digital interface element and a corresponding digital target are linked.
  5. A computing system comprising: one or more processors; and one or more memories storing instructions that, when executed by the one or more processors, cause the computing system to perform a process comprising: instantiating a virtual space that includes a user location of a user; rendering a plurality of digital targets that approach the user location, wherein one or more digital targets, of the plurality of digital targets, display an indication of a predetermined angle or orientation for effecting a first action in response to contacting the one or more digital targets; and detecting contacts between the one or more digital targets and one or more digital interface elements controlled by the user, wherein control of the one or more digital interface elements is based on a position and orientation of a respective hand of the user, wherein detected contacts that A) are between the one or more digital targets and a first digital interface element, of the one or more digital interface elements, and B) that occur at the predetermined angle or orientation, that the one or more targets display by the indication of the predetermined angle or orientation, effect the first action.
  6. The computing system of claim 5, wherein the first action is one of: increasing a game score; decreasing the game score; effecting a pace of the approach of the one or more digital targets; effecting a creation of the one or more digital targets; effecting a rate of change in the game score; or any combination thereof.
  7. The computing system of claim 5, wherein the process further comprises: playing a musical score including musical notes, wherein a pace of movement of the plurality of digital targets approaching the user location is based on the musical notes.
  8. The computing system of claim 7, wherein a pace of generation of the plurality of digital targets is based on the musical notes.
  9. A method comprising: instantiating a virtual space that includes a user location of a user and multiple digital interface elements, the multiple digital interface elements including a right-hand digital interface element and a left-hand digital interface element, wherein control of the multiple digital interface elements is based on a position and orientation of a respective pair of hands of the user; rendering a plurality of digital targets viewable from and moving in relation to the user location, wherein the plurality of digital targets each include a graphic indicator associating each respective digital target with one of the right-hand digital interface element or the left-hand digital interface element; detecting contact between one of the multiple digital interface elements controlled by the user and a first digital target, wherein the graphic indicator of the first digital target corresponds to a first digital interface element of the multiple digital interface elements; and performing one of: effecting a first action where the detected contact is between the first digital interface element that corresponds to the graphic indicator of the first digital target; or effecting a second action, different from the first action, where the detected contact is between a digital interface element other than the first digital interface element and does not correspond to the graphic indicator of the first digital target.
  10. The method of claim 9, wherein rendering of the plurality of digital targets is timed to a musical track.
  11. The method of claim 9, wherein the process further comprises: effecting the second action in response to the user failing to cause a hit on a given digital target of the plurality of digital targets within a predetermined period of time.
  12. A computing system comprising: one or more processors; and one or more memories storing instructions that, when executed by the one or more processors, cause the computing system to perform a process comprising: instantiating a virtual space that includes a user location of a user and multiple digital interface elements, the multiple digital interface elements including a right-hand digital interface element and a left-hand digital interface element, wherein control of the multiple digital interface elements is based on a position and orientation of a respective pair of hands of the user; rendering a plurality of digital targets viewable from and moving in relation to the user location, wherein the plurality of digital targets each include a graphic indicator associating each respective digital target with one of the right-hand digital interface element or the left-hand digital interface element; detecting contact between one of the multiple digital interface elements controlled by the user and a first digital target, wherein the graphic indicator of the first digital target corresponds to a first digital interface element of the multiple digital interface elements; and performing one of: effecting a first action where the detected contact is between the first digital interface element that corresponds to the graphic indicator of the first digital target; or effecting a second action, different from the first action, where the detected contact is between a digital interface element other than the first digital interface element and does not correspond to the graphic indicator of the first digital target.
  13. The computing system of claim 12, wherein the graphic indicator of the first digital target further indicates a rotational orientation associated with the multiple digital interface elements, and wherein effecting the first action further requires that the first digital interface element is rotationally oriented, during the detected hit, according to the graphic indicator of the first digital target.
  14. The computing system of claim 12, wherein the process further comprises: rendering additional digital objects in the virtual space in addition to the plurality of digital targets, wherein the additional digital objects include a different interaction scheme than the plurality of digital targets.
  15. The computing system of claim 12, wherein the first digital target includes a type criterion and the first action includes an effect style, wherein the effect style is based on the type criterion.
  16. A method comprising: instantiating a virtual space that includes a user location of a user and multiple digital interface elements ("elements") including a right element and a left element, and wherein control of the multiple digital interface elements is based on a position and orientation of a respective pair of hands of a user; rendering a plurality of digital targets that approach the user, timed to an audio track, wherein the plurality of digital targets each include a graphic indicator indicating an association between each respective digital target and one of the right element or the left element; detecting a connection, on a first digital target by the right element controlled by the user, by identifying a match with the right element, wherein the match is based on the association indicated by the graphic indicator of the first digital target; and effecting a first action in response to the detection of the connection.
  17. The method of claim 16, further comprising: detecting a second hit on a second digital target by the left element controlled by the user; and effecting the first action in response to the detection of the second hit, wherein the hit is a match with the left element, the match based on the association indicated by the graphic indicator of the second digital target.
  18. The method of claim 16, further comprising: detecting a second hit on a second digital target by a first element of the multiple elements controlled by the user; and effecting a second action in response to the detection of the second hit, wherein the second hit does not match with the first element, the lack of a match based on the association indicated by the graphic indicator of the second digital target.
  19. A non-transitory computer-readable storage medium storing instructions that, when executed by a computing system, cause the computing system to perform a process comprising: instantiating a virtual space that includes a user location of a user and multiple digital interface elements ("elements") including a right element and a left element, and wherein control of the multiple digital interface elements is based on a position and orientation of a respective pair of hands of a user; rendering a plurality of digital targets that approach the user, timed to an audio track, wherein the plurality of digital targets each include a graphic indicator indicating an association between each respective digital target and one of the right element or the left element; detecting a connection, on a first digital target by the right element controlled by the user, by identifying a match with the right element, wherein the match is based on the association indicated by the graphic indicator of the first digital target; and effecting a first action in response to the detection of the connection.
  20. The non-transitory computer-readable storage medium of claim 19, wherein the process further comprises: effecting the negative action in response to the user failing to cause a hit on a third digital target of the plurality of digital targets within a predetermined period of time of the third digital target approaching the user location.
