U.S. Pat. No. 8,894,484

MULTIPLAYER GAME INVITATION SYSTEM

Assignee: Microsoft Technology Licensing LLC

Issue Date: January 30, 2012


Abstract

A system and related methods for inviting a potential player to participate in a multiplayer game via a user head-mounted display device are provided. In one example, a potential player invitation program receives user voice data and determines that the user voice data is an invitation to participate in a multiplayer game. The program receives eye-tracking information, depth information, facial recognition information, potential player head-mounted display device information, and/or potential player voice data. The program associates the invitation with the potential player using the eye-tracking information, the depth information, the facial recognition information, the potential player head-mounted display device information, and/or the potential player voice data. The program matches a potential player account with the potential player. The program receives an acceptance response from the potential player, and joins the potential player account with a user account in participating in the multiplayer game.

Description


DETAILED DESCRIPTION

FIG. 1 shows a schematic view of one embodiment of a multiplayer game invitation system 10 for visually augmenting an appearance of an interaction environment as seen through a display in a head-mounted display (HMD) device. The multiplayer game invitation system 10 includes a potential player invitation program 14 that may be stored in mass storage 18 of a computing device 22. The potential player invitation program 14 may be loaded into memory 26 and executed by a processor 30 of the computing device 22 to perform one or more of the methods and processes described in more detail below.

In one example, the multiplayer game invitation system 10 may also include a multiplayer game program 34 that may be stored in mass storage 18 of the computing device 22. The multiplayer game program 34 may generate a game in the form of a video game, augmented or virtual reality game or experience, or any other suitable electronic game or experience that involves interaction among two or more participants. In another example, the multiplayer game program 34 may be stored remotely and may be accessed by the computing device 22 over a network 38 to which the computing device is operatively connected.

A user account 40 corresponding to a particular user may also be stored in the mass storage 18 of the computing device 22. The user account 40 may include, for example, a user's account information for one or more online games and/or gaming services. Such information may include, for example, games that the user has purchased, installed, and/or validated, as well as usage data regarding games the user has played. It will be appreciated that the user account 40 may also serve as an abstraction that is associated with the particular user. Such an abstraction may be further associated with a particular hardware device, such as the user head-mounted display (HMD) device 42 described in more detail below, upon proper authentication of the particular user via the hardware device.

A user profile 44 corresponding to a particular user may also be stored in the mass storage 18 of the computing device 22, either separately or as part of the user account 40. The user profile 44 may include, for example, a user name, a user ID, one or more game achievement levels corresponding to levels of proficiency attained by the user on certain games, game interests of the user, games purchased and/or available to the user, and/or any other user-related information that the user may choose to store in the user profile. It will also be appreciated that in other examples the user account 40 and/or user profile 44 may be stored remotely, such as on a remote server 46 accessed via network 38.

The computing device 22 may take the form of a desktop computing device, a mobile computing device such as a smart phone, laptop, notebook or tablet computer, network computer, home entertainment computer, interactive television, gaming system, or other suitable type of computing device. Additional details regarding the components and computing aspects of the computing device 22 are described in more detail below with reference to FIG. 6.

The computing device 22 may be operatively connected with the user HMD device 42 using a wired connection, or may employ a wireless connection via WiFi, Bluetooth, or any other suitable wireless communication protocol. Additionally, the example illustrated in FIG. 1 shows the computing device 22 as a separate component from the user HMD device 42. It will be appreciated that in other examples the computing device 22 may be integrated into the user HMD device 42.

The computing device 22 also may be operatively connected with one or more additional HMD devices, such as first potential player HMD device 48, second potential player HMD device 50, and third potential player HMD device 52. In one example, the computing device 22 may communicate with the first potential player HMD device 48, second potential player HMD device 50, and third potential player HMD device 52 via network 38. Network 38 may take the form of a local area network (LAN), wide area network (WAN), wired network, wireless network, personal area network, or a combination thereof, and may include the Internet. Additionally, a remote server such as server 46 may include a first potential player account 56 and one or more additional accounts associated with additional potential player(s).

FIG. 2 shows an example of an HMD device 200 in the form of a pair of wearable glasses that include a transparent display 54. In other examples, the HMD device 200 may take other suitable forms in which a transparent, semi-transparent or non-transparent display is supported in front of a viewer's eye or eyes. It will be appreciated that the user HMD device 42, first potential player HMD device 48, second potential player HMD device 50, and third potential player HMD device 52 may take the form of the HMD device 200, as described in more detail below, or any other suitable HMD device.

With reference to FIGS. 1 and 2, in this example the HMD device 200 includes a transparent display 54 that enables images to be delivered to the eyes of a user. The transparent display 54 may be configured to visually augment an appearance of a physical environment to a user viewing the physical environment through the transparent display. For example, the appearance of the physical environment may be augmented by graphical content (e.g., one or more pixels each having a respective color and brightness) that is presented via the transparent display 54.

The transparent display 54 may be configured to enable a user to view a real-world object in the physical environment through one or more partially transparent pixels that are displaying a virtual object representation. In one example, the transparent display 54 may include image-producing elements located within lenses 204 (such as, for example, a see-through Organic Light-Emitting Diode (OLED) display). As another example, the transparent display 54 may include a light modulator on an edge of the lenses 204. In this example, the lenses 204 may serve as a light guide for delivering light from the light modulator to the eyes of a user.

In other examples, transparent display 54 may support selective filtering of light received from the physical environment before reaching an eye of a user wearing the HMD device 200. Such filtering may be performed on a pixel-by-pixel basis or on groups of pixels. As one example, the selective filtering or removal of ambient light may be supported by the transparent display 54 at a different resolution (e.g., a lower resolution or a higher resolution) than the resolution supported by the transparent display for the presentation of lighted graphical content (e.g., illuminated pixels). In other examples, transparent display 54 may include a first display layer that adds light in the form of one or more illuminated pixels, and a second display layer that filters ambient light received from the physical environment. These layers may have different display resolutions, pixel densities, and/or display capabilities.

The HMD device 200 may also include various systems and sensors. For example, the HMD device 200 may include an eye-tracking system 58 that utilizes at least one inward facing sensor 208. The inward facing sensor 208 may be an image sensor that is configured to acquire image data in the form of eye-tracking information from a user's eyes. Provided the user has consented to the acquisition and use of this information, the eye-tracking system 58 may use this information to track the position and/or movement of the user's eyes. The eye-tracking system 58 may then determine where and/or at what person or object the user is looking. In another example, the inward facing sensor 208 may capture retinal scan information from a user's retina. Provided the user has consented to the acquisition and use of this information, such information may be used to identify the user wearing the HMD device 200.

The HMD device 200 may also include an optical sensor system 62 that utilizes at least one outward facing sensor 212, such as an optical sensor. Outward facing sensor 212 may detect movements within its field of view, such as gesture-based inputs or other movements performed by a user or by a person within the field of view. Outward facing sensor 212 may also capture image information, such as facial recognition information, and depth information from a physical environment and real-world objects within the environment. For example, outward facing sensor 212 may include a depth camera, a visible light camera, an infrared light camera, and/or a position tracking camera. In some examples, outward facing sensor 212 may include one or more optical sensors for observing visible spectrum and/or infrared light from the ambient lighting conditions in the physical environment.

In other examples, the HMD device 200 may include facial recognition capabilities via one or more still or video cameras. To detect a facial image of a person, the HMD device 200 and/or computing device 22 may use any suitable face detection technologies and/or algorithms including, but not limited to, local binary patterns (LBP), principal component analysis (PCA), independent component analysis (ICA), evolutionary pursuit (EP), Elastic Bunch Graph Matching (EBGM), or other suitable algorithm or combination of algorithms.
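As one concrete illustration of the texture features that an LBP-based recognizer builds on, the code for a single 3x3 pixel neighborhood can be sketched as follows (a minimal, illustrative sketch; the function name and patch layout are assumptions, and a real recognizer would histogram these codes over face regions rather than use one code directly):

```python
def lbp_code(patch):
    """Compute the local binary pattern (LBP) code for one 3x3 pixel patch.

    Each neighbour is compared with the centre pixel; neighbours at least as
    bright as the centre contribute a 1-bit, read clockwise from the
    top-left, forming an 8-bit texture code.
    """
    center = patch[1][1]
    # Clockwise neighbour order starting at the top-left corner.
    neighbours = [patch[0][0], patch[0][1], patch[0][2],
                  patch[1][2], patch[2][2], patch[2][1],
                  patch[2][0], patch[1][0]]
    code = 0
    for bit, value in enumerate(neighbours):
        if value >= center:
            code |= 1 << (7 - bit)
    return code
```

For example, a patch whose top row is brighter than its centre yields a code with only the first three bits set.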

As noted above, the HMD device 200 may include depth sensing via one or more depth cameras. Time-resolved images from one or more of these depth cameras may be registered to each other and/or to images from another optical sensor such as a visible spectrum camera, and may be combined to yield depth-resolved video.

In some examples, a depth camera may take the form of a structured light depth camera configured to project a structured infrared illumination comprising numerous, discrete features (e.g., lines or points). The depth camera may be configured to image the structured illumination reflected from a scene onto which the structured illumination is projected. A depth map of the scene may be constructed based on spacings between adjacent features in the various regions of an imaged scene.

In other examples, a depth camera may take the form of a time-of-flight depth camera configured to project a pulsed infrared illumination onto a scene. This depth camera may be configured to detect the pulsed illumination reflected from the scene. Two or more of these depth cameras may include electronic shutters synchronized to the pulsed illumination. The integration times for the two or more depth cameras may differ, such that a pixel-resolved time-of-flight of the pulsed illumination, from the source to the scene and then to the depth cameras, is discernible from the relative amounts of light received in corresponding pixels of the two depth cameras. The HMD device 200 may also include an infrared projector to assist in structured light and/or time-of-flight depth analysis.
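The shutter-ratio relationship described above can be illustrated with a toy single-pixel model (the function name, the charge-ratio formulation, and the assumption of an ideal rectangular pulse are all illustrative simplifications, not details from the patent):

```python
C = 299_792_458.0  # speed of light, m/s

def tof_depth(q_gated: float, q_full: float, pulse_width_s: float) -> float:
    """Estimate depth for one pixel of a pulsed time-of-flight camera.

    q_full  -- charge collected by a camera whose shutter spans the whole
               returning pulse (captures all reflected light)
    q_gated -- charge collected by a camera whose shutter opens late, so it
               sees only the tail of the pulse; the later the pulse returns,
               the larger the fraction of it that is captured
    The round-trip delay is proportional to the ratio of the two charges.
    """
    fraction = q_gated / q_full            # 0.0 (no delay) .. 1.0 (max delay)
    round_trip = fraction * pulse_width_s  # seconds
    return C * round_trip / 2.0            # one-way distance in metres
```

Here a gated charge equal to half the full-shutter charge implies a round-trip delay of half the pulse width, so a 10 ns pulse resolves to roughly 0.75 m of depth.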

In other examples, gesture-based and other motion inputs from the user and/or persons in the physical environment may also be detected via one or more depth cameras. For example, outward facing sensor 212 may include two or more optical sensors with known relative positions for creating depth images. Motion detected by these optical sensors may be used to update such depth images over time.

Outward facing sensor 212 may capture images of a physical environment, such as the interaction space 300 shown in FIG. 3, which may be provided as input to the potential player invitation program 14 and/or multiplayer game program 34. In one example, the multiplayer game program 34 may include a 3D modeling system that uses such input to generate a virtual environment that models the physical environment that is captured.

The HMD device 200 may also include a position sensor system 66 that utilizes one or more motion sensors 216 to enable position tracking and/or orientation sensing of the HMD device, and determine a position of the HMD device within a physical environment. As one example, position sensor system 66 may be configured as a six-axis or six-degree-of-freedom position sensor system. This example position sensor system may, for example, include three accelerometers and three gyroscopes to indicate or measure a change in location of the HMD device 200 within three-dimensional space along three orthogonal axes (e.g., x, y, z), and a change in an orientation of the HMD device about the three orthogonal axes (e.g., roll, pitch, yaw).
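A single dead-reckoning update from such a six-degree-of-freedom sensor set might be sketched as follows (a simplified illustration assuming world-frame acceleration with gravity already removed and small-angle Euler integration; real HMD tracking would use considerably more robust filtering):

```python
def integrate_pose(pos, vel, orient, accel, gyro, dt):
    """One dead-reckoning step for a six-degree-of-freedom sensor package.

    pos, vel -- (x, y, z) position in metres and velocity in m/s
    orient   -- (roll, pitch, yaw) in radians
    accel    -- accelerometer reading, m/s^2, assumed already in the world
                frame with gravity removed (an illustrative simplification)
    gyro     -- gyroscope reading, rad/s about the three axes
    dt       -- timestep in seconds
    """
    # Integrate angular rates into orientation (valid for small angles).
    new_orient = tuple(o + w * dt for o, w in zip(orient, gyro))
    # Integrate acceleration into velocity, then velocity into position.
    new_vel = tuple(v + a * dt for v, a in zip(vel, accel))
    new_pos = tuple(p + v * dt for p, v in zip(pos, new_vel))
    return new_pos, new_vel, new_orient
```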

Position sensor system 66 may support other suitable positioning techniques, such as GPS or other global navigation systems. For example, position sensor system 66 may include a wireless receiver (e.g., a GPS receiver or cellular receiver) to receive wireless signals broadcast from satellites and/or terrestrial base stations. These wireless signals may be used to identify a geographic location of the HMD device 200.

Positioning information obtained from wireless signals received by the HMD device 200 may be combined with positioning information obtained from the motion sensors 216 to provide an indication of location and/or orientation of the HMD device 200. While specific examples of position sensor systems have been described, it will be appreciated that other suitable position sensor systems may be used.

Motion sensors 216 may also be employed as user input devices, such that a user may interact with the HMD device 200 via gestures of the neck and head, or even of the body. Non-limiting examples of motion sensors include an accelerometer, a gyroscope, a compass, and an orientation sensor, which may be included as any combination or subcombination thereof.

The HMD device 200 may also include one or more microphones 70. In some examples, and as described in more detail below, microphones 70 may receive audio input from a user and/or audio input from one or more persons in a physical environment around the user. Such audio input may include, for example, commands, requests, casual speaking, singing, whistling, etc. Additionally or alternatively, one or more microphones separate from the HMD device 200 may be used to receive audio input.

In other examples, audio may be presented to the user via one or more speakers 74 on the HMD device 200. Such audio may include, for example, music, instructions, and/or other communication from the multiplayer game program 34, the potential player invitation program 14, or other sources.

In other examples, the HMD device 200 may also include a transceiver 78 for broadcasting wireless signals such as Wi-Fi signals, Bluetooth signals, etc., and receiving such signals from other devices. These wireless signals may be used, for example, to exchange data and/or create networks among devices.

It will be understood that the sensors and other components described above and illustrated in FIG. 2 are shown by way of example. These examples are not intended to be limiting in any manner, as any other suitable sensors, components, and/or combination of sensors and components may be utilized.

The HMD device 200 may also include a controller 220 having a logic subsystem and a data-holding subsystem, discussed in more detail below with respect to FIG. 6, that are in communication with the various input and output devices of the HMD device. Briefly, the data-holding subsystem may include instructions that are executable by the logic subsystem, for example, to receive and forward inputs from the sensors to computing device 22 (in unprocessed or processed form) via a communications subsystem, and to present images to the user via the transparent display 54.

It will be appreciated that the HMD device 200 described above is provided by way of example, and thus is not meant to be limiting. Therefore it is to be understood that the HMD device 200 may include additional and/or alternative sensors, cameras, microphones, input devices, output devices, etc. without departing from the scope of this disclosure. Further, the physical configuration of an HMD device 200 and its various sensors and subcomponents may take a variety of different forms without departing from the scope of this disclosure.

With reference now to FIGS. 3 and 4, a description of interactions among user 304, first potential player 308, second potential player 312, and third potential player 316 via one example of the multiplayer game invitation system 10 will now be provided. FIG. 3 is a schematic view of a user 304 standing in a physical interaction space 300, which in this example is an urban plaza bounded by a street 320, a building 324, and a fence 330.

With reference also to FIG. 1, the user 304 may wear the user HMD device 42 that includes the transparent display 54 through which the user may view the interaction space 300. Similarly, first potential player 308 may wear the first potential player HMD device 48, second potential player 312 may wear the second potential player HMD device 50, and third potential player 316 may wear the third potential player HMD device 52.

FIG. 4 is a schematic view of the interaction space 300 of FIG. 3 as seen through the transparent display 54 of the user HMD device 42 worn by the user 304. Using one or more of the sensors described above, the user HMD device 42 may receive optical information, position information, depth information, audio information, and/or other information corresponding to the interaction space 300, the user 304, first potential player 308, second potential player 312, and/or third potential player 316. As described above, the user HMD device 42 may also receive data from the network 38, first potential player HMD device 48, second potential player HMD device 50, and/or third potential player HMD device 52. As described in more detail below, the potential player invitation program 14 may use this information and/or data to aid the user 304 in inviting one or more potential players to participate in a multiplayer game with the user.

In one example, the user 304 may desire to invite the first potential player 308 to participate in a multiplayer game. The user 304 may look at the first potential player 308 and say, “Hey, wanna play a game?” The potential player invitation program 14 may receive the user's voice data via the microphone 70.

The potential player invitation program 14 may use speech recognition to analyze the user's voice data and determine that the voice data is an invitation to participate in a multiplayer game. For example, the potential player invitation program 14 may identify and interpret key words, such as “wanna”, “play” and “game,” that suggest an invitation from the user 304. In this manner, the user 304 may use natural, colloquial language to invite the first potential player 308 to play a game. It will be appreciated that other words and forms of a verbal invitation from the user 304 may be identified as an invitation (“Would you like to join me in a game?”, “Are you interested in playing virtual tennis?”, etc.).
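The key-word interpretation described above might be sketched as follows (the keyword set, threshold, and function name are illustrative assumptions; a production system would sit behind a full speech-recognition and intent pipeline rather than simple word matching):

```python
# Invitation-flavoured keywords; illustrative, not taken from the patent.
INVITE_KEYWORDS = {"play", "game", "wanna", "join"}

def is_invitation(utterance: str, min_hits: int = 2) -> bool:
    """Return True when a transcribed utterance looks like a game invitation.

    Counts how many invitation keywords appear after stripping punctuation;
    requiring two hits reduces false positives on casual mentions of games.
    """
    words = {w.strip("?,.!").lower() for w in utterance.split()}
    return len(words & INVITE_KEYWORDS) >= min_hits
```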

In one example, the user 304 may desire to invite the first potential player 308 to participate in an active multiplayer game in which the user 304 is currently participating. Advantageously, the potential player invitation program 14 enables the user 304 to use natural language and easily provide an invitation to the first potential player 308 with minimal interruption to the user in the active multiplayer game. Additionally, and as explained in more detail below, providing an invitation may trigger an exchange of data between the first potential player HMD device 48 and the user's multiplayer game invitation system 10 that may facilitate the user 304 and the first potential player 308 participating in a multiplayer game.

The user HMD device 42 may receive one or more of the various forms of input and information described above. Using such information and input, the potential player invitation program 14 may associate the user's invitation with an appropriate recipient. For example, the eye-tracking system 58 may provide eye-tracking information that shows that the user 304 is looking at the first potential player 308 as the user says, “Hey, would you like to play a game?” In another example, the optical sensor system 62 may provide depth information indicating that the first potential player 308 is closer to the user 304 than the second potential player 312 or the third potential player 316, when the user speaks the invitation. Similarly, the position sensor system 66 may provide position information indicating that the first potential player 308 is closer to the user 304 than the second potential player 312 or the third potential player 316.
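Combining such gaze and proximity cues to pick the invitation's recipient could look like the following sketch (the candidate representation, the cosine-alignment score, and the distance weighting are all illustrative choices, not the patent's method):

```python
import math

def associate_invitation(candidates, gaze_dir):
    """Pick the candidate the user is most likely inviting.

    candidates -- list of (name, position) tuples, positions in metres
                  relative to the user's HMD
    gaze_dir   -- unit vector of the user's gaze from eye tracking
    Scores each person by how closely they sit along the gaze ray (cosine of
    the angle), breaking ties toward nearer people.
    """
    def score(pos):
        dist = math.sqrt(sum(c * c for c in pos))
        unit = tuple(c / dist for c in pos)
        alignment = sum(g * u for g, u in zip(gaze_dir, unit))
        return alignment - 0.05 * dist  # prefer aligned, then nearby
    return max(candidates, key=lambda c: score(c[1]))[0]
```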

In another example, the optical sensor system 62 may use depth information to detect that the user 304 is gesturing toward the first potential player 308. For example, and with reference to FIGS. 3 and 4, the user 304 may point at the first potential player 308 with the user's right index finger 334 as the user speaks the invitation. A depth camera may provide depth information that shows that the user 304 is pointing at the first potential player 308.

It will be appreciated that the above examples of information and input, used individually or in various combinations, may be used to associate the user's invitation with the first potential player 308. It will also be appreciated that one or more other forms of information and/or input received by the user HMD device 42 may also be used to associate the invitation with the first potential player 308.

With the user's invitation associated with the first potential player 308, the potential player invitation program 14 may match the first potential player 308 to a potential player account, such as first potential player account 56. In one example, the potential player invitation program 14 may use facial recognition information received from the optical sensor system 62 in the user HMD device 42 to match the first potential player 308 with the first potential player account 56.

In another example, the potential player invitation program 14 may use information broadcast by the first potential player HMD device 48 to match the first potential player 308 with the first potential player account 56. For example, the first potential player 308 may choose to broadcast from the first potential player HMD device 48 data that allows selected other devices to link the first potential player 308 with the first potential player account 56. Such data may be received by the user HMD device 42 and used to match the first potential player 308 with the first potential player account 56.

In another example, the potential player invitation program 14 may use voice data received from the first potential player 308 to match the first potential player with the first potential player account 56. For example, the potential player invitation program 14 may use speech recognition to analyze speech patterns in the first potential player voice data, and match the first potential player 308 with the first potential player account 56.

The potential player invitation program 14 may also determine from the user account 40 what multiplayer games the user 304 has available to play. Similarly, the potential player invitation program 14 may determine from the first potential player account 56 what multiplayer games the first potential player 308 has available to play. Using this information, the potential player invitation program 14 may help the user 304 select an appropriate game to play with potential player 308.

In one example, the potential player invitation program 14 may display on the transparent display 54 of the user HMD device 42 a “Games Available” list 338 adjacent to the first potential player 308 that shows the multiplayer games that are available to both the user 304 and first potential player. In another example, the potential player invitation program 14 may provide additional information related to the available games that may be of interest to the user 304. For example, the “Games Available” list 338 may also include a game achievement level next to each game that indicates the highest level achieved by the user 304 or by the potential player 308 in each of the games. In other examples, the game achievement level may indicate the highest level achieved by both the potential player 308 and the user 304.
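Deriving such a shared “Games Available” list from the two accounts can be sketched as a simple set intersection (the game titles, levels, and the choice to display the higher of the two achievement levels are illustrative assumptions):

```python
def games_available(user_games, player_games):
    """Build the shared 'Games Available' list for two accounts.

    user_games / player_games map game titles to each account's highest
    achievement level. Returns the titles owned by both accounts, each
    paired with the higher of the two levels, sorted by title.
    """
    shared = user_games.keys() & player_games.keys()
    return sorted((title, max(user_games[title], player_games[title]))
                  for title in shared)
```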

In another example, the potential player invitation program 14 may provide audio through the speaker 74 that informs the user of the multiplayer games, and possibly related additional information, that are available to both the user 304 and the first potential player 308.

In another example and as discussed above, the potential player invitation program 14 may receive depth information that corresponds to surfaces, objects, and other forms within the interaction space 300. With reference to FIG. 3, an example of such depth information is shown as a measured depth 342 from the user HMD device 42 to the building 324 across the plaza from the user 304. In some examples, additional measured depths corresponding to other surfaces, objects, and other forms within interaction space 300 may be used to construct a virtual representation of the interaction space 300.

The potential player invitation program 14 may use one or more measured depths of the interaction space 300 to select a suggested multiplayer game that is physically compatible with the interaction space. For example, and with reference to FIG. 3, the potential player invitation program 14 may select games that are suitable for spaces bounded by a structure at a measured depth 342. Accordingly, the potential player invitation program 14 may select a Dance game, a Search And Find game, and a Tennis game that each may be comfortably played within the interaction space 300.
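Filtering candidate games against a measured depth might be sketched as follows (the per-game space requirements are invented for illustration; a real system could derive such values from game metadata):

```python
# Metres of clear space each game needs; values are illustrative only.
GAME_SPACE_REQUIREMENTS = {
    "Dance": 3.0,
    "Search And Find": 5.0,
    "Tennis": 8.0,
    "Golf": 30.0,
}

def suggest_games(available, measured_depth):
    """Keep only games whose required play space fits within the measured
    depth from the HMD to the nearest bounding structure; unknown games are
    conservatively excluded."""
    return [g for g in available
            if GAME_SPACE_REQUIREMENTS.get(g, float("inf")) <= measured_depth]
```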

In another example, the potential player invitation program 14 may present the suggested games to the user 304 via the transparent display 54. For example, in the “Games Available” list 338 each of the Dance, Search And Find, and Tennis games may be highlighted, colored, or otherwise emphasized to indicate that they are suggested games. In another example, where another game may be available but not suggested, that game may be listed in the “Games Available” list 338 but not emphasized.

In still another example, the user 304 and/or potential player 308 may mention a particular game in conversation. If the user 304 offers an invitation to the potential player 308, then this game may be listed and/or emphasized in the “Games Available” list 338. For example, the user 304 may say to the potential player 308, “Hey, I love Tennis.” The potential player 308 may respond, “Yeah, me too.” The user 304 may reply “Great! Let's play!” The potential player invitation program 14 may determine that the reply of the user 304 is an invitation to the potential player 308 to participate in the game Tennis. The potential player invitation program 14 may then list and/or emphasize the game Tennis in the “Games Available” list 338.

It will also be appreciated that the potential player invitation program 14 may use other information and input provided by the user HMD device 42, such as ambient lighting information from the optical sensor system 62, to select a suggested multiplayer game that is compatible with the interaction space 300.

In another example, the potential player invitation program 14 may display on the transparent display 54 of the user HMD device 42 an indicator in the form of a floating tag 344 adjacent to the first potential player 308. In one example, such as prior to the user 304 providing an invitation to the first potential player 308, the floating tag 344 may indicate an availability of the first potential player for playing a multiplayer game. The floating tag 344 may include a letter or symbol that corresponds to an availability status of the first potential player 308—for example, the letter “A” may indicate available and the letter “U” may indicate unavailable. In another example, the floating tag 344 may have a color that corresponds to an availability status—for example, green may indicate available and red may indicate unavailable.

In another example, such as after the user 304 has given an invitation to play a multiplayer game to the first potential player 308, the floating tag 344 may indicate an invitation status of the invitation. The floating tag 344 may include a letter or symbol that corresponds to an invitation status—for example, the letter “P” may indicate the invitation is pending, the letter “A” may indicate the invitation has been accepted, and the letter “D” may indicate the invitation has been declined. In another example, the floating tag 344 may have a color that corresponds to an invitation status—for example, green may indicate accepted, yellow may indicate pending, and red may indicate declined.
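The two tag mappings described above can be captured in a small lookup (the letter and color pairs mirror the examples in the text; the function and table names, and the idea of returning a pair for rendering, are illustrative):

```python
# Letter/colour codes for the floating tag, before and after an invitation.
AVAILABILITY_TAGS = {"available": ("A", "green"), "unavailable": ("U", "red")}
INVITATION_TAGS = {"pending": ("P", "yellow"),
                   "accepted": ("A", "green"),
                   "declined": ("D", "red")}

def floating_tag(status, invited):
    """Return the (letter, colour) pair to draw next to a potential player.

    Before an invitation is made the tag shows availability; after an
    invitation has been given it shows the invitation status.
    """
    table = INVITATION_TAGS if invited else AVAILABILITY_TAGS
    return table[status]
```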

In another example, the potential player invitation program 14 may display on the transparent display 54 of the user HMD device 42 profile information 346 that is made available by the first potential player 308. In one example, the profile information 346 may include a name of the first potential player 308, a user ID, a game achievement level on one or more games, game interests of the first potential player, and/or other information related to the first potential player.

It will be appreciated that the user 304 and first potential player 308 may select privacy settings that control the types and amounts of profile and other information that is shared or made available to others via the potential player invitation program 14. In one example, prior to an invitation being received by the first potential player 308, the first potential player may choose to share no information with others. If an invitation is received, the first potential player 308 may share limited information, such as a name, user ID and a game achievement level, with the user 304 who provided the invitation. After the first potential player 308 accepts the invitation, the first potential player may share additional information with the user 304, such as other game interests.
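The tiered disclosure just described (nothing before an invitation, limited fields once invited, fuller fields after acceptance) can be sketched as a filter over a profile record. The field names and tier labels below are hypothetical, chosen to mirror the examples in the text:

```python
# Hypothetical sketch of tiered profile disclosure: which profile fields
# are shared depends on the state of the invitation.

PROFILE = {
    "name": "Jane",
    "user_id": "jane42",
    "achievement_level": 17,
    "game_interests": ["Dance", "Sword Fighting"],
}

TIERS = {
    "no_invitation": [],                                      # share nothing
    "invited":  ["name", "user_id", "achievement_level"],     # limited info
    "accepted": ["name", "user_id", "achievement_level",
                 "game_interests"],                           # additional info
}

def shared_profile(profile, invitation_state):
    """Return only the fields the player's privacy tier exposes."""
    return {field: profile[field] for field in TIERS[invitation_state]}
```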

After the user 304 has made the invitation, the potential player invitation program 14 may receive an acceptance response from the first potential player 308. In one example, the first potential player 308 may respond to the invitation by saying, “Yeah, sure.” The potential player invitation program 14 may receive this voice data via the user HMD device 42 and microphone 70, and may use speech recognition to analyze the voice data and determine that the voice data is an acceptance of the invitation. It will be appreciated that other vocal responses, such as “Yes”, “I'd love to”, etc., may be used to indicate an acceptance.

In another example, the first potential player 308 may respond to the invitation by performing a gesture. For example, the first potential player 308 may accept the invitation by making a thumbs-up gesture with her right hand 350. A depth camera in the optical sensor system 62 of the user HMD device 42 may detect the thumbs-up gesture, and the potential player invitation program 14 may interpret the gesture as an acceptance of the invitation. It will be appreciated that other gestures or movements made by the first potential player 308, such as nodding her head up and down, may be received and interpreted as an acceptance of the invitation.
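The two acceptance paths above (recognized speech or a detected gesture) reduce to a disjunction over two small vocabularies. The sketch below is a hypothetical simplification; real speech and gesture recognition would of course be far more involved, and the phrase and gesture sets merely echo the examples in the text:

```python
# Hypothetical sketch: interpret recognized speech and/or a detected
# gesture as an acceptance of a pending invitation.

ACCEPTANCE_PHRASES = {"yeah, sure", "yes", "sure", "i'd love to"}
ACCEPTANCE_GESTURES = {"thumbs_up", "head_nod"}

def is_acceptance(voice_text=None, gesture=None):
    """Return True if either modality indicates an acceptance."""
    if voice_text is not None:
        normalized = voice_text.strip().lower().rstrip(".!")
        if normalized in ACCEPTANCE_PHRASES:
            return True
    return gesture in ACCEPTANCE_GESTURES
```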

Upon receiving an acceptance response from the first potential player 308, the potential player invitation program 14 may join the first potential player account 56 with the user account 40 in participating in a multiplayer game. In one example, the user 304 may ask the first potential player 308, “How about a game of Dance?” When the first potential player 308 responds affirmatively, the potential player invitation program 14 may initiate a game of Dance that includes the user account 40 and the first potential player account 56.

In other examples, the user 304 may desire to delineate two or more teams of players for a multiplayer game. For example, in the context of a mixed-doubles tennis game, the user 304 may say, “Jane (first potential player 308) and I are one team, and Bjorn (second potential player 312) and Chrissie (third potential player 316) are the other team.” In these examples, the potential player invitation program 14 may use voice recognition, eye-tracking information, and other inputs from the user HMD device 42 to create the two teams desired by the user 304.

In other examples, the user 304 may make a multiple-person invitation. For example, the user 304 may look at a cluster of people and say, “Would you all like to play?” The user 304 may also make a gesture directed at the cluster of people while speaking an invitation, such as “You want to play?” Such a gesture may include pointing at and/or sweeping across the cluster of people, pointing to each person in rapid succession, nodding toward the cluster of people, and any other gesture directed at the cluster of people. The potential player invitation program 14 may use voice recognition, eye-tracking information, depth information, and/or other inputs from the user HMD device 42 to determine that the words and any gestures of the user 304 are an invitation to play a game, and to associate the invitation with each of the people in the cluster of people.
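A multiple-person invitation combines an utterance check with the union of the people identified by gaze and by gesture. The following is a minimal sketch under the assumption that upstream recognizers have already produced a transcript and sets of identified targets; the phrase list and function name are illustrative only:

```python
# Hypothetical sketch of associating one spoken invitation with every
# person in a cluster identified by eye-tracking and/or gestures.

INVITATION_PHRASES = ("would you all like to play", "you want to play")

def associate_invitation(utterance, gaze_targets, gesture_targets=()):
    """If the utterance is an invitation, return the set of all people
    identified by gaze or by a gesture directed at them; otherwise
    return an empty set."""
    text = utterance.strip().lower().rstrip("?!.")
    if not any(phrase in text for phrase in INVITATION_PHRASES):
        return set()
    return set(gaze_targets) | set(gesture_targets)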

In still another example, the user 304 may observe one or more other potential players who are currently participating in an active multiplayer game. For example, and with continued reference to FIGS. 3 and 4, the second potential player 312 and the third potential player 316 may be currently participating in an active multiplayer game of Sword Fighting. As shown in FIG. 4, the transparent display 54 of the user HMD device 42 may display one or more images that correspond to virtual object representations used in the active Sword Fighting game, such as swords 354. The transparent display 54 may also display information related to the active game being played by the second potential player 312 and the third potential player 316. In one example, a dialog box 358 indicating the active game being played may be displayed above the heads of the second potential player 312 and the third potential player 316.

In one example, the user 304 may be permitted to see the dialog box 358 and/or swords 354 if the user account 40 and/or user profile 44 includes sufficient privileges related to the second potential player 312 and third potential player 316. For example, where the user 304 is “friends” with the second potential player 312 and third potential player 316 via a social networking service, the user 304 may be permitted to see the swords 354 via the transparent display 54. In another example, the second potential player 312 and third potential player 316 may specifically grant permission to the user 304 to see games these players are playing.

The user 304 may desire to join the currently-participating second potential player 312 and third potential player 316 in their active multiplayer game of Sword Fighting. In one example, the user 304 may ask the second potential player 312, “Hey, can I play too?” The potential player invitation program 14 may receive the user voice data and determine that it is a request to join the second potential player 312 in participating in the Sword Fighting game. The second potential player 312 may provide an acceptance response, such as saying, “Yeah, sure.” Upon receiving an acceptance response from the second potential player 312, the potential player invitation program 14 may join the user account 40 with an account associated with the second potential player 312 in participating in the Sword Fighting game.

In another example, the second potential player 312 or the third potential player 316 may invite the user 304 to participate in their active multiplayer game of Sword Fighting. If the user 304 accepts the invitation, the user account 40 may be joined with an account associated with the second potential player 312 and third potential player 316 in participating in the Sword Fighting game.

While participating in a multiplayer game, such as Dance with the first potential player 308, the user 304 may desire to stop playing the game. In one example, the user 304 may speak his intention to stop playing the game, such as by saying, “I'm done”, “Thanks everyone, I'm finished”, or other words indicating that the user is exiting the game. Upon receiving the user's voice data from the user HMD device 42, the potential player invitation program 14 may determine that the user voice data is a command to exit the multiplayer game. The potential player invitation program 14 may then remove the user account 40 from participation in the multiplayer game.
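Exiting works like acceptance in reverse: a recognized phrase triggers removal of the user's account from the session. A minimal sketch, again with hypothetical names and a phrase list drawn from the examples in the text:

```python
# Hypothetical sketch: detect an exit command in recognized speech and
# remove the speaker's account from the game session.

EXIT_PHRASES = ("i'm done", "i'm finished")

def handle_exit_command(voice_text, session_accounts, user_account):
    """If the recognized speech contains an exit phrase, remove the
    user's account from the session; return True if removed."""
    if any(phrase in voice_text.lower() for phrase in EXIT_PHRASES):
        session_accounts.discard(user_account)
        return True
    return False
```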

FIGS. 5A and 5B illustrate a flow chart of a method 500 for inviting a potential player to participate in a multiplayer game with a user according to an embodiment of the present disclosure. The following description of method 500 is provided with reference to the software and hardware components of the multiplayer game invitation system 10 described above and shown in FIGS. 1-4. It will be appreciated that method 500 may also be performed in other contexts using other suitable hardware and software components.

With reference to FIG. 5A, at 504 the method may include receiving user voice data from the user 304. At 508 the method 500 may include determining that the user voice data is an invitation to participate in a multiplayer game. At 512 the method 500 may further include receiving from the user HMD device 42 eye-tracking information, depth information, facial recognition information, potential player HMD device information, and/or potential player voice data.

At 516 the method 500 may include associating the invitation from the user 304 with a potential player using the eye-tracking information, the depth information, the facial recognition information, the potential player HMD device information, and/or the potential player voice data. At 520 the method 500 may include matching a potential player account with the potential player. In one example, at 524 the method may include using the facial recognition information, the potential player head-mounted display device information, and/or the potential player voice data to match the potential player account with the potential player.

In one example, at 528 the method 500 may include displaying on the transparent display 54 of the user HMD device 42 a list of one or more multiplayer games available to the user 304 and the potential player. In another example, the depth information received by the user HMD device 42 may include a measured depth of an interaction space in which the user 304 and the potential player are located. In this example, at 532 the method 500 may include using the depth information to select a suggested multiplayer game that is physically compatible with the measured depth of the interaction space. At 536, the method 500 may include suggesting to the user the suggested multiplayer game.
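Steps 532-536 amount to filtering a game catalog by the physical depth each game needs against the measured depth of the interaction space. The catalog, depth requirements, and function name below are invented for illustration; only the filtering idea comes from the description:

```python
# Hypothetical sketch of steps 532-536: suggest only those games whose
# physical depth requirement fits the measured interaction space.

GAME_CATALOG = {
    "Dance": 2.0,            # minimum depth required, meters (illustrative)
    "Sword Fighting": 3.5,
    "Trivia": 0.5,
}

def suggest_games(measured_depth_m, catalog=None):
    """Return the games physically compatible with the measured depth."""
    catalog = GAME_CATALOG if catalog is None else catalog
    return sorted(game for game, required in catalog.items()
                  if required <= measured_depth_m)
```

For a living room measured at 2.5 m deep, this would suggest Dance and Trivia but not Sword Fighting.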

At 540 the method 500 may include displaying on the transparent display 54 of the user HMD device 42 an indicator adjacent to the potential player showing a potential player availability and/or an invitation status. With reference now to FIG. 5B, at 544 the method 500 may include displaying on the transparent display 54 of the user HMD device 42 potential player profile information. At 548, the potential player profile information may include a name, a user ID, a game achievement level, and/or game interests of the potential player. At 552 the method 500 may include receiving an acceptance response from the potential player. In one example, at 556 the acceptance response may comprise the potential player voice data and/or a gesture performed by the potential player. At 560, the method may include joining the potential player account with a user account associated with the user in participating in the multiplayer game.

At 564 the method 500 may include determining that the user voice data is a command to exit the multiplayer game. And at 568, the method 500 may include removing the user account from participation in the multiplayer game.

FIG. 6 schematically shows a nonlimiting embodiment of a computing device 600 that may perform one or more of the above described methods and processes. Computing device 600 is shown in simplified form. It is to be understood that virtually any computer architecture may be used without departing from the scope of this disclosure. In different embodiments, computing device 600 may take the form of a mainframe computer, server computer, desktop computer, laptop computer, tablet computer, home entertainment computer, network computing device, mobile computing device, mobile communication device, gaming device, etc.

As shown in FIG. 6, computing device 600 includes a logic subsystem 604, a data-holding subsystem 608, a display subsystem 612, a communication subsystem 616, and a sensor subsystem 620. Computing device 600 may optionally include other subsystems and components not shown in FIG. 6. Computing device 600 may also optionally include other user input devices such as keyboards, mice, game controllers, and/or touch screens, for example. Further, in some embodiments the methods and processes described herein may be implemented as a computer application, computer service, computer API, computer library, and/or other computer program product in a computing system that includes one or more computers.

Logic subsystem 604 may include one or more physical devices configured to execute one or more instructions. For example, the logic subsystem may be configured to execute one or more instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more devices, or otherwise arrive at a desired result.

The logic subsystem 604 may include one or more processors that are configured to execute software instructions. Additionally or alternatively, the logic subsystem may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. Processors of the logic subsystem may be single core or multicore, and the programs executed thereon may be configured for parallel or distributed processing. The logic subsystem may optionally include individual components that are distributed throughout two or more devices, which may be remotely located and/or configured for coordinated processing. One or more aspects of the logic subsystem may be virtualized and executed by remotely accessible networked computing devices configured in a cloud computing configuration.

Data-holding subsystem 608 may include one or more physical, non-transitory devices configured to hold data and/or instructions executable by the logic subsystem 604 to implement the herein described methods and processes. When such methods and processes are implemented, the state of data-holding subsystem 608 may be transformed (e.g., to hold different data).

Data-holding subsystem 608 may include removable media and/or built-in devices. Data-holding subsystem 608 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.) and/or magnetic memory devices (e.g., hard disk drive, floppy disk drive, tape drive, MRAM, etc.), among others. Data-holding subsystem 608 may include devices with one or more of the following characteristics: volatile, nonvolatile, dynamic, static, read/write, read-only, random access, sequential access, location addressable, file addressable, and content addressable. In some embodiments, logic subsystem 604 and data-holding subsystem 608 may be integrated into one or more common devices, such as an application specific integrated circuit or a system on a chip.

FIG. 6 also shows an aspect of the data-holding subsystem 608 in the form of removable computer-readable storage media 624, which may be used to store and/or transfer data and/or instructions executable to implement the methods and processes described herein. Removable computer-readable storage media 624 may take the form of CDs, DVDs, HD-DVDs, Blu-Ray Discs, EEPROMs, and/or floppy disks, among others.

It is to be appreciated that data-holding subsystem 608 includes one or more physical, non-transitory devices. In contrast, in some embodiments aspects of the instructions described herein may be propagated in a transitory fashion by a pure signal (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for at least a finite duration. Furthermore, data and/or other forms of information pertaining to the present disclosure may be propagated by a pure signal.

Display subsystem 612 may be used to present a visual representation of data held by data-holding subsystem 608. Display subsystem 612 may include, for example, the transparent display 54 of the user HMD device 42. As the above described methods and processes change the data held by the data-holding subsystem 608, and thus transform the state of the data-holding subsystem, the state of the display subsystem 612 may likewise be transformed to visually represent changes in the underlying data. The display subsystem 612 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic subsystem 604 and/or data-holding subsystem 608 in a shared enclosure, or such display devices may be peripheral display devices.

Communication subsystem 616 may be configured to communicatively couple computing device 600 with one or more networks and/or one or more other computing devices. Communication subsystem 616 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As nonlimiting examples, the communication subsystem 616 may be configured for communication via a wireless telephone network, a wireless local area network, a wired local area network, a wireless wide area network, a wired wide area network, etc. In some embodiments, the communication subsystem may allow computing device 600 to send and/or receive messages to and/or from other devices via a network such as the Internet.

Sensor subsystem 620 may include one or more sensors configured to sense different physical phenomena (e.g., visible light, infrared light, sound, acceleration, orientation, position, etc.) as described above. For example, the sensor subsystem 620 may comprise one or more eye-tracking sensors, image sensors, microphones, motion sensors such as accelerometers, touch pads, touch screens, and/or any other suitable sensors. Sensor subsystem 620 may be configured to provide observation information to logic subsystem 604, for example. As described above, observation information such as eye-tracking information, image information, audio information, ambient lighting information, depth information, position information, motion information, and/or any other suitable sensor data may be used to perform the methods and processes described above.

In some embodiments, sensor subsystem 620 may include a depth camera (e.g., outward facing sensor 212 of FIG. 2). The depth camera may include left and right cameras of a stereoscopic vision system, for example. Time-resolved images from both cameras may be registered to each other and combined to yield depth-resolved video.

In other embodiments, the depth camera may be a structured light depth camera configured to project a structured infrared illumination comprising numerous, discrete features (e.g., lines or dots) onto a scene, such as the interaction space 300 shown in FIG. 3. The depth camera may be configured to image the structured illumination reflected from the scene. Based on the spacings between adjacent features in the various regions of the imaged scene, a depth image of the scene may be constructed.

In other embodiments, the depth camera may be a time-of-flight camera configured to project a pulsed infrared illumination onto the scene. The depth camera may include two cameras configured to detect the pulsed illumination reflected from the scene. Both cameras may include an electronic shutter synchronized to the pulsed illumination. The integration times for the cameras may differ, such that a pixel-resolved time-of-flight of the pulsed illumination, from the source to the scene and then to the cameras, is discernible from the relative amounts of light received in corresponding pixels of the two cameras.
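The time-of-flight scheme above can be illustrated numerically: with two shutters having different integration windows, the split of returned light between them encodes the pulse's round-trip delay, and hence depth. The following is a deliberately simplified model (unit pulse, no attenuation, noise, or ambient light), not a description of any particular sensor:

```python
# Simplified numerical illustration of gated time-of-flight depth
# sensing: the fraction of a light pulse delayed past the first shutter
# window is proportional to the round-trip time to the target.

C = 299_792_458.0  # speed of light, m/s

def tof_depth(q1, q2, pulse_width_s):
    """Estimate depth from charges q1, q2 collected by two shutters.

    q2 / (q1 + q2) is the fraction of the pulse delayed past the first
    window, so round-trip time ~= pulse_width * q2 / (q1 + q2).
    """
    round_trip = pulse_width_s * q2 / (q1 + q2)
    return C * round_trip / 2.0

def simulate_charges(depth_m, pulse_width_s):
    """Inverse model: split a unit pulse between the two shutters for a
    target at depth_m (ignoring attenuation and noise)."""
    delay = 2.0 * depth_m / C
    fraction_delayed = delay / pulse_width_s
    return 1.0 - fraction_delayed, fraction_delayed  # (q1, q2)
```

With a 100 ns pulse, a target at 3 m delays about 20% of the returned light past the first shutter window, from which the 3 m depth is recovered.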

In some embodiments, sensor subsystem 620 may include a visible light camera, such as a digital camera. Virtually any type of digital camera technology may be used without departing from the scope of this disclosure. As a non-limiting example, the visible light camera may include a charge coupled device image sensor.

The term “program” may be used to describe an aspect of the multiplayer game invitation system 10 that is implemented to perform one or more particular functions. In some cases, such a program may be instantiated via logic subsystem 604 executing instructions held by data-holding subsystem 608. It is to be understood that different programs may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same program may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The term “program” is meant to encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.

It is to be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated may be performed in the sequence illustrated, in other sequences, in parallel, or in some cases omitted. Likewise, the order of the above-described processes may be changed.

The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.

Claims

  1. A method for inviting a potential player to participate in a multiplayer game with a user, the multiplayer game displayed by a display of a user head-mounted display device, comprising: receiving user voice data from the user; determining that the user voice data is an invitation to participate in the multiplayer game; receiving eye-tracking information, depth information, facial recognition information, potential player head-mounted display device information, and/or potential player voice data; associating the invitation with the potential player using the eye-tracking information, the depth information, the facial recognition information, the potential player head-mounted display device information, and/or the potential player voice data; matching a potential player account with the potential player; receiving an acceptance response from the potential player; and joining the potential player account with a user account associated with the user in participating in the multiplayer game.
  2. The method for inviting a potential player to participate in a multiplayer game of claim 1, wherein the multiplayer game is an active multiplayer game in which the user is currently participating.
  3. The method for inviting a potential player to participate in a multiplayer game of claim 1, wherein the depth information comprises a measured depth of an interaction space in which the user and the potential player are located, further comprising: using the depth information to select a suggested multiplayer game that is physically compatible with the measured depth of the interaction space; and suggesting to the user the suggested multiplayer game.
  4. The method for inviting a potential player to participate in a multiplayer game of claim 1, further comprising displaying via the display of the user head-mounted display device an indicator adjacent to the potential player showing a potential player availability and/or an invitation status.
  5. The method for inviting a potential player to participate in a multiplayer game of claim 1, further comprising displaying via the display of the user head-mounted display device potential player profile information.
  6. The method for inviting a potential player to participate in a multiplayer game of claim 5, wherein the potential player profile information includes a name, a user ID, a game achievement level, and/or game interests.
  7. The method for inviting a potential player to participate in a multiplayer game of claim 1, further comprising matching the potential player account with the potential player using the facial recognition information, the potential player head-mounted display device information, and/or the potential player voice data.
  8. The method for inviting a potential player to participate in a multiplayer game of claim 1, wherein the acceptance response comprises the potential player voice data and/or a gesture performed by the potential player.
  9. The method for inviting a potential player to participate in a multiplayer game of claim 1, further comprising: determining that the user voice data is a command to exit the multiplayer game; and removing the user account from participation in the multiplayer game.
  10. A multiplayer game invitation system including a user head-mounted display device having a display, the user head-mounted display device operatively connected to a computing device, the multiplayer game invitation system comprising: a potential player invitation program executed by a processor of the computing device, the potential player invitation program configured to: receive user voice data from a user; determine that the user voice data is an invitation to participate in a multiplayer game; receive eye-tracking information, depth information, facial recognition information, potential player head-mounted display device information, and/or potential player voice data; associate the invitation with a potential player using the eye-tracking information, the depth information, the facial recognition information, the potential player head-mounted display device information, and/or the potential player voice data; match a potential player account with the potential player; receive an acceptance response from the potential player; and join the potential player account with a user account associated with the user in participating in the multiplayer game.
  11. The multiplayer game invitation system of claim 10, wherein the potential player invitation program is further configured to display via the display of the user head-mounted display device a list of one or more multiplayer games available to the user and the potential player.
  12. The multiplayer game invitation system of claim 10, wherein the depth information comprises a measured depth of an interaction space in which the user and the potential player are located, and the potential player invitation program is further configured to: use the depth information to select a suggested multiplayer game that is physically compatible with the measured depth of the interaction space; and suggest to the user the suggested multiplayer game.
  13. The multiplayer game invitation system of claim 10, wherein the potential player invitation program is further configured to display via the display of the user head-mounted display device an indicator adjacent to the potential player showing a potential player availability and/or an invitation status.
  14. The multiplayer game invitation system of claim 10, wherein the potential player invitation program is further configured to display via the display of the user head-mounted display device potential player profile information.
  15. The multiplayer game invitation system of claim 10, wherein the potential player invitation program is further configured to: display via the display of the user head-mounted display device an image corresponding to an active multiplayer game in which a currently-participating potential player is participating; determine that the user voice data is a request to join in participating in the active multiplayer game; receive the acceptance response from the currently-participating potential player; and join the user account with a currently-participating potential player account in participating in the active multiplayer game.
  16. The multiplayer game invitation system of claim 10, wherein the potential player invitation program is further configured to match the potential player account with the potential player using the facial recognition information, the potential player head-mounted display device information, and/or the potential player voice data.
  17. The multiplayer game invitation system of claim 10, wherein the acceptance response comprises the potential player voice data and/or a gesture performed by the potential player.
  18. The multiplayer game invitation system of claim 10, wherein the potential player invitation program is further configured to: determine that the user voice data is a command to exit the multiplayer game; and remove the user account from participation in the multiplayer game.
  19. A computer-readable storage medium comprising instructions stored thereon and executable by a computing device to enable a user to invite a potential player to participate in a multiplayer game with the user, the instructions being executable to: receive user voice data from the user; determine that the user voice data is an invitation to participate in a multiplayer game; receive eye-tracking information, depth information, facial recognition information, potential player head-mounted display device information, and/or potential player voice data; associate the invitation with the potential player using the eye-tracking information, the depth information, the facial recognition information, the potential player head-mounted display device information, and/or the potential player voice data; match a potential player account with the potential player; receive an acceptance response from the potential player; and join the potential player account with a user account associated with the user in participating in the multiplayer game.
  20. The computer-readable storage medium of claim 19, wherein the depth information comprises a measured depth in an interaction space in which the user and the potential player are located, and the instructions are executable by the computing device to: use the depth information to select a suggested multiplayer game that is physically compatible with the measured depth of the interaction space; and suggest to the user the suggested multiplayer game.
