U.S. Pat. No. 10,509,461
AUGMENTED REALITY VIDEO GAME SYSTEMS
Issued: December 17, 2019, to Jeffrey David Mullen
Priority Date: October 11, 2007
Summary:
U.S. Patent No. 10,509,461 (the ’461 Patent) relates to an augmented reality video game system using a head-mounted device (HMD). The ’461 Patent details a method of displaying video game indicia on the display of the HMD that may be perceived as being located in the user’s physical environment. In some embodiments, a transparent display allows a user to see their physical environment, such as a table, while a camera that may be attached to the HMD captures images to collect data to detect surfaces and obstacles, such as a book on the table, to affect video game indicia in real time. In such embodiments, changing surfaces, such as the addition of a book, may provide an obstacle to climb or disrupt line of sight between player-controlled characters displayed on the display of the HMD. A user’s environment may be augmented, such as by displaying augmented water on top of the table through the HMD’s display. In other embodiments, a non-transparent display may display the user’s environment with interlaced virtual objects.
Abstract:
An augmented reality home console is provided. Users can wear head-mounted displays with transparent screens and can control video game indicia that are perceived as being augmented over a user’s environment by displaying video game indicia onto the transparent displays. As such, for example, two users can play an airplane dog fighting game where each user controls a plane in their living room. A device that can determine surfaces of objects (e.g., walls) can be utilized such that when the airplane is flown into a wall, the airplane explodes.
Illustrative Claim:
1. A system comprising:
a game system for providing a video game;
a head-mounted display having a video camera for obtaining images in the proximity of said head-mounted display and utilizing said images to detect a surface of a physical object within the proximity of said head-mounted display, wherein said head-mounted display is operable to communicate information to said game system and said head-mounted display is operable to display a virtual object; and
a game controller for providing a first three-dimensional control signal to move said virtual object and a second three-dimensional control signal to change the movement of said virtual object from the movement provided from said first three-dimensional control signal, wherein said first three-dimensional control signal moves said virtual object three-dimensionally over at least three axis of movement in a first manner and said second three-dimensional control signal moves said virtual object three-dimensionally over at least three axis of movement in a second manner causing said virtual object to interact with said detected surface.
Illustrative Figure
Description
DETAILED DESCRIPTION OF THE INVENTION
FIG. 1 shows augmented reality setups 100 that may include users 101 and 102 utilizing table 103 as a playing surface for an augmented reality video game system. The users may select one of a number of playing areas for an augmented reality video game. For example, users may select to play a game within the confines of area 120, 130, 140, or 110. Users may also change the size and shape of a three-dimensional area manually. An area may be shown to a user in a color (e.g., green) such that a user can visually see where the area will be. During game play, a border of a light color may be provided such that the users of the system are aware of the game area. If an object (e.g., a user-controlled object) attempts to leave the area, the game may stop the object from leaving the area. At any time, a game may be paused and a play area may be modified. Similarly, for example, when a game is saved, area dimension information may be saved along with the game. When a user returns to play the saved game at a later time, the user may be automatically requested to confirm the saved area dimensions (and the user may be provided with a visual representation of the area to confirm). An object leaving the confines of a game area may, for example, be destroyed, and the game may be accordingly affected. For example, a user that drives a virtual vehicle (e.g., plane, space shuttle, car, robot, boat, airplane) into the boundary of a three-dimensional play area may see the vehicle perform an animation (e.g., explode) when the boundary is hit, and points/character lives may be reduced for that user. A game may alternatively, for example, end. Persons skilled in the art will appreciate that a game may be provided with a default boundary (e.g., a dome-shaped boundary as in play area 110). A user may be provided with the ability to scale the area.
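The boundary behavior described above reduces to a containment test run each frame. The following Python sketch is illustrative only: the dome model (upper hemisphere), the dictionary layout, and the life-deduction rule are assumptions for this example, not details taken from the ’461 Patent.

```python
import math

def inside_dome(pos, center, radius):
    """Return True if a point lies within a dome-shaped play area,
    modeled here as the upper half of a sphere: within `radius` of
    `center` and at or above the center's height."""
    dx = pos[0] - center[0]
    dy = pos[1] - center[1]
    dz = pos[2] - center[2]
    return dz >= 0 and math.sqrt(dx * dx + dy * dy + dz * dz) <= radius

def step_object(obj, center, radius):
    """Destroy an object that crosses the play-area boundary (e.g.,
    trigger an explosion animation and deduct a character life)."""
    if not inside_dome(obj["pos"], center, radius):
        obj["alive"] = False   # game would play the destroy animation here
        obj["lives"] -= 1
    return obj
```

A game loop would call `step_object` after applying each movement update, before rendering the frame.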
When a play area is scaled (e.g., increased in size or decreased in size) the virtual objects of the game may be proportionally scaled by the video game (e.g., increased in size or decreased in size).
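The proportional scaling described above can be sketched as applying one scale factor to each virtual object's size and to its position relative to the play-area origin. The data layout below is a hypothetical representation for illustration, not drawn from the patent.

```python
def scale_play_area(objects, area_scale):
    """Scale each virtual object's size and origin-relative position by
    the same factor as the play area, so the scene stays proportionate.
    A sketch; a real engine would scale a transform hierarchy instead."""
    scaled = []
    for obj in objects:
        scaled.append({
            "pos": tuple(c * area_scale for c in obj["pos"]),
            "size": obj["size"] * area_scale,
        })
    return scaled
```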
Alternatively, for example, a game may be provided in a user-selected play area of varying size with virtual objects of the same size. A user-controllable virtual object (e.g., a virtual vehicle) may be provided at the center of a play area, and user controls may cause the environment to move with respect to the virtual vehicle such that the vehicle stays in the same location within a virtual environment. Such a play methodology may be user-selectable (e.g., an environment-moving methodology or a vehicle-moving methodology). Accordingly, a user may be provided with the flexibility to play the same game in different physical environments (e.g., in a car on a user's lap versus in a room with a large floor). Similarly, for example, virtual environment indicia may be stationary and a user-controlled object may explore the virtual environment indicia until, for example, the user-controlled object reaches a boundary threshold (e.g., a boundary or a proximity to a boundary). At this point, for example, the environment may scroll with the user controls such that if a user-controlled object reaches the threshold, the vehicle would stop but the environment would move, such that the user perceives the vehicle to be moving at the same speed and in the same direction with respect to the virtual environment indicia.
A game console may process a game, or a head-mounted display may process a game (and, in multiplayer modes, may share information with another head-mounted display). A game area may be, for example, dome shaped, pyramid shaped, cube shaped, cylindrically shaped, rectangular shaped, or spherically shaped. Edges may be rounded. Heights, widths, and lengths may be user adjusted. A user may store his/her preferred play area parameters (or a list of preferred parameter play areas) on a console or head-mounted device, and games may automatically utilize a user's stored area preference (e.g., for a location).
A head-mounted display may include, for example, a device, such as a camera, for use in determining the landscape of a user's environment. Such a device may provide signals (e.g., images) to a processing device (e.g., a central processor). In turn, for example, the central processor may determine where to best provide a play area and may autonomously provide a preview of that play area to a user. A user may be directed (via virtual indicia) to look at a particular object (e.g., a floor or table) where the game is intended to be played so that the augmented reality platform is quickly able to provide a preview of a play area.
An augmented reality computing platform (e.g., a head-mounted display) may recognize physical objects. For example, a platform may recognize the difference between a wall, table top, floor, stairs, and a doorway. A platform may sense multiple areas in a room where play could occur and may, for example, utilize selection criteria to determine the best area for play. For example, the platform may base the best play area on surface size or volume size. Furthermore, for example, a platform may provide a user with multiple play areas such that a user can select an area for play (e.g., by looking at the area and saying “play”). An augmented reality computing platform may utilize physical manual control interfaces to receive information. An augmented reality computing platform may also utilize an audible speech interface to receive information.
Persons skilled in the art will appreciate that image data from a video camera can be, for example, processed in real time to determine the surface structure and boundary structure of objects. Similarly, for example, these determined surface and boundary structures may be utilized to assist in properly aligning and interlacing virtual indicia as a user changes his/her perception of a gaming play area or walks through a gaming play area.
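As a toy stand-in for that real-time surface determination, once camera data has been converted to 3D points, a dominant horizontal surface (e.g., a table top) can be estimated by histogramming point heights. Everything here (function name, bin size, input format) is an assumption made for illustration, not a detail from the patent.

```python
from collections import Counter

def dominant_surface_height(points, bin_size=0.05):
    """Estimate the height of the dominant horizontal surface from a
    cloud of (x, y, z) points by bucketing z-values into bins of
    `bin_size` meters and returning the height of the fullest bin."""
    bins = Counter(round(p[2] / bin_size) for p in points)
    best_bin, _ = bins.most_common(1)[0]
    return best_bin * bin_size
```

A production system would instead fit planes (e.g., via RANSAC) to recover arbitrarily oriented surfaces and their boundaries.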
FIG. 2 shows game topology 200 having game area 250, in which user 210 controls object 251 via controller(s) 211 and user 220 controls object 252 via controller(s) 221. User 210 may utilize controller 211 to move object 251. Physical interfaces may be provided on a controller to accept manual input, such as buttons, joysticks, directional pads, and touch interfaces. A controller may also receive input through controller movement. Inertial movement determination devices may be included, as well as positioning devices, such that a user may, for example, control a virtual object by moving a controller. The pitch, roll, and yaw of a controller may be utilized as control signals to an augmented reality computing platform. A display screen may be provided on a controller such that a user may utilize this display screen to privately view information about a game. Alternatively, one user may be provided with private virtual indicia in play environment 250. Accordingly, multiple users may be provided with the same virtual indicia within area 250 or may be provided with different virtual indicia within area 250.
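One plausible reading of using a controller's pitch, roll, and yaw as control signals is to map orientation to a velocity vector for the controlled object. The mapping below is a sketch under that assumption (yaw steers heading, pitch tilts the motion, roll is left for banking animations); it is not the patent's specified mapping.

```python
import math

def orientation_to_velocity(pitch, yaw, speed):
    """Map controller pitch/yaw (radians) to a 3D velocity vector of
    magnitude `speed` for a virtual object such as a plane."""
    vx = speed * math.cos(pitch) * math.cos(yaw)   # forward component
    vy = speed * math.cos(pitch) * math.sin(yaw)   # sideways component
    vz = speed * math.sin(pitch)                   # climb/dive component
    return (vx, vy, vz)
```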
Games may include, for example, board games, racing games, fighting games, adventure games, sports games (e.g., football, soccer, tennis, baseball, basketball), role playing games, educational games or any other type of game. A play area may be provided about a physical object (e.g., a table). For example, a play area may extend onto the sides of a table such that a user can be provided with virtual objects that appear to come out from under the table. Furthermore, for example, a game area may include both the top and bottom surfaces of a table such that a user (e.g., a child) may be motivated to physically move (e.g., exercise) to play the game. For example, a virtual object may be provided that is able to drive on the top surface of a table, the side surfaces of a table, and the bottom surface of a table. Accordingly, a virtual car may be provided that appears to defy gravity. Such game play may, for example, increase the whimsical and festive nature of a game.
FIG. 3 shows game topology 300 in which no game area is set. Persons skilled in the art will appreciate that a game may be provided that is not confined to an area of play. A user may walk around his/her actual environment, and the game may be played in his/her actual environment. User 301 may control virtual object 351 and user 302 may control virtual object 352. Physical surroundings (e.g., object 399) may be detected (either prior to a game during setup and/or during a game) and virtual objects may interact with such physical objects. For example, if a user places object 399 onto a table, plane 351 may be animated to crash into object 399 instead of being animated to pass through object 399. Deployable sensors may be provided that may be attached to objects to assist in determining the location of the objects or to provide objects with certain capabilities. For example, one sensor may cause a game to utilize an object as a reward object (e.g., touching it gives a user a game point) while another sensor may cause a game to utilize the same object as a penalty object (e.g., touching it subtracts a game point from a user). Alternatively, for example, a user may utilize a graphical user interface (e.g., via a display on a controller or a graphical interface augmented onto a user's physical environment) to associate object characteristics with a physical object. A game may provide virtual indicia on a physical object to indicate to one, more than one, or all users of a game the characteristics of an object. For example, a reward object may be given a green tint by an augmented reality gaming console while a penalty object may be given a red tint by an augmented reality console.
FIG. 4 shows game topology 400 that may include a room having ceiling 431, walls 432 and 433, and floor 434. Games may request that the game be played on a particular surface (e.g., a floor, against a wall, a stairwell, a sink, a backyard, a tree, a pond, a beach). Such a request may assist a system to determine an object's surface structure (e.g., a stairwell or floor). For example, a game may instruct a user to “look at the ceiling and press button A.” The system may then utilize the knowledge of what the user is looking at in order to provide a better augmented reality environment (e.g., an augmented reality game). As per another example, a game may instruct a user to look at the top of a table and provide a manual control signal to confirm the action was taken. The video game system may submit multiple such requests to a user such that, for example, the game may begin to build a model of the user's environment before the user begins to play in the environment. The model of a user's environment can expand and mature as a game progresses as a result of an object detection device (e.g., a video camera). Accordingly, virtual indicia may interact with a user's environment (from the perspective of the user).
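The model-building loop described above (prompt the user to look at a surface, then record what the camera sees under that label) can be sketched as a simple accumulation of labeled point sets. The dictionary-of-lists representation is an assumption for illustration only.

```python
def update_environment_model(model, surface_name, points):
    """Incrementally grow a model of the user's environment: each
    prompt (e.g., "look at the ceiling and press button A") labels the
    points captured at that moment, and the model accumulates them
    under that surface name."""
    model.setdefault(surface_name, []).extend(points)
    return model
```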
User 401 may control object 450 while user 402 may control object 460. In topology 400, user 402 has flown object 460 into ceiling 431. The game system has detected that a virtual/physical collision has occurred and has provided a response that meets this virtual/physical collision condition to the user—in this case a crash that has resulted in object 461 being separated from object 460 and results indicia 420 being displayed to both users 401 and 402. Results 420 may be displayed and may rotate such that both users 401 and 402 may view results 420. Results 420 may include any information such as, for example, the player that won, how many times each particular player has won, the opportunity for a user to request the game be played again, or a replay of the game that was previously played. If a replay is selected, the users may sit back and watch the match be autonomously replayed on their physical surroundings. As such, information about how a game progresses may be saved while a game is being played such that, for example, the information can be retrieved and the game can be replayed to a user (or users).
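The virtual/physical collision condition above can be sketched as a per-frame test of whether an object's motion crosses a detected surface, here simplified to a horizontal ceiling plane. Function names, the crash flag, and the plane-only surface model are assumptions of this sketch, not the patent's method.

```python
def detect_collision(pos, next_pos, ceiling_height):
    """True when motion from `pos` to `next_pos` crosses a detected
    ceiling plane at `ceiling_height` from below."""
    return pos[2] < ceiling_height <= next_pos[2]

def update_plane(plane, next_pos, ceiling_height):
    """On collision, mark the plane crashed so the game can play the
    crash animation and display results indicia; otherwise move it."""
    if detect_collision(plane["pos"], next_pos, ceiling_height):
        plane["crashed"] = True
    else:
        plane["pos"] = next_pos
    return plane
```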
FIG. 5 shows game topology 500 that may include object 550 controlled by one user and object 560 controlled by another user. Persons skilled in the art will appreciate that computer-controlled objects may also be provided and augmented onto the physical surroundings by changing the perspectives of one or more users. The user controlling object 560 may provide a control signal to the game such that object 560 performs an ACTION such as SHOOTING. In this way, for example, a user can control a virtual object and attempt to destroy—via SHOOTING—a virtual object controlled by another user. An object that is attacked (e.g., SHOT) may be damaged and may perform differently. For example, if a virtual plane is shot by another virtual plane, an engine may be destroyed and the speed of the damaged virtual plane may be affected (e.g., slowed).
FIG. 6 shows game topology 600 that may include user 601 controlling object 602 and user 611 controlling object 612 on the surface of a table. Game information indicia 603 and 604 may be displayed to one, some, or all users. For example, game information indicia 603 and 604 may be identical. However, game information indicia 603 may be placed close to user 601 and game indicia 613 may be placed close to user 611. Accordingly, game indicia 603 and 613 may not need to, for example, rotate and may instantly provide information to a particular user. Indicia, such as result and game information indicia, may be provided in three-dimensional lettering in order to increase the whimsical and festive nature of the information and allow the information to be more easily recognized from different perspectives.
FIG. 7 shows game topology 700 that may include game console 750 that is coupled to head-mounted displays via wires 752 and 751. Controllers (and/or head-mounted displays) may be wire-based or wireless. Power may be transmitted wirelessly to head-mounted displays or controllers. Virtual surface 799 may be provided over a surface by augmenting a user's perspective of the surface through a head-mounted display. Accordingly, users may control objects 702 and 703, which may, in turn, be controlled about virtual surface 799. Virtual game information 701 may be provided that may include, for example, the status of a game (e.g., the game is PAUSED) and/or the option to pick another game. Examples of augmented reality games may include, for example, monopoly, car racing, battleship, airplane dog fights, boat racing, and combat fighting. A user may select a virtual indicia representative of an option by, for example, utilizing manual controls, via voice activation, or by moving a virtual object such that the virtual object interacts with the option (e.g., flying a virtual plane through the option to play monopoly).
FIG. 8 shows game topology 800 that may include, for example, stairway 801 in which objects 802 and 803 are computer controlled. A game may, for example, not have any user-controlled virtual objects that interact with a physical surface. A user may be provided with a gun, for example, that may shoot virtual bullets toward computer-controlled objects in order to destroy the computer-controlled objects. The controller may include positioning and orientation devices in order to determine the position and orientation of the controller. Accordingly, this information may be utilized to calculate a trajectory for a virtual projectile, and the virtual projectile may be shot according to the determined trajectory.
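Turning a controller's position and aim into a projectile trajectory can be sketched as simple ballistic integration. The step size, gravity handling, and function signature below are assumptions for illustration, not details from the patent.

```python
def projectile_path(origin, direction, speed, gravity=9.8, dt=0.1, steps=3):
    """Sample points along a virtual projectile's path launched from the
    controller position `origin` along unit vector `direction` at
    `speed` m/s, under simple Euler-integrated gravity."""
    points = []
    x, y, z = origin
    vx, vy, vz = (speed * d for d in direction)
    for _ in range(steps):
        x += vx * dt
        y += vy * dt
        z += vz * dt
        vz -= gravity * dt   # gravity pulls the projectile down over time
        points.append((x, y, z))
    return points
```

Each sampled point could then be tested against detected surfaces to decide where the projectile hits.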
FIG. 9 shows game topology 900 that may include monitors 910 and 920 through which virtual objects 902, 903, and 999 may be seen. For example, a user may look through monitor 910 and may see objects 902, 903, and 999. The monitor may be transparent such that the virtual indicia are displayed on the monitors. Alternatively, video of the environment obstructed by the monitor may be shown on the monitor, and virtual objects 902, 903, and 999 may be displayed on the video feed (in the appropriate location). A video camera may be coupled to the monitor or may reside in the monitor to capture video. The video may be processed to determine the attributes of the physical surroundings in front of the monitor. Monitors may communicate with each other to ensure, for example, that all monitors are viewing the same physical surroundings. For example, the monitors can make sure they are a particular distance from a common origin and are facing the appropriate direction (at the appropriate angle) from that common origin.
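That cross-monitor consistency check can be sketched in two dimensions: a monitor is "aligned" if its reported facing angle matches the bearing from its position to the shared origin within a tolerance. The 2D simplification, function name, and tolerance are assumptions of this sketch.

```python
import math

def monitor_aligned(position, facing_angle, origin, tol=0.1):
    """Verify a monitor faces the common origin: its reported facing
    angle (radians) should match the bearing from `position` to
    `origin` within `tol` radians."""
    bearing = math.atan2(origin[1] - position[1], origin[0] - position[0])
    # Wrap the angular difference into (-pi, pi] before comparing.
    diff = (facing_angle - bearing + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= tol
```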
A portable hand-held gaming system may include a display and game controls (e.g., manual buttons, a directional pad, a joystick, and/or inertial movement and positioning determination systems). The portable gaming system may include a camera. The camera may take video from the back of the portable gaming system such that a user may see the video on a display located on the front of the gaming system. The gaming system may have a rechargeable and/or replaceable battery. The gaming system may wirelessly download games and/or include a port to accept media (e.g., cartridges) that has game logic imprinted thereon. The video may be displayed to the user through the display of the portable gaming system, and the gaming system may interlace virtual indicia into the video feed such that a user can utilize a portable gaming device to play a game. The screen of the portable hand-held may also be a touch-screen. Furthermore, multiple screens (e.g., two or three) may be provided. Real-time video with enhanced virtual images may be provided on one, more than one, or all such display screens. One, more than one, or all such displays may be, for example, touch sensitive. Features of an augmented reality game may be provided to such a hand-held device (e.g., game area selection and play).
FIG. 10 shows environment 1000 that includes virtual hardware 1020 perceived by user 1010 utilizing controls 1005 in a physical environment. Persons skilled in the art will appreciate that voice-based controls and thought-based controls may also be utilized in addition to physical controls. A user may utilize an augmented reality computing platform to augment his/her physical environment in a variety of ways. One such way that an augmented reality computing platform may augment a user's environment is to introduce virtual hardware versions of objects a user desires to use. Virtual hardware may decrease the amount of physical clutter in a user's environment—thus increasing the potential areas for augmentation with virtual indicia. Additionally, virtual hardware devices may be purchased and downloaded on demand. In doing so, for example, a user does not have to wait for a physical delivery of a physical object in order for that user to benefit from the whimsical and festive nature of that object.
Virtual hardware device perspective 1030 may be the perspective of virtual hardware device 1020 as perceived by a user wearing head-mounted display 1010, in the orientation of the user of head-mounted display 1010, in environment 1000. Virtual hardware device 1020 may be, for example, a virtual television set. Such a virtual television set may include, for example, virtual frame 1030, virtual display area 1040, virtual base 1031, and virtual interfaces 1032, 1033, 1034, and 1035. A user may watch, for example, television on virtual display 1040. The user may watch a movie on display 1040. A user may play a two-dimensional video game on virtual display area 1040. For example, a two-dimensional video game may be provided via user-controlled virtual object 1021 on virtual gaming environment indicia 1023 with virtual object 1022. A two-dimensional gaming surface may show a game that is played in a three-dimensional world or a two-dimensional world. Similarly, an augmented reality video game may be provided on a surface (e.g., table top, wall, or floor) that is displayed on a two-dimensional surface (e.g., showing a game set in a two-dimensional or three-dimensional world). Virtual interface 1032 may be utilized, for example, to increase the volume of virtual hardware 1020. Virtual interface 1033 may be utilized, for example, to decrease the volume of virtual hardware 1020. Virtual interface 1034 may be utilized, for example, to increase the channel of virtual hardware 1020. Virtual interface 1035 may be utilized, for example, to decrease the channel of virtual hardware 1020. Persons skilled in the art will appreciate that virtual hardware device 1020 may have virtual speakers, and the computer program defining hardware device 1020 may instruct a head-mounted display to provide audio associated with hardware device 1020 in a manner that the audio is perceived by a user to emanate from such virtual speakers.
FIG. 11 shows environment 1100 that may include a user that utilizes head-mounted display 1110 and controller 1120 to view augmented reality environments and provide controls to an augmented reality computing platform. Persons skilled in the art will appreciate that a stationary and/or portable display monitor (e.g., a handheld device with a display) may be utilized to provide a user with a stationary or portable keyhole into an augmented reality environment. Such a display may include a camera to take video, in which virtual indicia may be interlaced. Such a display may be transparent such that indicia may be provided on the transparent display. Virtual indicia may be provided to a user via contact lenses having displays or via images provided (e.g., shined) into a user's eye (e.g., retina).
Virtual browser 1130 may be provided. Virtual browser 1130 may allow a user to, for example, search an intranet and/or the internet. For example, virtual browser 1130 may be perceived as virtual frame 1140, virtual browsing display area 1136, virtual address bar 1135, virtual back control interface 1131, virtual forward control interface 1132, virtual home interface 1133, and virtual system menu interface 1134. Persons skilled in the art will appreciate that virtual hardware and other objects may be set in particular locations. Such locations may be stored in a user's profile such that a user can turn ON an augmented reality computing platform and a user's physical environment can be populated with virtual hardware and other objects previously selected by the user. Virtual menu interface 1134 may, for example, cause browser 1130 to be replaced with a graphical user interface showing a menu page for an augmented reality computing system. Virtual menu interface 1134 may, for example, cause an augmented reality computing graphical user interface to appear in front of (e.g., aligned with, offset from, or separate from) virtual frame 1140.
Interfaces can be selected from a virtual graphical user interface or virtual hardware in a number of ways. A controller may be utilized as, for example, a pointer such that a virtual laser emanates out of an end of a controller. A virtual laser whose path intersects with a virtual interface may, for example, cause that virtual interface to change its appearance (e.g., appear highlighted). A user may then, for example, press a button to confirm selection of the virtual interface and an associated feature may be executed. Manual controls (e.g., a directional pad) may be utilized to navigate through executable portions of a graphical user interface or virtual object (e.g., virtual hardware). Voice-based and/or thought-based controls may also be utilized.
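The virtual-laser selection test above amounts to a ray/box intersection: highlight an interface when the laser ray from the controller intersects the interface's bounding box. The sketch below uses the standard slab method; the axis-aligned box model and names are assumptions of this example, not the patent's implementation.

```python
def laser_hits_interface(ray_origin, ray_dir, box_min, box_max):
    """Slab-method ray/AABB test: does a virtual laser emanating from
    the controller intersect an interface's axis-aligned bounding box?
    Returns True when the interface should appear highlighted."""
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        o, d = ray_origin[axis], ray_dir[axis]
        lo, hi = box_min[axis], box_max[axis]
        if abs(d) < 1e-12:
            # Ray parallel to this slab: must already lie inside it.
            if o < lo or o > hi:
                return False
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        if t1 > t2:
            t1, t2 = t2, t1
        t_near, t_far = max(t_near, t1), min(t_far, t2)
        if t_near > t_far:
            return False
    return True
```

A button press while the test returns True would then confirm the selection and execute the associated feature.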
FIG. 12 shows environment 1200 that may include a user wearing a head-mounted device that includes a display (e.g., head-mounted display 1210) and utilizing one or more controllers 1220. Virtual graphical user interface 1225 may be provided to a user and may include frame 1230 and virtual interfaces 1231-1250 and 1291-1293. Graphical user interface 1225 may allow a user to select a virtual hardware device to be introduced into the user's actual, physical environment. Persons skilled in the art will appreciate that virtual reality topologies may also utilize virtual reality hardware devices and may include virtual reality hardware similar to the hardware of graphical user interface 1225.
A user may, for example, be allowed to set up one, a predetermined number of, or an unlimited number of a particular virtual hardware device. For example, the purchase of a virtual hardware dartboard may allow a user to deploy up to 3 virtual reality dartboards in his/her actual, physical environment.
Virtual interface 1231 may allow a user to add (e.g., augment) a movie theater screen into his/her physical environment. Virtual interface 1231 may be associated with, for example, one or more virtual television sets. The selection of a virtual interface may cause, for example, a virtual hardware device to deploy or another graphical user interface to be introduced (e.g., to replace interface 1225) so that the attributes of a virtual hardware device may be selected. Virtual interface 1232 may be associated with, for example, one or more telephonic devices, such as mobile telephonic devices. Mobile virtual hardware devices may follow a user (e.g., may follow in front of a user as the user moves around, but may remain hidden until a request is received from the user to utilize a mobile virtual hardware device). Persons skilled in the art will appreciate that different virtual mobile telephones may include, for example, different telephonic features. For example, a virtual telephonic device may include one plan (e.g., a particular number of minutes per month, a particular data plan per month, and a particular number of users per month). For example, a virtual telephonic device may include the ability to play at least certain types of media files (e.g., a particular type of music, games, or video).
Virtual interface 1234 may be associated with, for example, one or more jukeboxes. Virtual interface 1235 may be associated with, for example, one or more dartboards. Virtual interface 1236 may be associated with, for example, one or more pinball machines. Virtual interface 1237 may be associated with, for example, one or more computers (stationary and/or portable). A virtual laptop may include, for example, virtual software such as virtual word processing software, virtual spreadsheet software, virtual gaming software, virtual video playback software, virtual music playback software, or a virtual operating system.
Virtual interface 1238 may be associated with, for example, a virtual browsing device. Virtual interface 1239 may be associated with, for example, one or more game consoles (e.g., static and/or portable augmented reality and/or virtual reality gaming consoles). Virtual interface 1240 may be associated with, for example, one or more picture viewers. Virtual interface 1241 may be associated with, for example, one or more movie libraries. A movie library may, for example, cause a virtual movie-storage shelf to be placed in a room with a virtual movie library. The virtual movies in such a virtual movie library may be rented or may be movies that were purchased by a user. For example, a user may purchase a physical movie-bearing medium (e.g., a DVD) and may be provided with a code that may be entered using an augmented reality computing system to add a virtual copy of the movie to a user's virtual movie library. Virtual interface 1242 may be associated with, for example, one or more virtual projectors. Virtual interface 1243 may be associated with, for example, one or more virtual robots (e.g., robots that provide knowledge assistance via answers to user-initiated questions). Virtual interface 1243 may be associated with, for example, one or more virtual libraries (e.g., library storage shelves with virtual books). Virtual interface 1244 may be associated with, for example, one or more skeeball machines (e.g., the ball of which can be controlled, for example, via a controller). Virtual interface 1245 may be associated with, for example, one or more virtual air hockey tables (e.g., the paddles of which can be controlled, for example, via a controller). Virtual interface 1246 may be associated with, for example, one or more virtual tables for playing virtual objects on/around.
Persons skilled in the art will appreciate that a virtual table may increase the whimsical and festive nature of an augmented reality by, for example, allowing a virtual hardware device to virtually rest on the virtual table instead of being perceived by a user as being suspended in mid-air without any underlying support.
Virtual interface 1247 may be associated with, for example, one or more virtual board games (e.g., checkers, chess, or backgammon). Virtual interface 1248 may be associated with, for example, one or more virtual basketball hoops. Virtual interface 1249 may be associated with, for example, one or more clocks. Virtual interface 1250 may be associated with, for example, one or more virtual media players.
Virtual interface 1291 may allow, for example, a user to add or purchase a virtual hardware device. Virtual interface 1292 may allow a user, for example, to replace a deployed virtual hardware device in an augmented reality setting with another virtual hardware device. Virtual interface 1293 may allow, for example, a user to clear his/her augmented reality environment such that, for example, a user perceives his/her physical environment without any virtual indicia augmented onto the user's physical environment.
FIG. 13 shows environment 1300 that may include a user having head-mounted augmented reality system 1310 and controller 1320. Virtual graphical user interface 1325 may be provided and may be perceived by a user as having frame 1330. A user may select virtual environments. Virtual environments may augment all of, or at least a portion of, a user's actual environment. Virtual environments may be utilized to increase the whimsical and festive nature of a user's augmented environment. Virtual environments may also be utilized by programs, such as augmented reality video games. Virtual interface 1331 may be utilized to provide a user with, for example, a plane cockpit environment. A plane cockpit environment may immerse a user into an environment that is more representative of a plane cockpit than the physical environment of a user. A plane cockpit environment may include, for example, virtual objects that are user-controllable such as, for example, virtual switches, knobs, wheels, and other virtual controls. A plane cockpit environment may include, for example, virtual displays (e.g., virtual altimeter and virtual horizon displays). A virtual cockpit environment may include, for example, a virtual cockpit dashboard. Persons skilled in the art will appreciate that if a virtual reality gaming program is initiated, then a user may be provided with a virtual world. However, such a virtual world may be provided about physical objects such that, for example, a user does not accidentally trip over a physical object. Placement of virtual indicia may be autonomously determined by an augmented reality computing system.
Virtual interface 1332 may be utilized to provide a user with, for example, a tank cockpit environment. Virtual interface 1333 may be utilized to provide a user with, for example, a beach environment. Virtual interface 1331 may be utilized to provide a user with, for example, a jungle environment. Virtual interface 1336 may be utilized to provide a user with, for example, a space environment. Virtual interface 1337 may be utilized to provide a user with, for example, an ocean environment (e.g., an underwater environment or a boat deck environment). Virtual interface 1338 may be utilized to provide a user with, for example, a car cockpit environment. Virtual interface 1339 may be utilized to provide a user with, for example, a Christmas environment. Virtual interface 1340 may be utilized to provide a user with, for example, an arcade environment (e.g., an environment with multiple arcade games). Virtual interface 1341 may be utilized to provide a user with, for example, a virtual store environment (e.g., a virtual electronics store). Virtual interface 1342 may be utilized to provide a user with, for example, a virtual mall environment (e.g., with multiple virtual stores). Virtual interface 1343 may be utilized to provide a user with, for example, a virtual theme park environment. Virtual interface 1343 may be utilized to provide a user with, for example, a virtual zoo. Persons skilled in the art will appreciate that the use of virtual environments may be associated with, for example, a one-time, subscription-based, or feature-dependent cost. Vendors may provide interfaces associated with augmented environments on an intranet with which multiple augmented reality computing systems are in communication. Virtual interface 1344 may be utilized to provide a user with, for example, a virtual aquarium. Virtual interface 1345 may be utilized to provide a user with, for example, a virtual Americana environment.
Virtual interface 1346 may be utilized to provide a user with, for example, a virtual Asian environment. Virtual interface 1347 may be utilized to provide a user with, for example, a virtual Indian environment. Virtual interface 1348 may be utilized to provide a user with, for example, a virtual dating room. Persons skilled in the art will appreciate that multiple users may utilize an augmented environment. Users in other locations (e.g., other houses) may, for example, be represented by a three-dimensional virtual avatar virtualized in the houses of other users. Accordingly, for example, a dating room environment may be provided that allows a particular number of users (e.g., 2) to be virtualized via avatars in the physical environments of the other users such that a more personal form of communication may be perceived. Virtual interface 1349 may be utilized to provide a user with, for example, a virtual lecture environment (e.g., where a teacher provides a teaching avatar to the environments of the students and students provide student avatars to the environment of the teacher). Virtual interface 1350 may be utilized to provide a user with, for example, a virtual game room environment.
Virtual interface 1391 may be utilized by a user to utilize a virtual environment. Virtual interface 1392 may be utilized by a user to preview a virtual environment (e.g., via a virtual display screen augmented into a user's physical environment). Virtual interface 1393 may be utilized to, for example, clear a user's physical environment of virtual environment indicia or of all virtual indicia.
FIG. 14 shows environment 1400. User hand 1401 and user hand 1402 may each hold a controller. For example, user hand 1401 may hold controller 1410 and user hand 1402 may hold controller 1420. Each controller may include positioning and/or inertial movement determination devices such that the location/movement of a controller can be communicated to an augmented reality computing system. Accordingly, a user may utilize his/her hands as general controls to an augmented reality application such as an augmented reality game. A user's hand may be replaced with virtual indicia (from the perspective of the user utilizing an augmented reality head-mounted display). Virtual indicia may be augmented on top of a user's hand. Accordingly, for example, a user may be perceived as holding a virtual object such as a virtual basketball. A user may, gripping a controller, shoot the virtual basketball towards a virtual basketball hoop the user installed in his/her living room. Furthermore, for example, the virtual basketball may interact with a user's physical environment. For example, a user's head-mounted display may include cameras that are operable to view more of a user's environment than the user can perceive. The augmented reality system may determine, for example, the location of walls, floors, and other objects. In or near real time, for example, the augmented system may determine that a virtual basketball thrown by a user along a trajectory has hit a physical surface (e.g., a ceiling) and the augmented system may utilize this information to change the course of the virtual ball (e.g., by causing the ball to be perceived by the user to bounce off the user's ceiling). A user may perceive his/her hand via perception 1450 such that hand 1401 is perceived as hand 1451 having virtual indicia (e.g., ball) 1460. A user may perceive his/her hand 1402 to be hand 1452 having virtual indicia (e.g., star) 1470.
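The bounce behavior described above can be sketched, for example, as a reflection of the ball's velocity vector about the detected surface's normal. This is an illustrative sketch only, not the patented implementation; the `reflect` function, the vector representation, and the restitution (energy-loss) factor are assumptions introduced here for illustration.

```python
# Hypothetical sketch: reflecting a thrown virtual ball off a detected
# physical surface (e.g., a ceiling). The surface is modeled as a plane
# with a unit normal vector; all names are illustrative only.

def reflect(velocity, normal, restitution=0.8):
    """Reflect a 3-D velocity vector off a plane with unit normal
    `normal`, damping the bounce by a restitution factor."""
    dot = sum(v * n for v, n in zip(velocity, normal))
    return tuple(restitution * (v - 2.0 * dot * n)
                 for v, n in zip(velocity, normal))

# A ball thrown up and forward hits a horizontal ceiling
# (the ceiling's surface normal points downward, into the room).
ball_velocity = (1.0, 4.0, 0.0)
ceiling_normal = (0.0, -1.0, 0.0)
bounced = reflect(ball_velocity, ceiling_normal)
print(bounced)
```

A real system would apply this reflection each frame the ball's path intersects a surface mesh produced by the camera-based surface detection the passage describes.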
A user may play an augmented reality game that includes virtual indicia (e.g., a virtual star floating mid-air in a user's physical environment). The user may, gripping a controller, pass his/her hand through the star. In doing so, for example, the star may “stick” to a user's hand. Accordingly, a user may interact and move virtual indicia using his/her hands. One or more buttons may be associated with particular virtual object interactions (e.g., picking up, putting down, using an object). For example, a game may include a virtual jar of virtual liquid. A user may swipe his/her hand through the virtual jar of virtual liquid and the virtual jar may be perceived to stick to a user's hand as the user grips the controller. The user may press one button to release the jar. The user may press another button to use (e.g., drink in the video game) the virtual contents of the virtual jar.
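The "stick to the hand" interaction above might be sketched as a small state machine: the object attaches when the controller's tracked position sweeps within its radius, and buttons release or use it. The class, method names, and distance threshold below are hypothetical illustrations, not the patent's disclosed design.

```python
# Illustrative sketch (all names assumed): a virtual object that "sticks"
# to the user's hand when the tracked controller passes through it, and
# responds to "release" and "use" button presses.

class HeldObject:
    def __init__(self, name, position, radius=0.1):
        self.name, self.position, self.radius = name, position, radius
        self.held = False

    def update(self, hand_pos):
        # Stick to the hand when the hand sweeps within the object's radius.
        dist = sum((h - o) ** 2
                   for h, o in zip(hand_pos, self.position)) ** 0.5
        if dist <= self.radius:
            self.held = True
        if self.held:
            self.position = hand_pos  # follow the hand while held
        return self.held

    def on_button(self, button):
        if button == "release":
            self.held = False
        elif button == "use" and self.held:
            return f"using {self.name}"

jar = HeldObject("jar", (0.0, 0.0, 0.0))
jar.update((0.05, 0.0, 0.0))  # hand passes through the jar -> it sticks
print(jar.on_button("use"))
```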
A controller may include any number of devices. For example, controller 1411 may include inertial movement determination devices 1411 (e.g., an array of accelerometers and gyroscopes), battery 1412, communication devices (e.g., receivers and transmitters) 1413, buttons 1414, feedback devices (e.g., vibrational feedback devices) 1414, and extension ports 1415 (e.g., for one or more peripheral devices such as memory cartridges). Controller 1420 may include, for example, button 1422 located at one end of a controller. A controller may be, for example, generally cylindrical in shape.
FIG. 15 shows physical environment 1500 and augmented reality usable area determination perception 1550. A user's physical environment may include numerous objects. For example, a user's physical environment may be an outdoor environment that is relatively open or includes trees, a pond, and other people. Alternatively, for example, a user's physical environment may be an indoor environment that includes wall 1501, wall 1502, picture 1503, doorway 1504, wall 1507, table 1505, and floor 1506. An augmented reality computing system may, for example, include a head-mounted display that includes a number of video cameras that can capture video of an area larger than a user can perceive. A user, at the start of a game or application, may be asked to turn around 360 degrees at a particular speed such that a camera system may obtain a scan of a room. Alternatively, for example, a system may include cameras that face all directions such that the system can autonomously obtain a 360-degree field of view. The system can then, for example, determine open surface areas over which virtual indicia may be augmented. An environment and/or game may include a list of virtual indicia to augment into a user's environment. This list may be prioritized such that, for example, a game can be played in a room even if there is not enough room to place all of the virtual indicia. Virtual indicia may also be associated with whether the game can be played without the virtual indicia. For example, an enemy character may be associated with an augmentation profile that does not allow the game to be played if the enemy character cannot be augmented adequately into a user's environment. An item of virtual environment indicia (e.g., a virtual picture for a wall) may be associated with an augmentation profile that allows the game to be played even if the virtual picture cannot be augmented adequately into a user's environment. An augmentation profile may be associated with each item of virtual indicia.
For example, an augmentation profile may include the type of object over which the object can be augmented (e.g., one virtual object may be associated with a floor or table surface, another virtual object may be associated with a wall). An augmentation profile may include the area a virtual object requires to move around (e.g., for a computer-controlled virtual character/object). An augmented reality computing system may, for example, determine different area combinations that may be provided such that the system may determine the most virtual indicia that can be introduced into an environment or the most virtual indicia having at least a particular priority. An augmented reality computing system may scale areas and, accordingly, change the behavior of particular virtual objects such that more virtual objects may be introduced into a user's environment. Persons skilled in the art will appreciate that scaling the size and movement ability of virtual characters/objects may, for example, change the difficulty of a game. Accordingly, for example, a user may be provided with handicaps to counter any such difficulty (e.g., in a shooting game a smaller virtual object may be associated with more points).
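The prioritized placement described above might be sketched, for example, as a greedy assignment of virtual objects to detected usable areas, honoring each object's augmentation profile (required surface type and area). The function, field names, and data layout below are assumptions introduced for illustration; the patent does not disclose a specific algorithm.

```python
# Illustrative sketch (names assumed): greedily place virtual indicia into
# detected usable areas, highest priority first, matching each object's
# augmentation profile (surface type and required area).

def place_indicia(indicia, areas):
    """Assign each virtual object to the first compatible free area.
    Returns (placements, unplaced_names)."""
    placements, unplaced = {}, []
    free = list(areas)  # each area: {"surface": ..., "size": ...}
    for obj in sorted(indicia, key=lambda o: o["priority"], reverse=True):
        spot = next((a for a in free
                     if a["surface"] == obj["surface"]
                     and a["size"] >= obj["required_area"]), None)
        if spot is not None:
            placements[obj["name"]] = spot
            free.remove(spot)
        else:
            unplaced.append(obj["name"])
    return placements, unplaced

areas = [{"surface": "wall", "size": 2.0}, {"surface": "floor", "size": 6.0}]
indicia = [
    {"name": "enemy", "surface": "floor", "required_area": 4.0, "priority": 10},
    {"name": "picture", "surface": "wall", "required_area": 1.0, "priority": 1},
    {"name": "table", "surface": "floor", "required_area": 3.0, "priority": 5},
]
placed, skipped = place_indicia(indicia, areas)
```

Under the passage's rules, a game could then refuse to start if a must-place object (like the enemy character) ends up in the unplaced list, while an optional object (like the virtual picture) would simply be skipped.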
Perception 1550 may be determined by an augmented reality computing system, which may, in turn, determine useable areas 1551-1559. Persons skilled in the art will appreciate that a useable portion of an environment may be determined as a useable surface area and/or a usable volume.
FIG. 16 shows, for example, head-mounted augmented reality system 1600. System 1600 may include, for example, display 1611 and display 1612. Persons skilled in the art will appreciate that a head-mounted display may include two displays. These two displays may be transparent (with virtual indicia provided on, or projected onto, the displays) or non-transparent (with virtual indicia interlaced with real-time, or near real-time, video captured of the environment a user would perceive if no head-mounted display were present). The two displays may show different perspectives of an augmented reality such that a user visualizes one perspective of an augmented reality. System 1600 may include, for example, microphones 1621-1623 for picking up sounds from a user's environment (e.g., microphones 1621 and 1622) as well as those spoken by a user (e.g., microphone 1623). System 1600 may include one or more manual switches 1601 (e.g., to turn a system ON or OFF or to receive manual input associated with other actions). Manual-input receiving interfaces 1601-1604 may also be provided to receive manual inputs from a user (e.g., to assist in selecting options on a virtual hardware device, graphical user interface, or browser). System 1600 may include two portions that extend along the sides of a user's head to support the system, each of which may include, for example, one or more speakers 1624, inertial movement determination devices 1633, computing circuitry (e.g., processors) 1690, and rechargeable/non-rechargeable permanent/replaceable batteries 1680.
System 1600 may include, for example, any number of cameras such as cameras 1641-1643. The views of the cameras may overlap such that surface areas and object structures can more easily be determined by an augmented reality system. Any number of inertial movement determination devices 1631 and 1632 may be provided, as well as positioning devices 1634. A system may include, for example, a telephonic capability such that one user of one augmented reality system may verbally communicate with another user of another augmented reality system. Communications device 1660 may include, for example, a telephonic communications component. Augmented reality information (e.g., game information) may also be communicated via communications device 1660 (e.g., and stored at an intermediary server that delivers the information to user-permitted augmented reality users). Wireless power receiver 1670 may be included to wirelessly receive power from a wireless source of power. Persons skilled in the art will appreciate that inertial movement determination devices may be utilized to determine a user's position. Acceleration data may be integrated to determine velocity data. Velocity data may be integrated to determine position data. Positioning devices may include, for example, satellite positioning signal receivers and/or local positioning receivers. Mobile telephone base stations or gaming devices located about a user's environment may signal with a portable system to determine the position of the system via, for example, triangulation. A system may wirelessly communicate with an external processing system (e.g., a stationary video game console). A system may communicate with an external processing system (e.g., a stationary video game console) via wire-based communications. Persons skilled in the art will appreciate that a remote server may be utilized to assist in processing game data and a user may store game, or other augmented reality system, data on a remote server.
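The dead-reckoning step described above (integrating acceleration to velocity, and velocity to position) can be sketched with simple Euler integration. This is a minimal one-dimensional illustration under assumed names; a real system would fuse gyroscope and positioning-receiver data to correct the drift such integration accumulates.

```python
# Minimal sketch of dead reckoning from inertial data: integrate
# accelerometer samples into velocity, then velocity into position,
# over fixed timesteps. One-dimensional for clarity; names assumed.

def integrate(accels, dt, v0=0.0, p0=0.0):
    """Euler-integrate acceleration samples (m/s^2) taken every dt
    seconds, returning final (velocity, position)."""
    v, p = v0, p0
    for a in accels:
        v += a * dt   # acceleration -> velocity
        p += v * dt   # velocity -> position
    return v, p

# Constant 2 m/s^2 for 1 second, sampled in 10 steps of 0.1 s.
v, p = integrate([2.0] * 10, 0.1)
```

Here `v` approaches the analytic 2 m/s, while `p` slightly overshoots the analytic 1 m because of the coarse Euler steps, hinting at why inertial estimates need periodic correction from positioning devices.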
Persons skilled in the art will also appreciate that the present invention is not limited to only the embodiments described. Instead, the present invention more generally involves video games, both mobile and stationary. Persons skilled in the art will also appreciate that the apparatus of the present invention may be implemented in other ways than those described herein. All such modifications are within the scope of the present invention, which is limited only by the claims that follow.
Claims
- A system comprising: a game system for providing a video game; a head-mounted display having a video camera for obtaining images in the proximity of said head-mounted display and utilizing said images to detect a surface of a physical object within the proximity of said head-mounted display, wherein said head-mounted display is operable to communicate information to said game system and said head-mounted display is operable to display a virtual object; and a game controller for providing a first three-dimensional control signal to move said virtual object and a second three-dimensional control signal to change the movement of said virtual object from the movement provided from said first three-dimensional control signal, wherein said first three-dimensional control signal moves said virtual object three-dimensionally over at least three axis of movement in a first manner and said second three-dimensional control signal moves said virtual object three-dimensionally over at least three axis of movement in a second manner causing said virtual object to interact with said detected surface.
- The system of claim 1, wherein said head-mounted display comprises a non-transparent display.
- The system of claim 1, wherein said video game has a selectable playing area.
- The system of claim 1, wherein said game controller includes buttons.
- The system of claim 1, wherein said video game is a board game.
- The system of claim 1, wherein said video game is a sports game.
- The system of claim 1, wherein said head mounted display is powered wirelessly.
- The system of claim 1, wherein said controller is powered wirelessly.
- The system of claim 1, further comprising a second camera.
- The system of claim 1, further comprising a second camera and a third camera.
- The system of claim 1, further comprising a second camera and a third camera and the views of said second and third camera overlap.
- The system of claim 1, further comprising a telephonic communications component.
- The system of claim 1, further comprising a selected virtual environment.
- The system of claim 1, further comprising a switch.
- The system of claim 1, further comprising a microphone.
- The system of claim 1, further comprising a first microphone and a second microphone.
- The system of claim 1, further comprising a first microphone, a second microphone, a second camera, and a third camera.
- The system of claim 1, further wherein said head-mounted display comprises a transparent display.
- The system of claim 1, further comprising a microphone and a second camera.
- The system of claim 1, further comprising a microphone and a second camera, wherein said head-mounted display comprises a transparent display.
