U.S. Pat. No. 8,585,476
LOCATION-BASED GAMES AND AUGMENTED REALITY SYSTEMS
Issue Date: November 19, 2013
U.S. Patent No. 8,585,476: Location-based games and augmented reality systems
Summary:
The ‘476 patent deals with virtual realities and augmented realities that rely on real-world data to create the virtual world. The player’s location in the real world is tracked using GPS or some other location-tracking device, such as a play mat, while the player uses some manner of virtual reality headset. The game correlates that real-world location and movement with location and movement in a virtual world. The augmented reality may rely on visual input from the real world with supplementary elements added by virtual reality. The patent covers both major movements, such as riding in or driving a car or traversing a field, and minor movements, such as hand movements.
Abstract:
Handheld location based games are provided in which a user’s physical location correlates to the virtual location of a virtual character on a virtual playfield.
Augmented Reality (AR) systems are provided in which video game indicia are overlaid onto a user’s physical environment. A landscape detector is provided that may obtain information about the user’s landscape, in addition to the user’s location, in order to provide overlaying information to an AR head-mounted display and control information to non-user controlled video game characters.
1. An augmented reality game system comprising:
a head-mounted display that overlays virtual indicia onto a physical playfield;
memory comprising video game logic that provides a video game;
a wearable processor that utilizes said video game logic to provide video game indicia to said head-mounted display based on said video game logic, wherein said processor is coupled to said memory and said head-mounted display;
a detector that determines landscape characteristics of said physical playfield, wherein said video game logic utilizes said landscape characteristics in providing said video game; and
a locating device that determines the physical location of said locating device on the physical playfield, wherein said video game logic utilizes the physical location of said locating device in providing said video game.
Illustrative Figure
Description
DETAILED DESCRIPTION OF THE DRAWINGS
U.S. Provisional Patent Application No. 60/603,481 filed on Aug. 20, 2004 entitled “Wireless Devices With Flexible Monitors and Keyboards” and U.S. patent application Ser. No. 11/208,943 filed on Aug. 22, 2005 entitled “Wireless Devices With Flexible Monitors and Keyboards” are hereby incorporated by reference herein in their entirety.
U.S. Provisional Patent Application No. 60/560,435 filed on Apr. 7, 2004 entitled “Advanced Cooperative Defensive Military Tactics, Armor, and Systems” and U.S. patent application Ser. No. 11/101,782 filed on Apr. 7, 2005 entitled “Advanced Cooperative Defensive Military Tactics, Armor, and Systems” are hereby incorporated by reference herein in their entirety.
U.S. Provisional Application No. 60/560,435 filed on Sep. 2, 2003 entitled “Systems and Methods for Location Based Games and Employment of the Same on Location Enabled Devices” and U.S. patent application Ser. No. 10/932,536 filed on Sep. 1, 2004 entitled “Systems and Methods for Location-Based Games and Employment of the Same on Location-Enabled Devices” are hereby incorporated by reference herein in their entirety.
U.S. patent application Ser. No. 10/797,801 filed on Mar. 9, 2004 titled “Systems and Methods for Providing Remote Incoming Call Notification for Wireless Telephones” is hereby incorporated by reference herein in its entirety.
U.S. Provisional Patent Application No. 60/367,967 filed on Mar. 25, 2002 entitled “Systems and Methods for Locating Cellular Phones” and U.S. patent application Ser. No. 10/400,296 filed on Mar. 25, 2003 titled “Systems and Methods for Locating Wireless Telephones and Security Measures for the Same” are hereby incorporated by reference herein in their entirety.
Turning first to FIG. 1, gaming system 100 is provided that includes handheld game system 101 and playmat 150.
Gaming system 100 may be a location-based game system in which the physical location (or physical movement) of a user on a physical playfield determines the virtual location (or virtual movement) of a virtual character on a virtual playfield. Location information may be obtained through, for example, any type of triangulation technique such as a GPS system or a localized positioning system (LPS). For example, the time it takes multiple signals from multiple transmitters to reach device 101 may be utilized to determine the position of device 101. Location information may alternatively be obtained through various cell phone or wireless LAN location techniques. For example, a user's signal strength between multiple hubs or base stations may be utilized to determine that user's location. As per another example, inertial movement sensors such as accelerometers and/or gyroscopes may be utilized to keep track of a user's movement in a particular direction. In this manner, the user's location may be determined and updated based on the user's movements. Hybrids of such systems may also be utilized. For example, an accelerometer may be utilized to keep track of a user's position until a second locating signal is provided (e.g., a GPS signal). In this manner, a GPS signal may be the master locating signal while the accelerometer provides location updates between GPS signals. Generally, device 140 is the locating device (or locating devices) for game system 101.
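The hybrid locating scheme described here (inertial dead reckoning between fixes, with GPS as the master signal) can be sketched in Python. The class and method names below are illustrative assumptions, not taken from the patent:

```python
class HybridLocator:
    """Dead reckoning between GPS fixes; each fix resets accumulated drift."""

    def __init__(self, initial_position):
        self.position = initial_position  # (x, y) in feet on the playfield

    def on_inertial_step(self, dx, dy):
        # Inertial update (e.g., integrated accelerometer readings).
        x, y = self.position
        self.position = (x + dx, y + dy)

    def on_gps_fix(self, gps_position):
        # GPS is the master locating signal: it overrides the estimate.
        self.position = gps_position


locator = HybridLocator((0.0, 0.0))
locator.on_inertial_step(1.0, 0.5)  # user walks between GPS fixes
locator.on_inertial_step(1.0, 0.5)
locator.on_gps_fix((2.2, 0.9))      # drift corrected at the next fix
```

In a real system the inertial steps would come from integrating accelerometer and gyroscope readings; here they are supplied directly to keep the sketch short.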
Game system 101 may include manual controls 120 and manual control switch 132 that turns ON and OFF location-based controls. In this manner, a user may still obtain functionality from game system 101 while, for example, sitting on a park bench. ON/OFF switch 131 may control when device 101 is turned ON and OFF.
Persons skilled in the art will appreciate that controls similar to manual controls 120 and 131 may also be provided on an AR game system. Thus, a user may use manual controls to control the location of a video game character in an AR game (e.g., control what first-person perspective is displayed on an AR display) without physically moving. A user may also use manual controls similar to manual controls 120 and 131 to toggle between an AR and VR game, toggle between AR and VR configurations of a game, and toggle from a location-based control scheme to a manual control scheme after an AR game configuration has been toggled to a VR game configuration. Thus, if a user is located in an environment that makes location-based AR gameplay difficult (e.g., a small room or a car), the user can instruct the game system to provide a VR version of the game to be played with a manual controller. Thus, a user may instruct an AR/VR game system to display all virtual indicia on a head-mounted display (and/or render all virtual indicia) and not allow any areas of the display to become transparent. Thus, a user may instruct an AR/VR game system to switch from location-based control to manual input control. For systems with multiple control signals generated from multiple control devices, a switch for alternate control schemes may be provided for each control device. For example, a user may turn a location-based sensor in a head-mounted display OFF, thus allowing a directional pad on a controller to control the location of a video-game character (a user may also turn a switch associated with the directional pad ON). Yet, the user may decide not to turn OFF inertial movement sensors in a hand-held controller, thus deciding not to, for example, use control buttons or a directional pad to replace the functionality of the inertial movement sensors.
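The per-device control switches described above can be sketched as a simple priority check. The switch names and the priority order are assumptions for illustration only:

```python
def movement_source(switches):
    """Pick which device drives character location: the location sensor
    when its switch is ON, else the directional pad, else nothing."""
    if switches.get("location_sensor"):
        return "location_sensor"
    if switches.get("directional_pad"):
        return "directional_pad"
    return None


# In a car: location sensing OFF, pad ON, but inertial sensors left ON
# so the controller can still be swung to control a sword or lightsaber.
in_car = {"location_sensor": False, "directional_pad": True,
          "inertial_sensors": True}
```

Keeping one switch per control device, as the passage suggests, lets each input be substituted independently rather than forcing a single global AR/VR mode.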
Thus, a user may still swing around a hand-held controller to register inertial sensor readings as control signals to, for example, swing around a video game object (e.g., a sword or lightsaber) in a game (e.g., a VR game) when the user is sitting in a car, even though the user could, for example, switch to a directional pad for control of the video game object. Additional examples of a video game object controlled by one or more inertial sensors may include, for example, a fishing rod, tennis racket, baseball bat, pool cue, football (e.g., throwing a football), baseball (e.g., throwing a baseball), steering wheel, clutch, gun (or another object-projecting device or projectile), horse-racing whip, frisbee, net, boxing gloves, or any type of object or action.
Persons skilled in the art will appreciate that a location-based game system may not require a controller in addition to a game system. For example, game system 101 may be fabricated with just one or more location sensors and/or inertial movement sensors without any additional manual controls. In one example, game system 101 may be a low-cost system that only provides a primary control signal to move a virtual character in a virtual world (e.g., move a frog through an environment). Additional manual controls may be provided on a game system (e.g., controls 120) and a game system may include connection ports to receive additional devices such as additional controllers, other game systems, displays (e.g., a TV or a head-mounted display), memory, add-on modules (e.g., software and/or hardware upgrade modules), or any type of peripheral.
Playmat 150 may be provided in order to increase the whimsical and festive nature of playing game system 101. For example, playmat 150 may include indicia similar to environment indicia in a particular game. Playmat 150 may be sized according to the characteristics of a game, or virtual environment, on system 101. For example, if a game on game system 101 has a water component and a land component, playmat 150 may have indicia of a water component (e.g., indicia 152) and a land component (e.g., indicia 151). The size of each of these components may correspond to the movement needed of device 101 to travel across these components in a virtual environment. For example, if a user has to move 5 feet to cross the land component on a level of a game provided by game system 101, then the land component of playmat 150 may be 5 feet long.
Playmats may be distributed with game system 101 in kits. In this manner, multiple playmats may be included in such a kit (e.g., a retail package) that correspond to different environmental indicia on the game. So, for example, the kit may include a level 1 playmat and a level 2 playmat. Alternatively, multiple versions of the same playmat may be included for the same level (e.g., having the same type of indicia) but fabricated in different sizes. Alternatively, a playmat with an adjustable size may be provided. Alternatively still, a playmat may be provided with multiple different play areas (e.g., one half is used for level 1, the other half is used for level 2) or one that utilizes both sides of the mat (e.g., one side is used for level 1, the other side is used for level 2). By including different playmats, or by defining different playmat areas, a user may use a different playmat, or playmat area, depending on how much physical movement is needed to move a virtual game character. Control of a virtual game character may be adjustable such that, for example, one mode is provided where a 1 foot movement moves the virtual character 1 pixel while a second mode is provided where a 2 foot movement moves the character 1 pixel. Playmat 150 may include apertures 180 such that playmat 150 may be secured to a surface (e.g., pegged into the ground). Indicia may also be located on the playmat that correspond to objects in the game. For example, the goal of the game may be included as indicia on a playmat (e.g., indicia 160).
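The adjustable movement modes described above (1 foot per pixel versus 2 feet per pixel) amount to a single scaling function; this minimal sketch assumes integer pixel movement:

```python
def feet_to_pixels(feet_moved, feet_per_pixel):
    """Map physical movement to virtual movement under an adjustable mode
    (e.g., 1 foot per pixel, or 2 feet per pixel on a larger playmat)."""
    return int(feet_moved / feet_per_pixel)


feet_to_pixels(6, 1)  # 1 ft/pixel mode: a 6-foot walk moves 6 pixels
feet_to_pixels(6, 2)  # 2 ft/pixel mode: the same walk moves 3 pixels
```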
Some game systems may use a reference location such that a user is requested to return to that reference location before playing, for example, a particular level (e.g., the next level after a level has been completed). Such a reference position may also be included as indicia on the playmat (e.g., indicia 170). Game system 101 may include display 102 for displaying a video game (e.g., displaying a 2-dimensional or 3-dimensional image). Location device 140 (e.g., a positioning system and/or inertial movement sensing system) may control the movement of a video game character in a virtual world. The movement of the video game character in the virtual world may be displayed on display 102. Display 102 may be, for example, a transparent display capable of having virtual indicia displayed on selective portions of the display. Thus, a user may look down through display 102 and see playmat 150. Any number of virtual indicia may be displayed (in the example of FROGGER, a frog, a number of moving cars and busses, and a number of moving logs). Thus, the static environment of the video game may not need to be displayed on such a transparent display because indicia representative of this static environment may exist on playmat 150. In this manner, a user may look down through a transparent display and see the portion of playmat 150 that is aligned beneath the transparent display. If virtual indicia are supposed to be in the area of the virtual world that corresponds to the area of playmat 150 being viewed through a transparent display of system 101, then those virtual indicia may be appropriately displayed on the transparent display screen. Persons skilled in the art will appreciate that when a user is looking through a transparent display toward the ground, the visibility of virtual indicia may be limited. Functionality may be provided in game system 101 such that display 102 is a transparent display that can be held up to a user's eyes.
Thus, the direction, the location, and the pitch of game system 101 may, for example, be determined and utilized to determine what, and where, virtual indicia are displayed on such a transparent display.
Persons skilled in the art will appreciate that a head-mounted display (or a display that is held up to a user's eyes) may be provided with any number of horizontal, vertical, or otherwise aligned two-dimensional images or three-dimensional images. For example, to reduce the complexity of a hand-held game system 101 with the functionality of a hold-to-eye AR system, the virtual indicia may be provided on the display such that the virtual indicia are provided as flat, horizontal two-dimensional images hovering over playmat 150. One advantage of such a two-dimensional example may be that, functionally, the size of display screen 102 becomes the size of playmat 150 when playmat 150 is viewed through display 102 provided as a transparent display operable to display virtual images. Another advantage is that the same processing that is utilized to render a two-dimensional game (e.g., a two-dimensional FROGGER game) may be utilized to render a two-dimensional AR game. Game system 101 may include a clip 111 attached to game system 101 via band 110.
FIG. 2 shows handheld game system 200 that is fabricated in a shape similar to a virtual character that a user controls by moving game system 200. As illustrated, game system 200 shows the classic game of FROGGER, in which the main character is a frog. Handheld game system 200 is fabricated to resemble a frog, thus adding to the whimsical and festive nature of the functionality of the location-based game. Characteristics other than the shape may be manipulated to increase the whimsical and festive nature of a game system. The paint scheme of the system may, for example, be associated with the colors of virtual indicia operable to be provided by the game system.
FIG. 3 shows one mapping embodiment in which actual playfield 310 may be scaled to virtual playfield 350. The use of a virtual playfield may be utilized, for example, in systems in which movement scaling is desired. In this manner, a user may be provided with the ability to control how far a particular physical movement (e.g., a 1 foot movement) moves a virtual character (e.g., 1 pixel or 2 pixels). The use of a virtual playfield may also be utilized in systems in which multiple players are on the same virtual playfield. Accordingly, location information may be transmitted directly to a second game device or via an intermediary device such as a database (e.g., a database remote from the multiple game devices). Such a database may push information received from one game device to another game device. Alternatively, a gaming device may periodically, or continually, check the database to see if new information is available or what information is available. In order to share information, such as location information, between two or more gaming devices, a security measure may be provided. For example, the user of one gaming device may request that the user of a second gaming device grant permission for the user of the requesting gaming device to retrieve location information of the requested user's game device from a remote database. The requested user may then use his/her game device to grant permission. Permissions can be granted, for example, for the duration of a particular game between the players, any game between the players for a particular game, any game between the players for any game, a particular length of time (e.g., a day or a week), or a particular time of day (e.g., after school). Additional parental controls may be provided that allow a third party to change the permissions of a game device (e.g., such that a child cannot play multiplayer during school hours).
The identification of an individual to a game device may, for example, be done via a username and password. A game device may be operable to communicate wirelessly with any number of services. Thus, a parent may go on the internet and contact a server/database that is operable to communicate with a gaming device, identify himself/herself, and change the permissions the child has for using the system. Such a parental control may be operable to prohibit any type of multiplayer play or games that exceed a particular maturity rating (e.g., a child may be allowed to access Everyone-rated games, but not teen- or adult-rated games).
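The permission grants described above (time-limited grants and time-of-day windows such as "after school") can be sketched as follows; the class structure and field names are assumptions for illustration:

```python
from datetime import datetime, time


class LocationPermission:
    """A grant to share location, limited by expiry and/or time-of-day."""

    def __init__(self, expires_at=None, window=None):
        self.expires_at = expires_at  # e.g., good for a week
        self.window = window          # e.g., (3 pm, 8 pm) for "after school"

    def allows(self, now):
        if self.expires_at is not None and now > self.expires_at:
            return False
        if self.window is not None:
            start, end = self.window
            if not (start <= now.time() <= end):
                return False
        return True


# Grant good until Jan 8, and only between 3 pm and 8 pm each day.
grant = LocationPermission(expires_at=datetime(2025, 1, 8),
                           window=(time(15, 0), time(20, 0)))
```

A parental control could then be modeled as a third party replacing or revoking a child's `LocationPermission` objects through the remote server.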
A parent, the user, or a third party may prohibit play within particular areas or may prohibit game information being sent in particular areas. For example, the manufacturer may prohibit a game system from sending location information to game systems located in different regions such that U.S. players cannot play against Japanese players. Alternatively, the manufacturer may prohibit a U.S.-region game from being played on a U.S.-region system, or any region's system, outside of the U.S. (e.g., in Japan). A game, or game system, may be operable to store a log of game play and a history of where the game device has been located over time. In this manner, games may be fabricated in which the game world corresponds to an actual area of the physical world (such as the planet Earth or New York City). As such, a New York City game may be provided in which you travel New York City playing the game. Accordingly, the game may prompt a player to go to a particular location (e.g., Times Square) to play a mini-game associated with that location (e.g., a Times Square game) in order to obtain a virtual item (e.g., a Times Square T-shirt).
A location-based game (e.g., an AR game) may have its own virtual currency system such that any user can exchange real money for the virtual currency, with the game manufacturer taking a percentage cut, and the virtual currency can be used to buy/sell items. A user can be provided with the option to then convert the virtual currency back to real currency (e.g., the U.S. dollar), with the game manufacturer again taking a percentage of the conversion. In this manner, a number of advertising schemes may be provided in a video game, or an augmented application, such as an AR video game. For example, an advertiser may buy advertising space at a particular location. A user may view the advertisement at that particular location. If a user enters a physical store, or a virtual store, associated with the advertisement's location, then the user may be given a portion of the proceeds of the money collected by the advertiser. Alternatively, an advertiser may designate a maximum amount of advertising money the advertiser is willing to spend for a particular amount of time and the amount of money the advertiser is desirous of paying for each entering, or qualified, customer. A qualified customer may be, for example, any user that enters the store with a device operable to display/receive the advertisement, any user whose entrance denotes the first known entrance of that user to a particular location, any user whose entrance denotes the first known entrance of that user to a particular chain of stores (e.g., McDonalds), or any user that interacts with the advertisement before entering the store. When a user is determined to be a qualified user, a percentage of the advertiser's fee for that qualified user may go to the video game manufacturer (or other third party such as a location-based advertisement service) and a percentage of the advertiser's fee for that qualified user may go to the qualified user.
Alternatively, for example, the store that gave up its virtual advertising space (e.g., in an AR game where physical locations in the world are a part of gameplay) may get a percentage (e.g., the qualified user's percentage). Thus, a store can virtually advertise within an AR video game (or another location-based service, such as an AR information application run on a wearable computer) where third-party advertisements are provided by an advertising service. Each store (or chain of stores) may define a particular type of third-party virtual advertisement to be displayed (e.g., sporting goods ads) and, for example, a minimum price per qualified customer that needs to be met to provide the third-party advertisement over a default advertisement chosen by the store (e.g., the store's own advertisement).
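The fee split among the advertising service/manufacturer, the qualified user (or host store), and the advertiser's remaining budget can be illustrated with simple arithmetic; the percentages here are hypothetical, not taken from the patent:

```python
def split_ad_fee(fee, service_share=0.30, user_share=0.10):
    """Split an advertiser's per-qualified-customer fee: one cut to the
    manufacturer/ad service, one to the qualified user (or host store),
    and the remainder stays with the advertiser's budget."""
    service_cut = fee * service_share
    user_cut = fee * user_share
    return service_cut, user_cut, fee - service_cut - user_cut


service, user, remainder = split_ad_fee(10.0)
```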
Location scaling may be advantageously utilized in a number of situations. For example, one player may scale his/her physical movement differently than another player in order to produce a handicap/advantage, or to allow two players located on physical playfields of different sizes to play the same AR game. Thus, cousins can play against/with each other in the same virtual environment even though one player is located in a small, fenced-in backyard in Texas while the other player is located in a wide, open soccer field in Minnesota.
Information on the actual playfield and virtual playfield may be stored in the memory of a portable device, an intermediary device, or both. Also, information about the actual playfield may be stored on the portable device while the virtual playfield is stored, for example, on an intermediary device. Information about a playfield may include, for example, the parameters of the playfield, scaling information, and the status of the playfield (e.g., where a particular user is located in an actual playfield and where a user is located in a virtual playfield). The information about a playfield may also include where actual objects are located (e.g., the areas of an actual playfield where impenetrable objects such as a house or boulder are located). The information about a playfield may also include where virtual objects are located (e.g., the areas of a virtual playfield where a virtual sword or treasure is located).
Additionally, actual playfield data may not be stored at all so that only virtual playfield data is stored (and scaling information to convert the information to information useable on a particular actual playfield). Thus, scaling may be done only at an intermediary device (or other game device) such that the intermediary device (or other game device) scales right before information is transmitted to a game device. Alternatively, scaling may be done at the game system level such that scaled information may be written directly into a virtual playfield (e.g., such that the intermediary device or other game systems do not have any knowledge of how information is scaled). In this manner, a standard type of location information may be used to transmit information between devices. Such a standard may be 10 pixels for 1 foot such that devices only need to transmit data in the standard and the devices themselves can be set to scale the standard location information in any manner without affecting, for example, remote devices. Further still, virtual playfield data may not be stored at all and actual playfield data may be utilized to operate the game. Such playfields may take the form of, for example, a matrix of data, a matrix of vectors, or a matrix of matrices. In one embodiment, each matrix location may correspond to a pixel or a group of pixels (or a location or a group of locations). A user's location (and/or the location of game characters or game objects) may be stored in such matrices and utilized to operate the game.
Suppose a game system is configured such that movement along actual playfield 310 is recognized by moving through areas that are 1 foot by 1 foot. Suppose virtual playfield 350 operates under a similar scheme except that movement along the virtual playfield is recognized as movement through areas that are 0.5 foot by 0.5 foot. Gameplay between two players located in two different playfields (both of which could be actual or virtual) may occur by simply transmitting the number of location areas that a particular player has traveled through (e.g., player 1 travels vertically two area locations). Such a number may be scaled such that the user's actual movement of 5 location areas (e.g., 5 feet) is transmitted to the process controlling the virtual playfield as 10 location areas. As stated above, each system and/or intermediary may have its own virtual playfield for a one-player or multiplayer game. Persons skilled in the art will appreciate that a two-player game may be provided on a single game system. Particularly, a number of wireless controllers may be provided. A user can set up how much movement of a controller in the actual world would be required to move a virtual character a particular distance in a virtual world. Thus, a user can control the speed of his/her virtual character compared to other users.
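The 1-foot versus 0.5-foot grid example above reduces to scaling an area count by the ratio of grid sizes; a minimal sketch:

```python
def scale_area_count(areas_moved, from_area_ft, to_area_ft):
    """Translate a count of grid areas traveled on one playfield into the
    equivalent count on a playfield with a different grid size."""
    return int(areas_moved * from_area_ft / to_area_ft)


scale_area_count(5, 1.0, 0.5)   # 5 areas on the 1 ft grid -> 10 areas
scale_area_count(10, 0.5, 1.0)  # and back: 10 areas -> 5 areas
```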
A user therefore may start at origin O (noted on actual playfield 310 as position 10,5, or information/location 301) and move up 3 feet and to the left 1 foot to be at location L (noted on actual playfield 310 as position 7,4, or information/location 302). In this manner, data may be stored in a matrix (or other data structure such as a database accessed by a memory pointer) corresponding to an actual playfield location (e.g., matrix location 7,4). Actual playfield data (such as actual location information) may then be scaled and stored on a second playfield data structure (e.g., as location 14,8 on a second matrix) such that, for example, a scaling functionality can be provided. Generally, locations on different playfields are scaled according to a relationship between the structures of the two playfields. Multiple types of information can be written to a matrix, such as a matrix for a virtual playfield. For example, a player identifier and scaling information may be stored in a matrix location. Persons skilled in the art will appreciate that the location of a user can also be stored in data structures as location information, or the location can be determined based on where non-location information is stored in a storage structure (e.g., where in a matrix a player's identification information is stored). Information about virtual indicia, such as virtual characters and objects, may also be stored in such a virtual playfield.
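The worked example above (origin at 10,5, moving up 3 and left 1 to reach 7,4, then doubling to 14,8 on a second matrix) can be sketched as follows, assuming rows decrease when moving up and columns decrease when moving left:

```python
def move(location, up=0, left=0):
    # Rows decrease moving up; columns decrease moving left (assumed).
    row, col = location
    return (row - up, col - left)


def scale(location, factor):
    # Store the same point on a finer-grained second playfield matrix.
    row, col = location
    return (row * factor, col * factor)


actual = move((10, 5), up=3, left=1)  # origin O -> location L at (7, 4)
virtual = scale(actual, 2)            # stored as (14, 8) on matrix 2
```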
A playfield may also have event data (such as data E at location 1,6 on playfield 310, or information/location 303). Event data may trigger an action if something occurs at that event location, in relation to that event location, or in relation to the event data. For example, event data may be configured such that if a user (e.g., a manually controlled virtual character) enters the location storing event data E, or location 303, an event occurs (e.g., the level is completed or the game develops in a particular way). As per another example, event data may be configured such that if a user accomplishes a goal (e.g., shoots a rocket-propelled grenade at the location) and the rocket-propelled grenade hits, or enters, the location, then the event data may change the characteristics of the location such that an impenetrable location becomes accessible via an access pathway.
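The event-data idea can be sketched as a playfield grid whose cells may hold callbacks that fire when a character enters them; this structure is an illustrative assumption:

```python
def make_playfield(rows, cols):
    # Each cell may hold event data (here, a callable) or None.
    return [[None] * cols for _ in range(rows)]


def enter_cell(playfield, row, col, game_state):
    event = playfield[row][col]
    if event is not None:
        event(game_state)  # fire the event, e.g., complete the level


field = make_playfield(12, 8)
field[1][6] = lambda state: state.update(level_complete=True)  # event E

state = {"level_complete": False}
enter_cell(field, 1, 6, state)  # the character steps onto location 1,6
```

The grenade example would work the same way, with the callback rewriting the cell's own characteristics (e.g., replacing an impenetrable marker with a pathway) instead of ending the level.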
A processor may move, and operate, the virtual objects in such a virtual playfield according to code representative of the game. Code representative of the game can take account of particular situations such as, for example, when a computer-controlled video game character nears, or enters the location of, a player-controlled video game character. The code can use this information and other information (e.g., additional manual control information from a player such as the status of ATTACK and DEFEND buttons) to determine what happens to the manually or computer-controlled game character on the virtual playfield. Thus, the video game can, for example, be executed on the system(s) housing the virtual playfield. If, for example, the virtual playfield is located on a remote system such as a remote database (or a game system remote from head-mounted displays), each separate gaming device, or head-mounted display, may only need to have the ability to receive and display video/sound representative of the video game environment and send location and control information to the system playing the video game. Such an embodiment provides lightweight and inexpensive head-mounted displays or game devices that may not need to store an actual playfield. Such devices can transmit location and control information scaled to a standard format (e.g., 0.5 foot movements) as well as other information such as the direction the user is facing, the perspective of the user (e.g., how the user's head is tilted, such as the pitch and roll of the head), and the status of any additional buttons. Persons skilled in the art will appreciate that sensors can be utilized to determine the pitch, yaw, or roll of a device such as a head-mounted display, and such information may be utilized by an AR game system (e.g., utilized to select and display virtual indicia on a virtual display).
In a dedicated single-player game with no scaling capabilities, for example, actual playfield matrix 310 (or a playfield matrix) may be utilized to store location information and operate the game. As a result, the actual playfield matrix 310 may be visualized as a virtual playfield matrix. In this manner, a processor may periodically (or in real time) update the location of game objects or the status of the game (e.g., environment objects, enemy objects, and interactive objects) based on particular characteristics of the game. For example, if, for any period or at any time, a user's location corresponds to the location of an end object (as stored in actual matrix 310), then the game may determine that the game has ended and display associated graphics to the user by way of a display.
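The end-of-game check described above can be sketched as follows; the matrix contents and the "END" marker are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch: a dedicated single-player playfield matrix
# (no scaling) holding an end object; the game ends when the user's
# location matches the stored end-object location.

actual_matrix_310 = [
    [None, None, None],
    [None, None, "END"],   # end object stored at row 1, column 2
    [None, None, None],
]

def update_game(user_location):
    """Check the user's location against the actual playfield matrix."""
    row, col = user_location
    if actual_matrix_310[row][col] == "END":
        return "GAME_OVER"   # display associated end-of-game graphics
    return "RUNNING"
```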
In a multiplayer game with two portable systems and a remote intermediary, the intermediary may, for example, utilize playfield matrix 350 to update game locations while the individual systems use different playfield matrices. One portable system may, for example, have a playfield matrix similar to playfield matrix 310. The other portable system may have, for example, a playfield matrix similar to playfield matrix 350. As a result, even though data is handled at a different scale on each portable system, the data is scaled to the same matrix (e.g., the same virtual playfield). Thus, the two portable systems may easily play with each other, even if the two portable systems are using completely different playfields of different sizes (e.g., one player is playing on a basketball court in Japan while a second player is playing in his/her backyard in the USA). Such players may choose an actual playfield dimension from a variety of dimensions (e.g., 10 feet by 10 feet or 20 feet by 20 feet), and information may be stored according to these dimensions and transmitted to an intermediary (e.g., a remote database). Information may be transmitted periodically or in real time. Information may be received periodically or in real time (in order to, for example, obtain opponent player information). Transmissions may alternatively occur when information changes. The portable systems may be in charge of transmitting/receiving. The intermediary (or intermediaries) may be in charge of transmitting/receiving. Or, for example, the portable system may be in charge of transmitting information when that portable system's location changes and the intermediary may be in charge of transmitting information (e.g., when the portable system receives information) when the opponent player's (or any one of multiple opponent players') location changes or a game characteristic changes (e.g., the location of a virtual game character not associated to an actual user changes).
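The scaling of differently sized actual playfields onto one shared virtual matrix can be sketched as a coordinate normalization; the 20-unit standard size and the example dimensions are assumptions for illustration.

```python
# Hypothetical sketch: each portable system scales its own playfield
# dimensions to a shared standard matrix so that remote players on
# different-sized actual playfields can interact on the same virtual one.

STANDARD_SIZE = 20.0   # shared virtual playfield, e.g. 20 x 20 units

def to_virtual(actual_xy, actual_size_ft):
    """Scale a location on an actual playfield to the shared virtual matrix."""
    scale = STANDARD_SIZE / actual_size_ft
    return (actual_xy[0] * scale, actual_xy[1] * scale)

# The center of a 10 ft x 10 ft court and the center of a 20 ft x 20 ft
# backyard both map to the same virtual location.
a = to_virtual((5.0, 5.0), 10.0)
b = to_virtual((10.0, 10.0), 20.0)
```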
FIG. 4 shows semi-visible display 452 in which an AR game is provided. Landscape detector 451 may be provided in such a system such that the location of actual environmental objects may be determined and utilized in the game. In environment 410, environmental objects 411 and 412 are present. Accordingly, the landscape detector may determine the location of environmental objects 411 and 412 and record this location in memory (e.g., in a remote or local playfield matrix or data structure). The game may then utilize such environmental objects to change the characteristics of the game (e.g., where virtual game characters not associated to actual users may move in a virtual environment). Thus, computer-controlled characters may not be able to move into areas with such environmental objects, and the movement profiles of a computer-controlled character may change as a result of the environmental object. As such, a default traveling route that a computer-controlled character is coded to travel (or a default behavior that a character is coded with) may be changed due to an environmental object.
Landscape detector 451 may take multiple forms. For example, landscape detector 451 may measure the distance to objects and the shape of the objects. This may be done, for example, by sound/light sensing (e.g., reflective sensing such as sound/light echoing, or color sensing with respect to a light source) or any type of object/distance sensing (e.g., infrared). A camera may be utilized to take a picture/video of the environment. Video processing features may be utilized to determine landscape as, for example, a user moves around the landscape. Such determined landscape may be stored in the game system (or a remote system) and utilized to determine where to place virtual images on a display (e.g., a head-mounted display) based on the alignment of that display to the actual environment. Landscape sensing may be done throughout game play or at set times (e.g., before a game is started or periodically). Such information may be stored in memory as part of an environment profile and associated with location information, such as global positioning information, such that the environment profile may be utilized whenever that user plays in that environment (e.g., at a particular longitude and latitude coordinate).
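Storing an environment profile keyed to a location, so it can be reused whenever the user returns to that environment, can be sketched as below. The 4-decimal coordinate rounding and the profile contents are hypothetical choices, not details from the patent.

```python
# Hypothetical sketch: sensed landscape stored as an environment profile
# keyed by coarse latitude/longitude, then reloaded on later sessions.

profiles = {}

def _key(lat, lon):
    # Round so nearby GPS fixes map to the same stored environment.
    return (round(lat, 4), round(lon, 4))

def save_profile(lat, lon, objects):
    profiles[_key(lat, lon)] = {"objects": objects}

def load_profile(lat, lon):
    return profiles.get(_key(lat, lon))

save_profile(40.71230, -74.00600, [{"shape": "box", "at": (3, 4)}])
# A slightly different GPS fix at the same field still finds the profile.
again = load_profile(40.71231, -74.00601)
```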
Alternatively still, environmental data may be coded directly into the game. For example, if the military desires an AR system for special operations warfare, then a particular field may be utilized for game play. That particular field may include environmental objects, and the location and size of these environmental objects may be coded directly into the system (e.g., in a playfield matrix or data structure). Virtual interfaces may help code such data by allowing a user to, for example, pick shapes, place them on a virtual playfield, and manipulate the shapes until they correspond to actual shapes on an actual playfield. In this manner, functionality may also be added to objects. For example, if the object is a house, the object may be coded as hollow and may allow a user to pass through a particular portion of the object (e.g., the portion that correlates to a doorway) but not other portions of the object (e.g., the portion that correlates to a wall). Virtual objects generated by a Location-Based (LB) or AR game (or other AR or LB service) that do not correspond to actual objects may operate in a similar manner (such that if a user tries to pass through a wall, his/her perspective does not change). This may be an advantage over scanning techniques because scanning techniques may denote the object as solid if the door is closed when scanning occurs.
As a result of system 400, virtual characters may be displayed on display 452 and augmented over reality. Using the above example, computer-controlled enemy soldiers may be placed in the house and may have the functionality of only being able to move inside of the house (manually controlled enemy soldiers may similarly be restrained from moving out of the house). Therefore, U.S. soldiers using augmented reality system 400 may be able to train with highly skilled virtual combatants in real environments. Such combatants may also be controlled remotely by other users (e.g., instructors) operating in a similarly sized real building on a different field. In this manner, live rounds may be used in a safe environment (if, for example, only one person is playing in each actual environment). Sensors may be placed on a physical door to the building to determine whether the door is OPEN or CLOSED. Alternatively, the landscape detector may be configured to recognize open doors, or the AR system may determine that an access pathway is available in the real world once a user enters into such an access pathway (e.g., a location the game system thinks is blocked may be changed to unblocked after a game system moves into the location). Thus, the status of an actual, physical object (e.g., a door) may affect the game. Using the above examples, enemy soldiers may be able to move out of the building once an access pathway is created that leads into the house (e.g., where computer-controlled characters, or other indicia, can only leave through such access pathways or other exits). Access pathways may also be created in virtual objects not present in the physical, actual world. For example, a virtual house, or other object, may be augmented over a user's environment.
If a user tries to pass through the object improperly (e.g., without opening a door or performing another action such as blowing a hole in the virtual object), the user may be able to walk forward in his/her physical environment, but the augmented reality perspective/scene provided to the user may not change. Actions (e.g., solutions) may be stored in the game that, when triggered, create an access pathway in a virtual object. Such an action may be, for example, pressing an OPEN button, or performing an OPEN function, on a virtual door, blowing a hole into a virtual wall via a gun or bomb, placing a virtual key (e.g., a virtual key obtained earlier in a game) near the virtual door or a virtual keyhole associated to a door, properly using an alphanumeric virtual access pad on the virtual door, or getting past a particular stage in the game (e.g., lighting a computer-controlled virtual game character on fire so that the virtual game character crashes through the door).
Device 400 may have any component of a location-based game including, for example, positioning systems (e.g., GPS systems) and movement sensors (e.g., accelerometers and/or gyroscopes). Furthermore, additional sensors may be utilized to determine the pitch at which the head-mounted display is pointed, the height of the head-mounted display, the roll of the head-mounted display, and the direction in which the head-mounted display is pointed. The perspective of virtual game characters may be, for example, determined by such pitch, height, roll, direction, and location information.
FIG. 5 shows virtual environments 501 and 550 that may be rendered using actual environment data and may be used in, for example, a location-based game (e.g., an AR game) in a variety of different ways. Persons skilled in the art will appreciate that the environments of a figure (e.g., environments 501 and 550) may also be considered playfields (e.g., virtual game playfields) or may be virtual indicia (e.g., virtual game objects and characters) augmented over actual, physical environments.
In one methodology, actual, physical objects that are scanned in by a landscape detector (or manually entered) may be virtualized as virtual objects in the video game. Thus, information regarding actual, physical environmental objects may be saved on a virtual playfield that stores information for a game, such as a game matrix/database or other data structure. Persons skilled in the art will appreciate that such virtual objects may be displayed to a user or may be used only as game information that game code uses to execute a game. In an example where virtual environmental objects are displayed back to a user that has the associated physical environmental objects in his/her view, the virtual environmental objects may be augmented to replace, or lay over, the physical objects. Such a technique may be utilized to provide a more seamless execution of an AR game, as virtual characters may be aligned with the virtual objects. Thus, if the game becomes misaligned to the landscape on a display (e.g., a semi-visible head-mounted display), the characters may still be aligned with the virtual environmental objects and the misalignment may not be noticed. Also, physical objects may be recognized as described above, their appearance manipulated, and the result used to augment a viewer's perspective of the actual environmental object. For example, suppose a game has a number of immobile, non-hollow objects (e.g., crashed spaceships, dead dinosaurs, gigantic trees); the size of scanned-in environmental objects (e.g., houses, boulders, trees) may be compared to the size of the game objects. Game objects that are close in size to an environmental object may be rendered in the location of the actual, physical environmental object until, for example, all of the actual, physical environmental objects have been augmented with environmental game objects. Such a feature may increase the available area of a playfield that a user may walk to.
Such a feature may make playing the same game in different playfields appear different as virtual environmental objects are placed differently in the different playfields. Such a difference may result in a difference in the difficulty of the game.
In environment 501, environmental objects 511 and 512 may be virtualized as 2-D objects in a 3-D space. Alternatively, as shown in environment 550, environmental objects 511 and 512 may be virtualized as 3-D objects in a 3-D space (e.g., either by 3-D objects or by 2-D objects that define the perimeter of the 3-D object). If the environmental sensor can only sense the distance to an object and, for example, the 2-D size of the object, then program logic may be provided to generate a 3-D image from the sensed 2-D image. For example, program logic may assume that a 2-foot tall and 2-foot wide 2-D object is 2 feet deep and, in this manner, may determine depth information based on height and width information. Such depth information may be updated as more information is gathered (e.g., as a user walks to a point where the depth of the object may be sensed). Persons skilled in the art will appreciate that two still images of an object taken from different perspectives may be utilized to determine additional dimensions of an object when compared to when only a single image is evaluated. Thus, the information on the actual environment may be ever-changing and utilized continually, or periodically, by the game system.
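The depth-assumption logic above can be sketched in a few lines; the choice of width as the assumed depth, and the function name, are illustrative assumptions.

```python
# Hypothetical sketch: when the sensor reports only the 2-D size of an
# object, assume its depth equals its width until depth can be sensed
# (e.g., after the user walks around the object).

def estimate_depth(height_ft, width_ft, sensed_depth_ft=None):
    """Return the best available depth estimate for a sensed object."""
    if sensed_depth_ft is not None:
        return sensed_depth_ft   # update once real depth data arrives
    return width_ft              # e.g. a 2 ft x 2 ft object assumed 2 ft deep
```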
In another methodology, virtual environmental data may be utilized to display virtual environments on areas of the head-mounted display (or other display). This can provide numerous functionalities. For example, if two players play on different fields with different objects, virtual objects from an opponent's field may be generated on a user's field such that both opponents have the same type of area to play in. Each user may then not be able to move into environmental objects from the other player's environment. Alternatively still, the location of actual environmental objects may determine the placement of, for example, non-interactive and/or impenetrable virtual objects. For example, if an augmented reality game has an item shop, that item shop may be overlaid over a large actual environmental object (e.g., a wall) such that the user's “free space” is maximized.
FIG. 6 shows virtual character movement between environmental objects in environments 601 and 650. Persons skilled in the art will appreciate that the environments of a figure (e.g., environments 601 and 650) may also be considered playfields (e.g., virtual game playfields) or may be virtual indicia (e.g., virtual game objects and characters) augmented over actual, physical environments. In environment 601, no environmental objects may be present. Therefore, logic may be provided such that virtual character 610 moves along path 611 while virtual character 620 moves along path 621. In environment 650, actual environmental objects may be present and stored in memory as virtual objects 651 and 652. Alternatively, no actual environmental objects may have been scanned, but virtual objects may otherwise be provided at the locations of virtual objects 651 and 652 (e.g., may be originally coded into a game's code or intelligently placed at the locations based on the actual environment). If the stored location of virtual object 651 impedes on the planned movement of a virtual character, then program logic (e.g., game code) may manipulate the movement path of the virtual character around these virtual objects. If the movement path is dynamic (e.g., not laid out), then such virtual environment locations may not be “moved into” by a dynamic virtual character. Persons skilled in the art will appreciate that a number of intelligent placement features of virtual objects/characters may be provided. For example, a virtual object/character may be placed in the middle of a particular open space to maximize the chance of the virtual object/character coming within a particular distance of a user (e.g., to maximize the virtual object/character's field of vision on the virtual playfield). Alternatively, virtual objects/characters may be placed in locations determined to be strategically advantageous.
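Rerouting a character's movement path around stored virtual objects can be sketched with a simple grid search. A breadth-first search is one assumption for how the game code might do this; the patent does not name a particular pathfinding algorithm, and the grid size and coordinates below are hypothetical.

```python
# Hypothetical sketch: a computer-controlled character's path is rerouted
# around virtual objects stored in the playfield using breadth-first search.
from collections import deque

def route(start, goal, blocked, size=6):
    """Shortest grid path from start to goal that never enters blocked cells."""
    frontier, seen = deque([[start]]), {start}
    while frontier:
        path = frontier.popleft()
        x, y = path[-1]
        if (x, y) == goal:
            return path
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            cell = (nx, ny)
            if (0 <= nx < size and 0 <= ny < size
                    and cell not in blocked and cell not in seen):
                seen.add(cell)
                frontier.append(path + [cell])
    return None   # goal unreachable: the dynamic character stays put

# Virtual objects at (0, 1) and (1, 1) force a detour around them.
path = route((0, 0), (0, 2), blocked={(0, 1), (1, 1)})
```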
For example, if there is only a single entrance to a playfield or playfield area (e.g., a doorway to a room), then virtual objects/characters can be spread about, either behind virtual objects placed in the playfield (from the perspective of someone standing in the door) or placed such that multiple objects/characters have different perspective angles to the door (e.g., such that virtual enemy soldiers can cover the doorway from different angles). Thus, cooperative defensive and offensive tactics may be used by the virtual game objects/characters.
Visibility of virtual objects/characters may be set and modified for a head-set augmented reality system. For example, a user may define a virtual visibility of 100 feet. Virtual barriers may then be provided (e.g., rendered and augmented on a display) at this proximity (or at predefined boundaries) to immerse the user in the augmented world. For example, if a child is playing in a city, then the virtual barriers may take the form of fog, and this fog may hide any skyscrapers that would take away from the believability of the augmented reality.
FIG. 7 shows a playfield data structure in which environmental object data 701 (e.g., barrier information data) may be stored. Such a playfield data structure may be used as the primary information structure for the game, such that program logic refers to information in the playfield structure to determine game events (e.g., point scoring, deaths, level completions).
FIG. 8 shows expanded virtual character information (shown as information C 802 and information Cc 801) stored in a playfield data structure. If a character takes up more than one pixel (or more than one data block, if data blocks are associated with size), then one of the data locations may be the controlling location Cc. Thus, program logic may just move the controlling character information data Cc, and this controlling character information may then be utilized to generate the surrounding character information (e.g., C). Any sort of character information may be associated to, for example, a control location or any other character information. For example, the distance a computer-controlled virtual character can see (e.g., the distance at which the character can read information on a virtual playfield) can be associated to the control information (e.g., the control location) for that character. As per another example, the distance over which a character can use a particular attack (e.g., project a projectile using a sling-shot versus shooting a bullet out of a gun) can be related, in a game's code, to the control information (e.g., the control location) for that character.
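Moving only the controlling location Cc and regenerating the surrounding character cells C can be sketched as below; the 2x2 footprint and the offset representation are hypothetical choices for illustration.

```python
# Hypothetical sketch: a multi-cell character is defined by its controlling
# location Cc plus fixed offsets; game logic moves only Cc, and the
# surrounding character cells C are regenerated from it.

FOOTPRINT = [(0, 0), (0, 1), (1, 0), (1, 1)]   # Cc sits at offset (0, 0)

def character_cells(cc):
    """All playfield cells the character occupies, given control location Cc."""
    return [(cc[0] + dx, cc[1] + dy) for dx, dy in FOOTPRINT]

def move_character(cc, delta):
    """Move Cc by delta and regenerate the surrounding cells."""
    new_cc = (cc[0] + delta[0], cc[1] + delta[1])
    return new_cc, character_cells(new_cc)

cc, cells = move_character((2, 2), (1, 0))
```

Per-character data such as sight distance or attack range could then be stored once, attached to Cc, rather than duplicated across every occupied cell.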
FIG. 9 shows possible data structures that may be utilized for a playfield data structure. Matrix of pointers 910, matrix of descriptors 920, matrix of vectors 930, or matrix of tables/matrices 940 may be utilized. Matrix of pointers 910 may include, for example, pointers that point to memory locations in which a large amount of information is stored (e.g., in which a vector of information is stored). Matrix of descriptors 920 may be, for example, a matrix of the information needed for a particular playfield location. Matrix of vectors 930 may have one or more matrix locations associated to one or more playfield locations and may include complex information in any form (e.g., vector form or table form, as shown in matrix of tables 940). Any data structure, such as any type of database, may be utilized or, alternatively, actions and information may be written as additional code into a game's code or, alternatively still, used by a game's code as code updates.
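Two of the structures named above can be contrasted in a short sketch: a matrix of descriptors holds a simple code per cell, while a matrix of tables holds richer per-cell records. The cell contents here are hypothetical.

```python
# Hypothetical sketch contrasting a matrix of descriptors (plain codes
# per playfield cell) with a matrix of tables (a per-cell record holding
# arbitrarily rich information, as in matrix of tables 940).

descriptors = [
    ["open", "wall"],
    ["event", "open"],
]

tables = [
    [{"type": "open"},                {"type": "wall", "height_ft": 8}],
    [{"type": "event", "event": "E"}, {"type": "open"}],
]
```

A matrix of pointers would behave like `tables` with shared records: many cells referencing one object, so updating the object once updates every cell that points to it.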
FIG. 10 shows virtual characters 1010 and 1060 generated on display screens 1001 and 1051. Depending on the distance of virtual characters (either controlled by an opponent or computer controlled) from the display (or from a locating device on the system), the size of the virtual characters that are shown may be manipulated. In this manner, a user is provided augmented reality indicia that is scaled to the perspective of that user (e.g., the height, pitch, roll, distance, and location of the perspective). As such, a true three-dimensional virtual object/character can be provided whose size scales according to the height, pitch, roll, distance, and location of the perspective to the virtual object/character (e.g., the perspective of a user).
Logic may be included such that virtual characters (either controlled by an opponent or computer controlled) are transparent (e.g., virtual object/character 1160 of environment 1150 of FIG. 11) or non-transparent (e.g., virtual object/character 1110 of environment 1101 of FIG. 11). A user may be able to manually control the transparency or the contrast of indicia that is being displayed. Transparent objects may offer the additional functionality of making the environment safer to move in. For example, a user may trip over a rock if that user is in an “invincible mode” and runs through a virtual character that covers the rock by being non-transparent. A user is less likely to trip over a rock if that user can see the rock, in some form, through a character. The transparency of a character may also change depending on the user's distance from that character. For example, far-away characters may be semi-transparent (noting that they cannot interact with you yet, such as by shooting you), while characters that you are sharing a location with are almost entirely transparent. Persons skilled in the art will appreciate that the transparency of a virtual object may be changed by, for example, changing the number of pixels of an image that are displayed. The fewer pixels that are used to depict a character, the more transparent the character (or virtual object) may become.
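The distance-dependent transparency rule above can be sketched as a small mapping from distance to an alpha value. The specific 30-foot threshold, the 0.5 far-field alpha, and the 0.1 floor are all hypothetical tuning values, not figures from the patent.

```python
# Hypothetical sketch: alpha (opacity) of a virtual character as a
# function of distance. Far-away characters render semi-transparent;
# characters at or near the user's own location render almost entirely
# transparent so the user can still see real obstacles through them.

def alpha_for_distance(distance_ft, far_ft=30.0):
    """Return an opacity in [0.1, 0.5]; lower means more transparent."""
    if distance_ft >= far_ft:
        return 0.5   # far away: semi-transparent
    # Fade toward near-full transparency as the character gets closer.
    return max(0.1, 0.5 * distance_ft / far_ft)
```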
Virtual characters (either opponent-controlled or computer controlled) may have certain functionalities that have certain functionality envelopes. For example, a virtual character may have an interactive object like a weapon (e.g., a gun) that may shoot an interactive bullet over a particular distance. Such envelopes may take any form. For example, if a virtual character has a force field, then the force field may be mapped a particular distance around, for example, a controlling location. Thus, an event may occur if a different character (such as a user-operated character) walks into this envelope (e.g., the user's health may decrease).
As illustrated in FIG. 12, such envelopes may be utilized to determine how far a virtual character may see. For example, one character (e.g., information C1 1201) may only see in the direction that character is facing (e.g., information S1 1202). A second character may see a particular distance around that character (e.g., information S2 1203). These envelopes may overlap (e.g., information S3 1204) such that if a user walks into S1, C1 can attack (or some other functionality may be utilized by the game). If a user walks into S2, C2 can attack (or some other functionality may be utilized by the game). If a user walks into S3, both C1 and C2 can attack (or some other functionality may be utilized by the game).
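The overlapping-envelope check can be sketched as follows. For simplicity this sketch uses omnidirectional (circular) envelopes only, whereas the patent's S1 is directional; the character names match FIG. 12 but the centers and radii are hypothetical.

```python
# Hypothetical sketch: circular sight envelopes around characters C1 and
# C2. Walking into an envelope lets its owner attack; walking into the
# overlap region (S3) lets both attack.

def in_envelope(user, center, radius):
    dx, dy = user[0] - center[0], user[1] - center[1]
    return dx * dx + dy * dy <= radius * radius

def attackers(user, envelopes):
    """Names of every character whose sight envelope contains the user."""
    return [name for name, (center, radius) in envelopes.items()
            if in_envelope(user, center, radius)]

ENVELOPES = {"C1": ((0, 0), 5), "C2": ((6, 0), 5)}
both = attackers((3, 0), ENVELOPES)   # inside the overlap region S3
```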
FIG. 13 shows system 1300 in which sections of an actual playfield are scanned in before gameplay begins. Such scanning may be performed from the perimeter of an area (or multiple perimeters). Alternatively, the game may be fabricated to utilize scanning from the center of an area, and the user may be directed to rotate at a particular speed for a particular period of time (or until a particular number of rotations have occurred). Alternatively, virtual configuration indicia (e.g., a virtual character) may be displayed on a display (e.g., a user's head-mounted display) during set up. Manual controls may allow a user to change the location of the configuration indicia and acknowledge when the desired configuration is obtained. Thus, a user may look at a particular portion of an actual playfield, move his/her head up and down, and continually change and acknowledge the location of the virtual indicia so that the virtual indicia is aligned with the landscape (e.g., is standing on the landscape). The game code (e.g., the code aligning the image on the display) may be dynamically updated such that as a user moves his/her head, the perspective to the virtual indicia changes, and a user can correct any misalignment that occurs at any perspective; such misalignment errors may be used to update the game's code (or rendering/alignment code). Manually entered acknowledgment information for alignment via manual controls may be utilized to generate a representation of the physical landscape a user is playing the game on. Thus, actual, physical environmental data can either be, for example, scanned by a landscape detector prior to a game (or during a game), built using a computer administration interface, or manually entered on the fly via configuration and acknowledgment controls.
FIG. 14 shows game system 1401 and network topology 1450.
Game system 1401 may include, for example, any number of power sources 1420, output devices 1425, memory 1430, connection terminals (e.g., input and output interfaces) 1435, additional components 1440, location devices (e.g., GPS, LPS, accelerometers, gyroscopes, inertial movement sensors, and hybrid location systems) 1445, manual input controls 1450, wireless transmitters and receivers 1455 and other communication transmitters and receivers (e.g., Bluetooth, WiFi, wireless LAN, infrared, and radio), and landscape device 1465.
Topology 1450 may include remote facilities such as content or administrator facilities 1480 with intermediaries such as remote databases 1481 (or content providers) and network 1451 (e.g., a wireless network such as a wireless LAN based network). Internet portals 1461 and 1471 may also be provided such that information may be published and downloaded from web-based game systems (e.g., via cellular phone game systems). Portable gaming devices 1460 (e.g., handheld device 100 of FIG. 1, head-mounted device 452 of FIG. 4, or a cell phone) may be utilized as game systems. Alternatively, stationary devices (e.g., home game systems) may be utilized to generate virtual game characters on an augmented reality system. As mentioned, wireless phones may include location devices such that wireless phones may, for example, download program logic and be utilized as location-based game systems or as control devices for augmented reality game systems. Any third-party service may be utilized by an AR game system (or an AR wearable computer). For example, cell phone service, or another wireless communication service, may be provided to an AR device. Location security services (e.g., permission control services or encryption/compression services) may also be utilized by an AR system.
Persons skilled in the art will also appreciate that the present invention is not limited to only the embodiments described. Instead, the present invention more generally involves providing location-based games and AR systems. Persons skilled in the art will also appreciate that the apparatus of the present invention may be implemented in other ways than those described herein. For example, the AR capabilities may be utilized for AR advertising, in which advertisement signs are provided outside of certain locations (e.g., outside a location where a GPS signal denotes you are near a GAP clothing store). Such an advertisement may be positioned, for example, based on a landscape detector, perspective determining devices, or particular locations. All such modifications are within the scope of the present invention, which is limited only by the claims that follow.
Claims
1. An augmented reality game system comprising:
a head-mounted display that overlays virtual indicia onto a physical playfield;
memory comprising video game logic that provides a video game;
a wearable processor that utilizes said video game logic to provide video game indicia to said head-mounted display based on said video game logic, wherein said processor is coupled to said memory and said head-mounted display;
a detector that determines landscape characteristics of said physical playfield, wherein said video game logic utilizes said landscape characteristics in providing said video game; and
a locating device that determines the physical location of said locating device on the physical playfield, wherein said video game logic utilizes the physical location of said locating device in providing said video game.
2. The augmented reality game system of claim 1, wherein a virtual object is overlayed by said head-mounted display on said physical playfield and said virtual object is displayed at different transparencies depending on the perceived distance of said virtual object from said head-mounted display.
3. The augmented reality game system of claim 1, wherein a virtual object is overlayed by said head-mounted display on said physical playfield so long as the perceived distance of said virtual object from said head-mounted display is not within a certain perceived distance from said head-mounted display.
4. The augmented reality game system of claim 1, wherein said landscape characteristics are utilized to control a computer-controlled video game character.
5. The augmented reality game system of claim 1, further comprising a wireless transmitter and receiver.
6. The augmented reality game system of claim 1, wherein a computer-controlled video game character is overlayed by said head-mounted display on said physical playfield and said computer-controlled video game character is limited to visibility within a certain area.
7. The augmented reality game system of claim 1, wherein a computer-controlled video game character is overlayed by said head-mounted display on said physical playfield, said computer-controlled video game character is limited to visibility within a certain area, and said computer-controlled video game character provides a reaction to said physical location when said physical location corresponds to entering said certain area.
8. A system comprising:
a first video game system comprising:
a head-mounted display that overlays virtual indicia onto a physical playfield;
a wearable processor that utilizes video game logic to provide video game indicia to said head-mounted display based on said video game logic, wherein said processor is coupled to said head-mounted display;
a detector that determines landscape characteristics of said physical playfield, wherein said video game logic utilizes said landscape characteristics in providing said video game;
a locating device that determines the physical location of said locating device on the physical playfield, wherein said video game logic utilizes the physical location of said locating device in providing said video game; and
a second video game system, wherein said first video game system communicates information with said second video game system.
9. The system of claim 8, wherein information regarding said physical location is provided to a remote database.
10. The system of claim 8, wherein information regarding said physical location is provided to said second video game system.
11. The system of claim 8, wherein information regarding said physical location is provided to said second video game system based on a permission associated with said first video game system for said second video game system.
12. The system of claim 8, wherein said locating device comprises:
a positioning system; and
an inertial movement system.
13. A system comprising:
a head-mounted display that overlays virtual indicia onto a physical playfield;
a processor that utilizes video game logic to provide video game indicia to said head-mounted display based on said video game logic, wherein said processor is coupled to said head-mounted display;
a detector that determines landscape characteristics of said physical playfield, wherein said video game logic utilizes said landscape characteristics in providing said video game;
a locating device that determines the physical location of said locating device on the physical playfield, wherein said video game logic utilizes the physical location of said locating device in providing said video game; and
a handheld controller for providing at least one control signal to said video game.
14. The system of claim 13, wherein a virtual object is overlayed by said head-mounted display on said physical playfield and said virtual object is displayed at different transparencies depending on the perceived distance of said virtual object from said head-mounted display.
15. The system of claim 13, wherein a virtual object is overlayed by said head-mounted display on said physical playfield so long as the perceived distance of said virtual object from said head-mounted display is not within a certain perceived distance from said head-mounted display.
16. The system of claim 13, wherein said landscape characteristics are utilized to control a computer-controlled video game character.
17. The system of claim 13, wherein said locating device comprises:
a positioning system; and
an inertial movement system.