U.S. Pat. No. 7,803,048

RADAR MANIPULATION IN A VIDEO GAME

Assignee: Microsoft Corp.

Priority Date: March 15, 2006

Patent Arcade analysis

U.S. Patent No. 7,803,048: Radar manipulation in a video game


Issued September 28, 2010, to Microsoft Corp.
Priority Date March 15, 2006

Summary:
U.S. Patent No. 7,803,048 (the ‘048 Patent) describes methods and systems for deceiving other players in an online multiplayer game. One such method lets a player manipulate the game’s radar to trick opponents: the player’s in-game character can fire a decoy bullet that creates noise at its point of impact rather than at the point of firing, triggering the game’s virtual acoustic radar at the wrong location. A player can also temporarily jam an opponent’s radar, appear as a friend on enemy radar, or cause all visible characters to appear as enemies to an opponent. The ‘048 Patent describes several such methods of deception using the in-game radar.

Abstract:
Methods and systems for deceiving other characters in a video game are disclosed. A video game may include a simulated environment in which player and computer controlled characters can monitor each other’s positions using radar, e.g., an acoustic radar that detects noise (such as the firing of various weapons) associated with other characters. A character may fire a decoy bullet, which creates noise at the location of impact rather than the location of firing. A character may temporarily jam another character’s radar so that the other character’s radar does not display character locations. A first character may mimic an enemy character so that the first character appears as a friend to enemy characters on each enemy characters’ radar. A special weapon may make all visible characters visually appear as enemies to a first character, thereby confusing the first character. Another special weapon may create a duplicate image of a character, thereby confusing others.

Illustrative Claim:
1. One or more computer readable storage device storing executable instructions for performing a video game method of representing characters on a radar image displayed on a video output device, said method comprising steps of:
determining a first simulated noise level associated with a first object in a simulated environment operating under control of the video game;
displaying on the radar image, said radar image corresponding to a first character, a first radar blip corresponding to the first object, said first radar blip having a first characteristic based on the first simulated noise level associated with the first object;
determining a second simulated noise level associated with the first object in the simulated environment operating under control of the video game, wherein said second simulated noise level is determined to be louder than said first simulated noise level; and
displaying on the radar image corresponding to the first character, a second radar blip corresponding to the first object, said second radar blip having a first characteristic based on the second simulated noise level associated with the first object,
wherein the first characteristic of the first radar blip comprises a first amount of time based on the first simulated noise level, wherein the first characteristic of the second radar blip comprises a second amount of time based on the second simulated noise level, said second amount of time being longer than said first amount of time, wherein displaying the first radar blip comprises displaying the first radar blip for the first amount of time, and wherein displaying the second radar blip comprises displaying the second radar blip for the second amount of time.
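The claimed behavior is straightforward to picture in code: a radar blip's display duration grows with the simulated noise level of the object it represents, so a louder second noise yields a longer-lived second blip. The Python sketch below is illustrative only; the names `RadarBlip` and `blip_duration` and the linear scaling are assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class RadarBlip:
    object_id: str
    noise_level: float  # simulated loudness, e.g. 0.0 (silent) to 1.0 (gunfire)
    duration: float     # seconds the blip remains on the radar image

def blip_duration(noise_level: float, base: float = 0.5, scale: float = 2.0) -> float:
    """Louder simulated noises produce blips displayed for a longer time."""
    return base + scale * noise_level

def make_blip(object_id: str, noise_level: float) -> RadarBlip:
    return RadarBlip(object_id, noise_level, blip_duration(noise_level))

# A quieter first noise vs. a louder second noise from the same object:
first = make_blip("pistol", 0.2)
second = make_blip("pistol", 0.9)
assert second.duration > first.duration  # second blip displayed longer
```

The key claim limitation is captured by the final assertion: the blip for the louder noise persists longer than the blip for the quieter one.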

Illustrative Figure


Description


DETAILED DESCRIPTION

In the following description of the various aspects, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration how various features described herein may be practiced. It is to be understood that other embodiments may be used and structural and functional modifications may be made.

FIG. 1 illustrates an example of a suitable gaming system environment 100 on which computer games, video games, and/or other electronic games (collectively referred to herein as computer games) may be played. The gaming system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the features described herein. Neither should the gaming system environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the illustrative operating gaming system environment 100.

Aspects described herein are operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable for use include, but are not limited to, personal computers; server computers; portable and hand-held devices such as personal digital assistants (PDAs), tablet PCs or laptop PCs; multiprocessor systems; microprocessor-based systems; set top boxes; programmable consumer electronics; network PCs; minicomputers; mainframe computers; electronic game consoles; distributed computing environments that include any of the above systems or devices; and the like. Another illustrative example of a suitable operating environment is shown in FIG. 2, and further described below.

Aspects herein may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The features described herein may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.

FIG. 1 shows an exemplary gaming system 100. Gaming system 100 may include a game console 102 and one or more handheld controllers, as represented by controllers 104(1) and 104(2). The game console 102 may be equipped with an internal or external hard disk drive and a portable media drive 106 that supports various forms of portable storage media as represented by optical storage disc 108. Examples of suitable portable storage media include DVD, CD-ROM, game discs, and so forth.

Game console 102 may have a number of slots 110 on its front face to support up to four controllers, although the number and arrangement of slots may be modified. A power button 112 and an eject button 114 are also positioned on the front face of the game console 102. The power button 112 switches power to the game console, and the eject button 114 alternately opens and closes a tray of the portable media drive 106 to allow insertion and extraction of the storage disc 108. In some aspects, game console 102 may be a dedicated computing device for home entertainment, and may be a closed, secure system that only executes authenticated and authorized applications. The game console 102 may be optimized for executing game programs (e.g., having increased processing support for gaming applications, such as physics co-processors, math co-processors, graphics co-processors, higher resolution video output, higher fidelity audio output, etc.), and may omit certain features commonly found on personal computing devices, such as an alphabetic keyboard, internal hardware expansion slots, a printer communication port, etc.

Game console 102 may connect to a television or other display (not shown) via A/V interfacing cables 120. A power cable 122 provides power to the game console. Game console 102 may further be configured with broadband network capabilities, as represented by the cable or modem connector 124, to facilitate access to a network, such as the Internet. Connector 124 may also be fitted with a wireless adapter to connect to one or more wireless networks.

Each controller 104 may be coupled to the game console 102 via a wired or wireless interface. In the illustrated implementation, the controllers are USB (Universal Serial Bus) compatible and are connected to the console 102 via USB cables 130. Each controller 104 may be equipped with any of a wide variety of user interaction mechanisms. As illustrated in FIG. 1, each controller 104 may be equipped with two thumbsticks 132(1) and 132(2), a D-pad 134, buttons 136 (e.g., ‘A’, ‘B’, ‘X’, ‘Y’), and two triggers 138. The thumbsticks 132 may be analog directional control units, and may include analog potentiometers to detect a degree of position in the X- and Y-coordinates. D-pad 134 may be a directional pad, with inputs for entering directional commands such as up, down, left, and right, or combinations of these directions (e.g., upper-left). D-pad 134 may also be analog, and may provide input as to the degree of pressure used to press in a particular direction. These mechanisms are merely representative, and other known gaming mechanisms may be substituted for or added to those shown in FIG. 1.

A memory unit (MU) 140 may be inserted into the controller 104 or game console 102 to provide additional and portable storage. Portable memory units enable users to store game parameters and user accounts, and port them for play on other consoles. In the described implementation, each controller is configured to accommodate two memory units 140, although more or fewer than two units may be employed in other implementations. A headset 142 may be connected to the controller 104 or game console 102 to provide audio communication capabilities. Headset 142 may include a microphone for audio input and one or more speakers for audio output.

Gaming system 100 is capable of playing, for example, games, music, and videos. With the different storage offerings, titles can be played from the hard disk drive or the portable medium 108 in drive 106, from an online source, or from a memory unit 140. For security, in some embodiments executable code can only be run from the portable medium 108. Examples of what gaming system 100 is capable of playing include game titles played from CD and DVD discs, from the hard disk drive, or from an online source; digital music played from a CD in the portable media drive 106, from a file on the hard disk drive (e.g., “WINDOWS™” Media Audio (WMA) format), or from online streaming sources; and digital audio/video played from a DVD disc in the portable media drive 106, from a file on the hard disk drive (e.g., Active Streaming Format), or from online streaming sources.

The gaming system 100 may be operated as a standalone system by simply connecting the system to a television or other display. In this standalone mode, the gaming system 100 allows one or more players to play games, watch movies, or listen to music. However, with the integration of broadband connectivity made available through a network interface, the gaming system 100 may further be operated as a participant in a larger network gaming community.

With reference to FIG. 2, an illustrative system for implementing the invention may include any computing device, such as computing device 200. In its most basic configuration, computing device 200 typically includes at least one processing unit 202 and memory 204. Depending on the exact configuration and type of computing device, memory 204 may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.), or some combination of the two. This most basic configuration is illustrated in FIG. 2 by dashed line 206. Additionally, device 200 may also have additional features/functionality. For example, device 200 may also include additional storage (removable and/or non-removable) including, but not limited to, magnetic or optical disks or tape. Such additional storage is illustrated in FIG. 2 by removable storage 208 and non-removable storage 210. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules, or other data. Memory 204, removable storage 208, and non-removable storage 210 are all examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by device 200. Any such computer storage media may be part of device 200.

Device 200 may also contain communications connection(s) 212 that allow the device to communicate with other devices, for example, for networked game play. Communications connection(s) 212 is an example of communication media. Communication media typically embodies computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media. The term computer readable media as used herein includes both storage media and communication media.

Device 200 may also have input device(s) 214 such as a keyboard, mouse, pen, voice input device, touch input device, joystick, game controller, etc. Output device(s) 216 such as a display, speakers, printer, etc. may also be included. All these devices are well known in the art and need not be discussed at length here.

FIG. 3 shows an exemplary network gaming environment 300 that interconnects multiple gaming systems 100(1), . . . , 100(g) via a network 302. Gaming systems 100(1), . . . , 100(g) may include, e.g., Xbox game consoles and Xbox 360 game consoles available from Microsoft Corporation, among other game systems. The network 302 represents any of a wide variety of data communications networks. It may include public portions (e.g., the Internet) as well as private portions (e.g., a residential Local Area Network (LAN)), as well as combinations of public and private portions. Network 302 may be implemented using any one or more of a wide variety of conventional communications media, including both wired and wireless media. Any of a wide variety of communications protocols can be used to communicate data via network 302, including both public and proprietary protocols. Examples of such protocols include TCP/IP, IPX/SPX, NetBEUI, etc.

In addition to gaming systems 100, one or more online services 304(1), . . . , 304(s) may be accessible via the network 302 to provide various services for the participants, such as hosting online games, serving downloadable music or video files, hosting gaming competitions, serving streaming audio/video files, and the like. The network gaming environment 300 may further involve a key distribution center 306 that plays a role in authenticating individual players and/or gaming systems 100 to one another as well as online services 304. The distribution center 306 distributes keys and service tickets to valid participants that may then be used to form games amongst multiple players or to purchase services from the online services 304.

The network gaming environment 300 introduces another memory source available to individual gaming systems 100: online storage. In addition to the portable storage medium 108, the removable storage 208, non-removable storage 210, and the memory unit(s) 140, the gaming system 100(1) or data processing system 200 can also access data files available at remote storage locations via the network 302, as exemplified by remote storage 308 at online service 304(s).

Illustrative Aspects

Features described herein may be used for a variety of video game types, including, for example, first-person shooter (FPS) games such as PERFECT DARK ZERO™ by Rare, Ltd. of the United Kingdom and Microsoft Corporation of Redmond, Wash. Aspects described herein may also be used with other genres of video games, are not limited to any one genre or implementation, and may be used with both single player and multiplayer games. Aspects described herein may be implemented in video game application software stored on a computer readable medium, e.g., storage 108, 208, 210, and/or 308, and executable by a data processing device. The video game software may be executed by a data processing device such as system 200 or game console 102. Game console 102 may include, e.g., an Xbox® brand game console by Microsoft Corporation, an Xbox® 360 brand game console by Microsoft Corporation, a Playstation® brand game console by Sony Corporation of Japan, or a Nintendo® brand game console by Nintendo Co., Ltd. of Japan, to name a few. Other game consoles, mobile computing devices, and/or computers may also or alternatively be used.

Various aspects of the disclosure provide new capabilities and features in video games, thereby enhancing game play and providing more options through which players, or gamers, can develop strategies with which to play a video game. According to various aspects described herein, a video game may provide a graphically simulated virtual environment, or virtual world, in which the game takes place, referred to herein as a simulated environment of the video game. The simulated environment may have similar features to actual geographic locations, or may include science fiction or fantasy based environments. The discussion below indicates various features, actions, and items that can be used and/or performed, as applicable, as a player controls a character within the simulated environment.

FIG. 4 illustrates a block diagram of a video game software application 401. Each block in FIG. 4 illustrates a logical software module or function that performs an action, provides a capability or feature, implements an object, or performs some other aspect of the video game. When the video game software 401 executes on a data processing system such as a PC or game console, the modules collectively operate to provide a video game experience to a player operating the PC or game console. The modules illustrated in FIG. 4 are illustrative only, and additional or different modules may be used.

Video game software 401 may include, e.g., a primary game manager module 403, which manages the overall operation of the video game and may be the initial module launched when the video game is executed. Video game software 401 may also include a network module 405, which manages network game sessions. A network game session may include, e.g., a cooperative campaign with another networked player, a multiplayer match, or other compartmentalized periods of game play involving players located at two network locations. A memory manager module 423 performs memory management during execution of the video game 401. Input module 407 may receive and interpret user input via a game controller, keyboard, mouse, and the like, and provide the interpreted commands to game manager 403, network manager 405, or another applicable module. UI module 409 may manage and control the user interface, including but not limited to a heads up display displayed on the video output device, interpreting input received via the input module 407, and/or providing tactile feedback via a force feedback controller, and the like. UI module 409 may operate in conjunction with graphics module 411, which renders graphical images of the simulated environment for output and display, and audio module 415, which generates audio signals for output to one or more speakers and/or headsets. Physics engine 413 provides a physics model under which the simulated environment operates, e.g., by defining gravity, inertia, collision reactions, and the like. Physics engine 413 may model “real world” physics, or may alter the physical reality simulated in the video game to provide a desired simulated environment.

Various modules may operate with one or more classes of objects defined and used by video game 401. Such classes and objects may be defined according to an object module 425, and may include portions of executable software code and/or one or more data structures, depending on the object. A first class of objects may define characters in the video game, e.g., data and functions corresponding to each character. Character data may include attributes of the character, e.g., health, strength, speed, etc. Character functions may include actions such as roll, crouch, fire weapon, punch, etc. A second class of objects may define vehicles in the video game. A third class of objects may define equipment (e.g., weapons, simulated computers or PDAs, etc.) in the video game. Vehicle objects and equipment objects may be usable and/or capable of being possessed by character objects. Other classes of objects may be used as well.
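As a rough sketch, the three object classes described above might be organized as follows. Only the attributes (health, strength, speed) and actions (roll, crouch, fire weapon, punch, possess) come from the text; the class structure, method signatures, and return values are assumptions.

```python
class Character:
    """A character object: attributes plus actions, per the first class above."""
    def __init__(self, name, health=100, strength=10, speed=5):
        self.name = name
        self.health = health
        self.strength = strength
        self.speed = speed
        self.equipment = []  # equipment objects this character possesses

    def possess(self, item):
        self.equipment.append(item)

    def fire_weapon(self):
        # fire the first possessed weapon, if any
        return self.equipment[0].fire() if self.equipment else None

class Vehicle:
    """A vehicle object, usable by character objects."""
    def __init__(self, kind):
        self.kind = kind

class Equipment:
    """A weapon or other piece of equipment a character can possess."""
    def __init__(self, kind):
        self.kind = kind

    def fire(self):
        return f"{self.kind} fired"
```

A character that possesses a pistol equipment object could then call `fire_weapon()`, which in this sketch returns `"pistol fired"`.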

AI module 419 defines the artificial intelligence with which one or more computer-controlled character objects behave or operate within the video game. A radar module 421 may provide a radar system visible at least to player characters, through which players controlling a player character can identify locations of other characters in the simulated environment of the video game. Radar module 421 is described in further detail below.

Object module 425 may provide an array of weapons and/or other usable equipment in the video game. Each weapon or piece of equipment may be defined by an object and instantiated during the game. For example, the video game may include a variety of weapon objects for characters to use in the game. Each weapon may provide varying features, attributes, and functions, and provide advantages and disadvantages based on each feature and function. For example, attributes of a weapon may include the magazine, or clip, size of the weapon (e.g., how many bullets the weapon can fire before reloading), whether the weapon can be damaged, and, if so, the “health” of the weapon. Other attributes of the weapon may include damage that the weapon may cause, visual effect in use, effect on radar, etc. Alternatively, weapon objects may use or may include projectile objects, which can be fired by a weapon object. Damage may then be associated with a projectile object instead of or in addition to being associated with a weapon object.

Weapon objects may also define functions, e.g., defining alternate modes of operation, including primary, secondary, and/or tertiary functions. The objects may include, for example, different portions of executable code triggered by predefined user inputs, such as the user pressing a trigger key or button while in a predetermined game state. Various pistols may fire bullets in a primary mode, and have alternate secondary and/or tertiary modes entered by different inputs and/or in different game states (e.g., toggling a weapon from one firing mode to another). According to an illustrative embodiment, a video game may provide a first pistol object (Pistol 1) that fires bullets, has a secondary function whereby the pistol fires in a silenced mode, and has a tertiary mode operating a flashlight. A second pistol object (Pistol 2) may have a secondary function whereby the pistol expels a clip, which then fires each round in sequence from wherever the clip lands (similar to a firecracker). Video game 401 may also provide additional and/or alternative weapons. A third pistol object (Pistol 3) may provide a secondary function of a decoy bullet, further described below. A fourth pistol object (Pistol 4) may provide a secondary function of a ricochet bullet, whereby a bullet bounces off walls and objects, as defined by the physics engine 413. A fifth pistol object (Pistol 5) may fire a tranquilizer bullet in a primary mode, and a psychosis bullet in a secondary mode. The psychosis bullet, operating to hinder a player’s ability to distinguish friends from foes, is described in more detail below. Each pistol may have a magazine of bullets, and magazine sizes may differ between pistols.
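One way to model the primary/secondary/tertiary modes described above is a weapon base class with one method per mode. The Pistol 1 behaviors below (bullets, silenced fire, flashlight) follow the text; the class structure and the return strings, which stand in for actual game effects, are assumptions.

```python
class Weapon:
    """Base weapon object: up to three modes of operation."""
    def primary(self):
        raise NotImplementedError
    def secondary(self):
        raise NotImplementedError
    def tertiary(self):
        raise NotImplementedError

class Pistol1(Weapon):
    def primary(self):
        return "fires bullet"            # normal, noisy shot
    def secondary(self):
        return "fires silenced bullet"   # reduced simulated noise level
    def tertiary(self):
        return "flashlight on"
```

Other weapon objects from the text (decoy bullets, ricochet bullets, psychosis bullets) would subclass `Weapon` the same way, overriding only the modes they support.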

Video game 401 and object module 425 may further provide heavy assault weapon objects. A first heavy assault weapon object (Heavy 1) may provide a large magazine of bullets in a primary mode, and in a secondary mode may fire mini mines that slow down personnel and/or stop vehicles. A second heavy assault weapon object (Heavy 2) may fire a plasma projectile in a primary mode, and provide a cloaking device in a secondary mode. A third heavy assault weapon object (Heavy 3) may fire rockets in a primary mode, and fire guided missiles in a secondary mode.

Video game 401 and object module 425 may further provide machine gun weapon objects. A first machine gun object (Machine Gun 1) may provide a magazine of bullets in a primary mode, fire with a silencer in a secondary mode, and provide a flashlight in a tertiary mode. A second machine gun object (Machine Gun 2) may provide a magazine of bullets in a primary mode, and may deploy a booby trap in a secondary mode (the booby trap may explode when another character nears the booby trap in the simulated environment). A third machine gun object (Machine Gun 3) may provide a magazine of bullets in a primary mode, and create a duplicate character image, or hologram, in a secondary mode. The duplicate image function is described in more detail below. A fourth machine gun object (Machine Gun 4) may provide a magazine of bullets in a primary mode, provide a threat detector in a secondary mode, and reprogram enemy devices in a tertiary mode so that the enemy devices are usable by the character object having possession of a Machine Gun 4 object. For example, an instance of a vehicle object may have a data value identifying a team to which the vehicle object corresponds in a multi-team match. Only character objects on the appropriate team can activate that vehicle object. However, a character object on a different team and having possession of a Machine Gun 4 object may activate the Machine Gun 4 object in the tertiary mode to “hotwire” the vehicle object. Once hotwired, the vehicle may be usable by any team, or may be usable only by members of the team corresponding to the character object that hotwired the vehicle in the first place.
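The team-gated vehicle activation and the Machine Gun 4 "hotwire" effect described above amount to a flag check on the vehicle object. In this sketch the field and method names are hypothetical, and only the variant where a hotwired vehicle becomes usable by any team is shown.

```python
class TeamVehicle:
    """A vehicle object carrying a team identifier, per the example above."""
    def __init__(self, team):
        self.team = team
        self.hotwired = False

    def can_activate(self, character_team):
        # only matching teams may activate, unless the vehicle was hotwired
        return self.hotwired or character_team == self.team

    def hotwire(self):
        # Machine Gun 4 tertiary-mode effect: reprogram the enemy device
        self.hotwired = True

jeep = TeamVehicle(team="red")
assert not jeep.can_activate("blue")  # wrong team, not yet hotwired
jeep.hotwire()
assert jeep.can_activate("blue")      # now usable by any team
```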

Video game 401 and object module 425 may further provide rifle weapon objects. A first rifle object (Rifle 1) may provide a magazine of bullets in a primary mode, and fire a bayonet in a secondary mode. A second rifle object (Rifle 2) may provide a magazine of bullets in a primary mode, and fire grenades in a secondary mode. A third rifle object (Rifle 3) may provide a magazine of bullets and/or grenades in a primary mode, fire bounce-able grenades in a secondary mode, and provide night vision to a possessing character object in a tertiary mode. A fourth rifle object (Rifle 4) may provide a magazine of bullets in a primary mode, and turn into a stationary sentry gun in a secondary mode, shooting anyone within proximity to the sentry gun in the simulated environment.

Video game 401 and object module 425 may further provide sniper weapon objects. A first sniper object (Sniper Rifle 1) may provide a magazine of bullets in a primary mode, and fire a radar jamming projectile or device in a secondary mode. Radar jamming features of the Sniper Rifle 1 object are described in more detail below. A second sniper object (Sniper Rifle 2) may provide a magazine of simulated particle bullets in a primary mode, and provide x-ray vision to a possessing character object in a secondary mode.

Video game 401 and object module 425 may further provide other weapon and equipment objects. For example, a shotgun object (Shotgun 1) may provide a magazine of shotgun shells in a primary mode, a radar sweep function in a secondary mode, and a radar mimic function in a tertiary mode. The radar mimic function is described in more detail below. A sword object may cause damage by a possessing character object when a primary slash function is performed in proximity to another character, and may provide a deflection function in a secondary mode.

While a player's character is using a specific weapon object, the player can provide predetermined inputs using any of the aforementioned input devices (e.g., game controller, joystick, keyboard, mouse, etc.) to fire the weapon in a default mode (to perform the primary function), secondary mode (to perform the secondary function), and tertiary mode (to perform the tertiary function—if applicable). For example, a player may press and release a primary input button (e.g., the right trigger of a controller104) to fire the weapon in the default or primary mode of operation, press and release a secondary button (e.g., the right bumper button) to fire the weapon in or use the secondary mode, and press and hold the secondary button to fire the weapon in or use the tertiary mode operation.

Video game software 401 may include other software modules 427, as needed. FIG. 4 illustrates but one possible software architecture that may be used. Each module depicted in FIG. 4 may communicate directly or indirectly with each other module, e.g., by passing objects, data, parameters, input, output, etc.

FIG. 5A illustrates a block diagram of an instance 501 of an object. Object instance 501 has an object type 503 (player character), and an object class 505 (character). Instance 501 may inherit one or more attributes 507 from object type 503 and/or object class 505. Attributes 507, when examined, define a state of the instance. In this example, because instance 501 is a Player Character object, instance 501 has an associated player attribute 511 (Aviator). The associated player may correspond to an input device connected to the game playing device on which the video game is being played, a user name on a network gaming service such as XBOX LIVE® (the game playing device may separately associate the user name with a specific controller or input device connected to the game playing device), or any other identifier or feature that uniquely identifies a human video game player providing input to video game 401. Instance 501 may also have various other attributes, for example, name 513, health 515, radar mode 517, psychosis flag 519, hologram flag 521, and team 523.

Instance 501 may inherit methods 509 based on the object type 503 and/or class 505. Each method can be performed by instance 501, and the methods collectively define the capabilities or actions that instance 501 can perform. In this example, because instance 501 is a character, instance 501 may inherit methods such as roll 525, crouch 527, punch 529, primary action 531, secondary action 533, tertiary action 535, use equipment 537, and possess 539. Each method may perform a corresponding action, e.g., as suggested by the method names.

FIG. 5B illustrates a block diagram of an instance 551 of an object. Object instance 551 has an object type 553 (Machine Gun 3), and an object class 555 (weapon). Instance 551 may inherit one or more attributes 557 from object type 553 and/or object class 555. Attributes 557, when examined, define a state of the instance. In this example, because instance 551 is a Machine Gun 3 object, instance 551 has an associated primary ammo remaining attribute 561 (16 rounds remaining), secondary ammo remaining attribute 563 (15 seconds), and hologram state 564 (off). Instance 551 may also have other attributes, such as primary sound level 565 (indicating a noise level associated with performing a primary trigger pull) and secondary sound level 566 (indicating a noise level associated with performing a secondary trigger pull).

Instance 551 may inherit methods 559 based on the object type 553 and/or class 555. Each method can be performed by instance 551, and the methods collectively define the capabilities or actions that instance 551 can perform. In this example, because instance 551 is a Machine Gun 3 object, instance 551 may inherit methods such as primary trigger pull 567 (performed when a character possessing instance 551 performs a primary action), secondary trigger pull 569 (performed when a character possessing instance 551 performs a secondary action), fire bullet 571 (results from a primary trigger pull 567), and toggle hologram 573 (results from a secondary trigger pull 569). Each method may perform a corresponding action, e.g., as suggested by the method names.

Thus, for example, player character instance 501 may perform the Possess method 539 on a weapon object, such as instance 551 of Machine Gun 3, to equip that weapon. While "possessing" Machine Gun 3, instance 501 may perform secondary action method 533 (e.g., based on appropriate user input), thereby causing instance 551 to perform secondary trigger pull 569. Instance 551 then performs the Toggle Hologram method, causing the Hologram attribute 564 to toggle between on and off. This is but one example, and other classes, objects, instances, attributes, methods, and/or other data may be used to implement the video game features described below.
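The possess/trigger chain just described can be sketched in a few lines. The class and method names loosely follow FIG. 5A and FIG. 5B; the concrete implementation details are illustrative assumptions, not the patent's actual code.

```python
# Illustrative sketch of the Possess -> secondary action -> secondary trigger
# pull -> Toggle Hologram chain. Names mirror FIG. 5A/5B reference numerals
# in the comments; the Python structure itself is an assumption.
class MachineGun3:
    def __init__(self):
        self.hologram = False  # Hologram attribute 564 (off)

    def secondary_trigger_pull(self):
        # Secondary trigger pull 569 results in the Toggle Hologram method 573.
        self.toggle_hologram()

    def toggle_hologram(self):
        self.hologram = not self.hologram

class PlayerCharacter:
    def __init__(self, player):
        self.player = player  # player attribute 511, e.g. "Aviator"
        self.weapon = None

    def possess(self, weapon):
        # Possess method 539: equip a weapon object.
        self.weapon = weapon

    def secondary_action(self):
        # Secondary action 533 causes the possessed weapon's
        # secondary trigger pull 569.
        if self.weapon is not None:
            self.weapon.secondary_trigger_pull()
```

Calling `secondary_action()` on a character possessing a Machine Gun 3 instance toggles that weapon's hologram state between on and off, as in the example above.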

For ease of understanding, aspects may be described below in a manner such as “when a first player character aims and fires a weapon at a second character.” This descriptive methodology may refer to a player character object performing a primary, secondary, and/or tertiary action method, causing the weapon object to perform a corresponding primary, secondary, and/or tertiary trigger pull, resulting in the weapon object performing another method corresponding to the appropriate trigger pull method. Similar and/or alternative interpretations are readily understood based on the various descriptions below.

According to an illustrative aspect, with initial reference to FIG. 6, an FPS video game 401 may provide a radar 603 with which a player can determine the location of nearby enemies. FIG. 6 illustrates a screenshot 601 of the FPS video game PERFECT DARK ZERO® by Rare, Ltd. and Microsoft Corporation. Radar 603 provides locations of other characters in the video game 401 relative to player character 609. Player character 609 represents the character (e.g., instance 501) controlled by the player or user of the video game, whereas other characters in video game 401 may include friendly, neutral, and/or enemy player characters and/or computer-controlled characters. In FIG. 6, radar 603 illustrates radar blips 605a, 605b, and 605c representing locations corresponding to enemy characters 607a, 607b, and 607c, respectively. Radar 603 also illustrates radar blips 605d and 605e corresponding to enemy characters not visible in the screenshot of FIG. 6, and thereby not directly visible to player character 609. As used herein, a radar ‘blip’ refers to any visual indication or symbol on a radar screen that indicates a position of a second character relative to the position of a first character, where the first character is generally (but not necessarily) the user or owner of the radar.

Radar 603 may be oriented such that the position 604 of the player character 609 is always in the middle of the radar 603, and the remainder of the radar is based on a current orientation of the player character 609. Radar 603 may provide a top-down view of the level, where portions of the radar above the center represent areas of the simulated virtual environment in front of the player character 609, and portions of the radar below the center represent areas of the simulated virtual environment behind the player character 609. Other radar implementations may be used instead. For example, the radar may be statically oriented, where the player character's position remains in the middle, but the radar is fixed such that the top represents North, the bottom represents South, the right represents East, and the left represents West. Still alternatively, the radar may remain fixed as to a geographic area, and the player character's position is not fixed to the center of the radar. Instead, as player character 609 moves around the simulated virtual environment, the blip 604 corresponding to player character 609 moves correspondingly around radar 603. Other characters in the game are illustrated in radar 603 relative to the current position of player character 609.
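The rotating-radar orientation described above amounts to a coordinate transform: another character's world-space offset from the player is rotated by the player's heading so that "up" on the radar is always the direction the player faces. The following is a minimal sketch under assumed conventions (x east, y north, heading measured in radians clockwise from north), not the patent's method.

```python
import math

# Hypothetical sketch: map another character's world position to radar
# coordinates centered on the player, with +y on the radar meaning
# "in front of the player". Heading is radians clockwise from north.
def radar_blip(player_pos, heading, other_pos):
    """Return (x, y) radar coordinates; +y is ahead, +x is to the right."""
    dx = other_pos[0] - player_pos[0]
    dy = other_pos[1] - player_pos[1]
    # Project the world offset onto the player's right and forward vectors.
    x = dx * math.cos(heading) - dy * math.sin(heading)  # right of player
    y = dx * math.sin(heading) + dy * math.cos(heading)  # ahead of player
    return (x, y)
```

With heading 0 (facing north) the radar matches world coordinates; with heading π/2 (facing east) a character due east of the player appears straight ahead on the radar. A statically oriented radar would simply skip the rotation.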

Video game 401 may use various criteria to determine whether to include or exclude other characters, in proximity to player character 609 in the simulated environment of the video game, from illumination on radar 603. For example, video game 401 may illuminate on radar 603, without distinguishing friend from foe, any character within the range of the radar 603. Alternatively, radar 603 may illuminate friendly characters in a first visual appearance or color, e.g., green, and may illuminate enemy characters in a second visual appearance or color, e.g., red. Video game 401 may also illuminate characters on radar 603 based on movement of each character. For example, when a character moves, that character appears on radar; if a character remains still, the character is not displayed on the radar.

Acoustic Radar

According to one illustrative aspect, video game 401 may illuminate characters on radar 603 based on a simulated acoustic signature or simulated noise level associated with each character in the simulated environment of video game 401. In general terms, when a character makes noise within the simulated environment of video game 401 (e.g., when a weapon object defines an audible range, as a noise level or distance, for a function being used), that character appears on the radar 603 of other players that are within the audible range. For those characters not within the range of radar 603, video game 401 may optionally display an indication on the periphery of radar 603 indicating a general direction of the noise-making character without identifying an exact location. Such a radar may be referred to as an acoustic radar.

Video game 401 may determine when and how long to include a character on radar 603 based on various factors. The appearance of a character on radar may be based on an approximation of how loud the character is, or how loud a weapon the character is using. The louder the simulated noise level, the more likely the character is to appear on radar. Also, louder simulated noise levels may result in a character appearing on radar for longer periods of time than lower simulated noise levels. For example, a character might not appear on acoustic radar 603 until the character produces a sound above a minimum sound threshold. The game 401 may include an identification of a maximum audible distance for each weapon, where firing the weapon will appear on another character's radar only if that character is within that maximum audible distance from the fired weapon. Audible distance may be implemented in a variety of ways, such as by using an attribute of weapon objects, by use of a lookup table, or by some other defined function.

The length of time that a character remains on acoustic radar 603 may depend on the noise level or loudness of the sound produced by that character. Sounds may be produced, e.g., by walking noisily, shooting various weapons, and talking, among other things. Thus, if a first character walks through simulated glass, the noise might cause the character to appear on a second character's radar. However, the video game may provide alternative modes in which characters can reduce the noise they produce, thereby precluding their inclusion on another character's radar. For example, if a first character is in a ‘crouch’ mode or otherwise walking slowly, and thus is walking stealthily, the first character might not appear on the second character's radar, even when walking through glass or some other noisy material or environment. The crouch mode may be identified by a Crouch toggle attribute (not shown) associated with character objects.

Similarly, if a first character speaks in the simulated environment of video game 401, the first character might appear on a second character's radar. However, characters may be able to reduce the noise they produce, thereby precluding their inclusion on another character's radar. For example, if the first character is in a hand signal or whisper mode, or otherwise talks softly, the first character might not appear on the second character's radar, even when communicating with other characters. The whisper mode may be identified by a Whisper toggle attribute (not shown) associated with character objects.

Appearance on an acoustic radar might also depend on actual noise levels of communicating players. For example, in a multiplayer match, teammates may verbally communicate with one another using headsets 142. Video game 401 (e.g., input module 407 and/or audio module 415) may monitor decibel levels or other audio/sound levels of such communications and, when such conversations are above a minimum threshold level, the character corresponding to the speaking player may appear on other proximately located characters' acoustic radar. The software modules may include a data structure identifying one or more threshold levels, with a predetermined audible range associated with each level, such that different sound levels may be detected at different ranges.

If a first character fires a weapon in the simulated environment of video game 401, the first character might appear on a second character's radar. However, characters may be able to reduce the noise they produce, thereby precluding their inclusion on another character's radar. For example, if the first character fires the weapon in a silenced mode or otherwise muffles the sound of the weapon, the first character might not appear on the second character's radar, or might appear for a lesser amount of time than the character otherwise would as a result of firing that weapon. According to one illustrative aspect, because different weapons produce different amounts of noise, the length of time for which a character appears on radar might be proportional to or based on the amount of noise the weapon produces. That is, if a character fires a silenced weapon, the character might not appear on radar at all. If the character fires a pistol, e.g., a Pistol 1 object, the character might appear on radar for a relatively short amount of time, e.g., five seconds. If the character fires a rifle, e.g., a Rifle 1 object, the character might appear on radar for a longer amount of time, e.g., eight seconds. If the character fires a heavier weapon, e.g., a Pistol 3 object, the character might appear on radar for yet a longer amount of time, e.g., ten seconds. Table 1, below, illustrates radar signatures, or simulated noise levels, for the various weapons described above, according to an illustrative embodiment. The values in Table 1 may represent Sound attribute values of weapon objects.

TABLE 1

Name             Time on Radar     Time on Radar       Time on Radar
                 (Primary Sound)   (Secondary Sound)   (Tertiary Sound)
Pistol 1         5                 0                   n/a
Pistol 2         5                 5                   n/a
Pistol 3         5                 5                   n/a
Pistol 4         10                10                  n/a
Pistol 5         0                 0                   n/a
Heavy 1          5                 n/a                 n/a
Heavy 2          5                 n/a                 n/a
Heavy 3          5                 5                   n/a
Machine Gun 1    5                 0                   n/a
Machine Gun 2    8                 8                   n/a
Machine Gun 3    5                 n/a                 n/a
Machine Gun 4    5                 n/a                 n/a
Rifle 1          8                 8                   n/a
Rifle 2          8                 0                   0
Rifle 3          5                 5                   n/a
Rifle 4          7                 7                   n/a
Blade 1          0                 0                   n/a
Shotgun          5                 n/a                 n/a
Sniper Rifle 1   5                 5                   n/a
Sniper Rifle 2   5                 5                   n/a

In addition to the above weapons, handheld or thrown weapons may also produce a radar signature, and may have weapon object data and attributes as described above for other weapons. For example, a grenade may produce a 5 second radar signature at the location of detonation of the grenade, or at the location from which the grenade was thrown. A flashbang grenade may produce a similar radar signature, e.g., 5 seconds. A boomerang weapon may also produce a radar signature, e.g., 5 seconds. The times listed above and in Table 1 are illustrative only, and other times may be used instead.
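A table like Table 1 can be implemented as a simple lookup from weapon name and firing mode to a time-on-radar value. The sketch below mirrors a few Table 1 rows; the function name and the dictionary encoding are illustrative assumptions.

```python
# Hedged sketch of a Table 1 lookup: each weapon maps to (primary,
# secondary, tertiary) time-on-radar values in seconds, with None for n/a.
# A value of 0 (e.g., a silenced weapon) means no radar appearance.
TIME_ON_RADAR = {
    "Pistol 1": (5, 0, None),
    "Rifle 1": (8, 8, None),
    "Machine Gun 2": (8, 8, None),
    "Grenade": (5, None, None),  # signature at the detonation location
}

def blip_duration(weapon, mode="primary"):
    """Seconds the firing character appears on radar; 0 means not at all."""
    idx = {"primary": 0, "secondary": 1, "tertiary": 2}[mode]
    value = TIME_ON_RADAR.get(weapon, (0, None, None))[idx]
    return value or 0  # treat None (n/a) and unknown weapons as silent
```

As the text notes, these durations are illustrative Sound attribute values; a lookup table, a per-object attribute, or a noise-level function would all serve.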

Each time an instance of an object performs a method that has an associated sound level, simulated noise level, acoustic signature, or other noise producing event and/or attribute associated with it, the object may register the event with radar module 421. Radar module 421, in turn, may manage a database of reported noises with corresponding radar signatures and noise locations. The radar signature may include an indication of the player or team creating the noise. Each entry or event in the radar module database may expire after a predetermined amount of time based on the simulated noise level or Sound attribute. Radar objects associated with characters may subsequently query or poll radar module 421 for noise events. The query may or may not include a current location of a character corresponding to the radar object. Radar module 421 returns query results to the radar object. Where a location is included in the query, the query results may include only noise events within a predetermined distance of the location of the character. Alternatively, radar module 421 may independently identify the character, and provide only relevant noise events. Radar module 421 may alternatively provide all noise events. The radar object then causes UI module 409 or graphics module 411 to display appropriate visual displays on radar 603.
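The register/expire/query cycle described above can be sketched as a small event store. This is an illustrative assumption about one way to structure such a module, not the patent's implementation; the class and method names are hypothetical.

```python
import math

# Hypothetical sketch of a radar module that registers noise events with an
# expiry time (based on the simulated noise level) and answers range-limited
# queries from character radar objects.
class RadarModule:
    def __init__(self):
        self.events = []  # list of (location, team, expires_at)

    def register_noise(self, location, team, duration, now):
        # Each entry expires after a time based on the Sound attribute.
        self.events.append((location, team, now + duration))

    def query(self, location, radar_range, now):
        # Drop expired entries, then return events within radar range
        # of the querying character's location.
        self.events = [e for e in self.events if e[2] > now]
        return [e for e in self.events
                if math.dist(location, e[0]) <= radar_range]
```

A radar object would poll `query()` each frame with its character's location and hand the results to the UI or graphics module for display.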

FIG. 7 illustrates a flowchart for an illustrative method of displaying characters on an acoustic radar. In step 701 a first character performs some action that makes some noise in the simulated environment of video game 401. The action may include speaking or other vocal noise, firing a weapon, and/or otherwise interacting with the simulated environment (e.g., breaking glass, etc.). In step 703 video game 401 determines a simulated noise level associated with the action. For example, video game 401 may use a lookup table or noise level function to determine a simulated noise level associated with the action. Video game 401 may alternatively refer to a corresponding Sound attribute of the character, weapon, equipment, or other object causing the sound to occur. That is, sound levels may be hard-coded. Video game 401 may alternatively refer to an acoustic or simulated noise level associated with a class or object type, if a noise level is not defined in an instance of an object. The simulated noise level may be generated in other manners as well. In step 705, based on the simulated noise level, video game 401 determines a radar signature associated with the noise level. The radar signature may include a size of a blip, or radar image, to display on a radar, a blip color, and/or a blip duration (i.e., the length of time the blip appears on radar). The radar signature may include additional or different information as well, such as a blinking pattern to indicate a specific action performed by the first character (e.g., using a specific weapon, blowing up a bomb, or providing a specific audible signal). The radar signature may be defined as attributes of an object instance, type, and/or class, or may be determined by a radar object based on various received inputs, including a value of a team attribute, an object type and/or class, etc.

In step 707 the video game 401 determines whether the first character is within the radar range of any other characters. Location may be determined by comparing a location attribute (not shown) of each instance of a character object. Video game 401 uses location information to determine whether the radar signature should appear on any other character's radar. In step 709 the video game 401 displays the radar signature on the radar of all characters whose radar range covers the first character. The radar signature is displayed on each radar in a location corresponding to the first character's location in the simulated environment of the video game relative to the character on whose radar the radar signature is displayed, and the radar signature may be displayed with the designated color, blink pattern, duration, etc. In step 711, the radar signature is displayed for the designated amount of time and with the designated appearance corresponding to the noise level and/or radar signature, and then in step 713 the radar signature is removed from the relevant characters' acoustic radar.

While the radar signature of the first character is being displayed on other characters' radars, the position of the first character may or may not be updated on radar. In a first variant, the position of the first character is not updated after being displayed on others' radar. In this first variant, the radar signature is associated with the location of the sound, and thus the radar blip may remain, e.g., for 5 seconds, in a constant location, regardless of whether the first character moves after creating the noise. In a second variant, the radar blip corresponding to the first character moves correspondingly to the first character's movement as long as the radar blip remains on other characters' radar. Thus, while the noise created by the first character causes the first character to appear on others' radar, the resultant radar blip tracks subsequent movement of the first character as long as the radar blip is being displayed.

Other variations and modifications may be made to the method illustrated in FIG. 7 without departing from the present disclosure. For example, some steps in FIG. 7 may be optional, steps may be combined or rearranged, and/or steps may be split into one or more separate steps. For example, step 703 and/or 705 may be optional where each action is directly associated with a radar signature or simulated noise level. As another example, step 707 may be performed at any time prior to step 709. Also, subsequent actions by the first character may supersede a previously determined radar signature. For example, if the first character fires a pistol, resulting in a five second radar signature, and one second later fires a rifle, resulting in an eight second radar signature, the first character may actually appear on other characters' radar for nine seconds.
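The superseding-signature arithmetic in the example above is just the latest expiry among overlapping signatures: a 5-second pistol signature starting at t=0 followed by an 8-second rifle signature starting at t=1 keeps the character visible until t=9, i.e. nine seconds in total. A one-line sketch (the function name is a hypothetical illustration):

```python
# Illustrative sketch: each noise event is (fire_time, duration); the blip
# persists until the latest expiry among overlapping signatures.
def visible_until(events):
    """Return the time at which the character's radar blip finally expires."""
    return max(t + d for t, d in events)

# Pistol at t=0 (5 s signature), rifle at t=1 (8 s signature):
# the character remains on radar until t = 9.
expiry = visible_until([(0, 5), (1, 8)])
```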

Features of acoustic radar described herein provide additional information and strategy options to video games. That is, when more powerful weapons have larger radar signatures, smaller and quieter weapons become more desirable to offensive players wanting to remain undetected or less detected. From a defensive perspective, features of the acoustic radar give players a capability to detect an enemy and to determine what type of weapon the enemy is using.

Decoy Bullet

As indicated above, a first person shooter game according to various aspects described herein provides a variety of weapons for players to use in each game. Each weapon may provide various features and functions, and provide advantages and disadvantages based on each feature and function. According to one aspect, a weapon (e.g., a Pistol 3 object) may provide a method (e.g., a primary, secondary or tertiary function) that fires a decoy bullet or some other projectile, energy beam, or the like. While a player's character is holding, or Possessing, the weapon, the player may provide input, e.g., via controller 104, to cause the Pistol 3 object to fire a decoy bullet instead of a regular bullet. A decoy bullet deceives other players/characters as to the firer's true position by manipulating the acoustic radar. When a player's character fires a decoy bullet, the bullet (or other projectile) is fired from the weapon as if the weapon had a silencer. Thus, the firer does not appear on other characters' radar based on the location from which the decoy bullet was fired. The projectile instead makes noise at the location of first impact in the simulated environment of the video game. The report (i.e., the explosive noise or sound resulting from firing a weapon) that would ordinarily originate from the location of firing instead originates from the location of the first obstacle in the path of the bullet or projectile fired from the weapon, thus causing a radar blip based on the location of impact. The character thus appears on acoustic radar as if he or she were at the location of impact of the projectile, i.e., at a false location on others' radar. If another character is hit by the decoy bullet, the decoy bullet may or may not cause damage, or may cause more or less damage than an ordinary bullet (i.e., a bullet fired in the default mode of the weapon).

According to an aspect, video game 401 may provide for a class of objects whose acoustic signature or simulated noise level is represented differently than for other classes. A decoy bullet object may be one such object. Such a class of objects may include a noise location attribute indicating that noise occurs at the location of impact of the decoy bullet object in the simulated environment of the video game 401. A “regular” bullet object might include a noise location attribute indicating that noise originates from the location of firing, e.g., as defined by a character object Possessing the weapon object from which the regular bullet object was fired.

FIG. 8, FIG. 9, and FIG. 10 illustrate the firing of a decoy bullet, the report originating from a location other than the firing character's position, and the resultant effect on another character's acoustic radar. FIG. 8 illustrates an overhead view of a portion of a simulated environment 801 in a video game 401. Two opposing player characters 803 and 805 are located at positions 813 and 815, respectively. The southerly face of obstacle 807 is located at position 817. FIG. 9A illustrates an initial radar image 901 of an acoustic radar corresponding to player 805 (facing north) before player 803 fires a decoy bullet. FIG. 9B illustrates a resultant radar image 903 of the acoustic radar corresponding to player 805 (facing north) after player 803 fires a decoy bullet along trajectory 809 (i.e., from the position of player 803 toward obstacle 807). FIG. 10 illustrates a process of firing a decoy bullet and resultant effects within an illustrative video game.

In step 1001, player 805 reviews her radar image 901, which presently does not indicate the presence of any nearby characters in the simulated environment of video game 401. In step 1003, player 803 fires a decoy projectile (e.g., a decoy bullet object resulting from a secondary trigger pull method of a Pistol 3 object) along trajectory 809 in the simulated environment of video game 401. The projectile (decoy bullet object) and the weapon (Pistol 3 object) from which it was fired do not register any noise at location 813 with any radar objects, based on the weapon firing a decoy projectile, and thus do not register on any other characters' radar from location 813.

In step 1005, the decoy bullet object (as may be calculated by the physics engine 413) impacts an obstacle, here the southern edge of obstacle 807 at location 817, at which location the bullet report occurs. That is, the decoy bullet object may register a corresponding sound level attribute at the location of impact with radar module 421. If a third player were standing near location 817, that third player may hear player 803 firing Pistol 3, but the sound may originate as if the pistol were fired from location 817 instead of location 813. For example, the object corresponding to the third player queries radar module 421 for any registered items, and radar module 421 reports the noise corresponding to the weapon object at a location corresponding to the location of impact. If desired, the game may also generate a sound of impact at location 817, e.g., generated by the decoy bullet object. In step 1007, player 803 illuminates on the acoustic radar of player 805 at the location 817 of impact of the decoy projectile, as illustrated in FIG. 9B and radar image 903. Player 805 thus believes that a weapon was fired from location 817. Location 813, illustrated in broken lines in FIG. 9B, is shown for comparison purposes only, and represents the actual location of player 803. Location 813 is not actually displayed on radar image 903 as visible to player 805, thereby illustrating that player 803 can deceive player 805 by appearing at a false location on the radar of player 805.
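The decoy-bullet behavior reduces to finding the projectile's first impact point and registering the report there rather than at the shooter's position. The sketch below is an illustrative stand-in for the physics engine's trajectory calculation (a simple ray march against axis-aligned boxes); all names and the 5-second duration are assumptions.

```python
# Hedged sketch of decoy-bullet noise registration: the report is created
# at the point of first impact, not at the firing location, so the shooter
# appears on acoustic radar at a false position.
def first_impact(origin, direction, obstacles, max_range=100, step=0.5):
    """March along the trajectory until an obstacle (x0, y0, x1, y1) is hit."""
    x, y = origin
    dx, dy = direction  # assumed unit vector
    for _ in range(int(max_range / step)):
        x, y = x + dx * step, y + dy * step
        for (x0, y0, x1, y1) in obstacles:
            if x0 <= x <= x1 and y0 <= y <= y1:
                return (x, y)
    return None  # no impact within range

def fire_decoy(origin, direction, obstacles, now, duration=5):
    """Return the noise event a decoy bullet creates, or None on a miss."""
    impact = first_impact(origin, direction, obstacles)
    if impact is None:
        return None
    # The report is registered at the impact location, not at `origin`.
    return {"location": impact, "expires_at": now + duration}
```

A regular bullet would instead register its noise event at `origin`, matching the noise location attribute distinction described above.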

Variations and alternatives are possible. For example, the use of a decoy projectile might be more “expensive” than firing a regular projectile. A character might expend an entire bullet magazine to fire a single decoy bullet, and thus must wait to reload after firing the decoy bullet, whereas a bullet magazine might otherwise fire multiple (e.g., 6 or 9) regular bullets. The cost of using this feature may be reflected in the weapon object for that weapon. The character must therefore make a strategic decision when to use decoy bullets versus when to fire regular bullets. In another variation, the decoy bullet might fire an entire magazine in a single shot, and the bullets in the magazine may fire randomly from the location of impact, thereby also falsely reporting the location of the firer, but for a longer period of time. In such a variation, the noise from 6 (or however many bullets are in a magazine) shots might cause the falsely reported location to persist longer on others' radar.

According to another alternative, a player character may throw a decoy grenade. Upon throwing a decoy grenade, the decoy grenade may cause the character to appear on relevant radar based on the location at which the grenade detonates, rather than the location from which the grenade was thrown. Another variation includes the planting of a decoy mine. A first player character plants a decoy mine. When the decoy mine is activated, e.g., based on a timed delay or based on another character activating, or “tripping,” the mine, the first player character appears on relevant radar based on the location of the decoy mine, rather than the current location of the first player character.

Radar Jamming

According to another aspect, a weapon object may provide a method, such as a default, secondary, or tertiary function, to jam an opponent's radar. That is, a player character's radar may have a default state being a Normal state (see, e.g., FIG. 5A), whereby the corresponding radar object identifies other characters' locations relative to the location of the player character. When a first player character aims and fires the weapon at a location within a predetermined distance from a second character (e.g., hitting the second character with a beam or projectile) in the simulated environment of the video game, the radar (acoustic or otherwise) of the second character may be permanently or temporarily placed into a jammed state, e.g., by changing the radar attribute to a Jammed state. While a character's radar attribute is Jammed, a radar object corresponding to that character displays no radar information regarding other characters, regardless of whether other characters would appear were the radar attribute in the Normal state. The duration of jamming may be any predetermined amount of time (e.g., 5 seconds, 10 seconds, permanent, etc.), and may depend on how close to the second character the weapon was aimed, on the distance of the first player character from the second character when the first player character fired the weapon, or on a host of other factors. Such factors may be included in lookup tables, databases, object attributes, determination functions, and the like.

According to various aspects, a character might only be able to jam any other character's radar once within a predetermined time period. For example, a weapon object might have an Inhibit_Jamming method, which is automatically called upon activation of a Jam_Radar method of the same weapon object. The Inhibit_Jamming method may toggle a Can_Jam attribute to NO for some predetermined amount of time, and then toggle the Can_Jam attribute back to YES after expiration of that predetermined amount of time. Similarly, when that weapon object calls the Jam_Radar method, the method checks the Can_Jam attribute, and only executes if the Can_Jam attribute is presently set to YES.

Alternatively, after a first character's radar has been jammed, that first character might be immune from having her radar jammed again for some specified period of time. In such an alternative, the character object or radar object corresponding to the first character might include the Can_Jam attribute. When a second character attempts to jam the radar object corresponding to the first character, the radar object polls the Can_Jam attribute and, if set to YES, calls the Jam_Radar method. If set to NO, the radar object does not call the Jam_Radar method, or if the polling is performed from the Jam_Radar method, exits without changing the radar mode attribute to Jammed.

According to another aspect, a first character whose radar is presently jammed might automatically appear on all other characters' radar, regardless of whether the first character would normally appear on the other characters' radar. For example, if the video game uses an acoustic radar, the first character whose radar is jammed may be “lit up” and appear on all other relevant characters' radar, regardless of whether the first character is firing a weapon or making any noise. In such an aspect, the first character's radar object may register a noise event with radar module 421, having a location corresponding to the first character, and having a noise level or duration corresponding to the length of time the radar is jammed.

FIG. 11 illustrates a method of jamming a player's radar. In step 1101 a second character's radar is in a Normal operation state. In step 1103 a first character aims and fires a weapon (or some other device) at the second character while the weapon is in a radar jamming mode (e.g., a primary, secondary, or tertiary action is performed to cause the weapon object to perform a Jam_Radar method). In step 1105 video game 401 determines whether the first player's shot is considered a hit on the second player, based on whatever criteria the video game 401 uses to determine whether one player hits another player. For example, the physics engine 413 may calculate a trajectory of the projectile, and determine whether the projectile strikes another player. If the shot is considered a hit, then in step 1107 the video game places the second player's radar into a jammed state for a predetermined amount of time, e.g., 5 seconds (the step may also include an optional step of checking the second player's Can_Jam attribute and proceeding only if the player is able to be jammed). For example, the character object of the second player may indicate radar mode=Jammed. While in the jammed state, the second character's radar does not identify locations of other characters, and the second character may optionally be lit up and appear on all enemy radar, as described above. Video game 401 may jam the second player's radar based on how close the first player's shot was to the second player, or based on how far away the first player was from the second player when the first player shot at the second player. In step 1109 the video game determines whether the predetermined amount of time for the radar jamming has expired. If so, the radar of the second character is placed back in the Normal state in step 1101.
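The jam-and-expire flow of FIG. 11 can be sketched with two small functions. The dictionary representation, tick-based timer, and 150-tick duration (5 seconds at an assumed 30 ticks/second) are illustrative; hit detection is taken as a boolean input rather than a physics-engine calculation.

```python
JAM_DURATION_TICKS = 150  # e.g., 5 seconds at 30 ticks/second (assumed)

def resolve_jamming_shot(hit, target_radar):
    """Steps 1105-1107, simplified: on a hit, place the target's
    radar in the Jammed state for a predetermined amount of time."""
    if hit:
        target_radar["mode"] = "Jammed"
        target_radar["jam_ticks_left"] = JAM_DURATION_TICKS
    return target_radar

def tick_radar(target_radar):
    """Step 1109: return the radar to the Normal state once the
    jamming timer has expired."""
    if target_radar["mode"] == "Jammed":
        target_radar["jam_ticks_left"] -= 1
        if target_radar["jam_ticks_left"] <= 0:
            target_radar["mode"] = "Normal"
    return target_radar
```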

According to other variations, radar jamming might result from something other than a character firing a weapon at another character. For example, a first character might plant a mine or device (e.g., an equipment object) that, when approached by a second character, jams the radar of the second character. That is, radar jamming may result from any predetermined action directed at a character within the simulated environment of the video game. Radar jamming for some period of time might also result from any weapon, bullet or projectile that hits a character, e.g., to simulate disorientation of that character as a result of being hit by a weapon.

Radar Mimic

As described above, a radar object may display a blip corresponding to friendly characters (e.g., characters having a same value for a team attribute 523 as the character to whom the radar object corresponds) with a first visual appearance (e.g., colored green), and may display a blip corresponding to enemy players (e.g., characters having a different value for team attribute 523 as the character to whom the radar object corresponds) with a second visual appearance (e.g., colored red). According to various aspects, a weapon or other object in video game 401 may provide a method (e.g., a default action, secondary action, or tertiary action method) that makes a first player character using the device or weapon appear on a second character's radar as a friendly player to the second character, regardless of whether the first character and second character are friends or enemies. When the first character activates the device or fires the weapon in this ‘radar mimic’ mode, and the first character appears on an enemy character's radar, the first character appears as a friend to the enemy character, e.g., by being displayed with the first visual appearance (e.g., as a green blip) instead of with the second visual appearance (e.g., as a red blip).

Radar mimic as described above may be performed, e.g., by altering radar mode attribute 517 to a predetermined value, e.g., Mimic. A radar object corresponding to the character object 501 may then display that character on all enemy players' radar as friendly. Alternatively, radar mimic may result from temporarily altering team attribute 523 to be other than the character object's initial team value. This change in appearance may be accomplished in a variety of ways. For example, the radar object code may include a data structure identifying an appearance value for each character object (e.g., identifying the team to which each character belongs).
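The blip-appearance decision implied above can be sketched as a single lookup: render by team attribute unless the target's radar mode is Mimic. The color names and mode strings are illustrative assumptions, not terms from the patent.

```python
def blip_appearance(viewer_team, target_team, target_radar_mode):
    """Decide a blip's color on the viewer's radar. When the target
    is in the Mimic radar mode it is shown as friendly regardless of
    its team attribute; otherwise friend/enemy follows the teams."""
    if target_radar_mode == "Mimic":
        return "green"  # always rendered as a friend while mimicking
    return "green" if viewer_team == target_team else "red"
```

The alternative in the text, temporarily rewriting the team attribute itself, would make the second branch alone produce the same result.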

According to various aspects, the duration of the radar mimic effect may last for some predetermined amount of time, e.g., 5 seconds, 10 seconds, etc. In addition, a character might only be able to activate the radar mimic mode once per predetermined period of time, based on attributes and methods inhibiting radar mimic similar to those described above with respect to radar jamming. Radar mimic may also be directed at a first enemy character, thereby making the enemy character appear as a foe of his or her own friends (i.e., other enemy characters). That is, the targeted player swaps teams on radar (e.g., by changing team attribute 523), and the first enemy character might appear as an enemy blip on the radar of his/her own teammates. In such a manner, the other enemy characters might shy away from the first enemy character during the video game for the duration of the radar mimic mode, or may accidentally shoot the first enemy character prior to visually confirming that the first enemy character is indeed a friend.

FIG. 12 illustrates a method for performing a radar mimic according to various aspects described herein. In the method illustrated in FIG. 12, a first player character and a second character are enemies in the video game, e.g., because they are on opposing teams in a multiplayer match. In step 1201 the first player activates a radar mimic device, e.g., by firing a designated weapon object using a primary action, secondary action, or tertiary action that causes radar mimic as described herein, or by performing a method on some other device or equipment object corresponding to or proximately located to the first player character in the simulated environment of the video game 401.

In step 1203, the video game 401 determines whether the first player is within the range of any other player's radar, such as an enemy character's radar. If so, in step 1205 the video game 401 displays a blip on the enemy second character's radar corresponding to the first player character, but displays the blip as if the first player character is a friend of the second character. This may result from altering a radar object state or changing a team attribute, as described above.

As indicated above, the first player character may falsely appear as a friend to the second character for some predetermined amount of time. If the video game is using an acoustic radar, and if the first player character performed some action or fired a weapon which causes the first player character to appear on the second character's radar for longer than the effect of the radar mimic device, the radar blip corresponding to the first player character on the second character's radar may alter appearance at the end of the predetermined amount of time. That is, when the radar mimic effect has expired, the video game may revert the radar blip corresponding to the first player character on the second character's radar to again appear as an enemy to the second character for the remainder of the time that the first player character is to appear on the second character's acoustic radar according to the heuristics described above.

According to another variation, a radar mimic method of an object might have an area of effect attribute, and any character within a radius defined by the area of effect from the object is affected by the radar mimic method. For example, a radar mimic method of an object might have an area of effect of ten feet in the simulated environment of the video game. If a first player character activates the relevant radar mimic device, and two teammates of the first player character are located within the area of effect (here, ten feet) of the first player character at the time of activation (or at any time while the radar mimic device is active), then not only does the first player character appear as a friend on the relevant radar of any enemy characters, but the two teammates appear as friends on the relevant radar of any enemy characters as well.
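The area-of-effect check described above can be sketched with a simple distance test. The 2-D coordinate representation and the dictionary of teammate positions are assumptions; the ten-foot radius follows the example in the text.

```python
import math

def characters_in_mimic_aoe(activator_pos, teammates, radius=10.0):
    """Return the names of teammates inside the mimic device's area
    of effect, measured from the activating character's position.
    Positions are (x, y) pairs in simulated-environment units."""
    ax, ay = activator_pos
    return [
        name for name, (x, y) in teammates.items()
        if math.hypot(x - ax, y - ay) <= radius
    ]
```

Everyone returned by this check, plus the activator, would then be displayed as friendly on enemy radar for the duration of the effect.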

Friend/Foe Ambiguity

According to another aspect, a weapon may provide a method (e.g., resulting from a default action, secondary action, or tertiary action) that temporarily causes a character not to be able to distinguish visually between friend and foe. That is, friends and enemies appear similarly, or ambiguously. When a first player character aims and fires the weapon at a location within a predetermined distance from a second character in the simulated environment of the video game, the video game may cause the second character to enter a state (e.g., referred to as a psychosis state or psychosis effect) whereby the second character cannot distinguish friendly and enemy characters.

According to one aspect, a weapon object fired in a predetermined mode, e.g., a Pistol 5 object fired using a secondary trigger pull method, performs a Psychosis method that toggles a psychosis attribute 519 of a character object at which the weapon object is aimed. As a result of the weapon object hitting the character object, the psychosis attribute may temporarily be changed to YES or TRUE. UI module 409 and/or graphics module 411, when rendering other characters for display on a video output device, may determine an appearance of each character based in part on the value of the psychosis attribute 519. When the psychosis attribute is NO or FALSE, characters may be rendered as normal. When the psychosis attribute is YES or TRUE, characters may be rendered as described below. Other approaches may be used as well to implement such features.
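The rendering decision sketched in this paragraph, where the UI module consults the viewer's psychosis attribute, might look like the following. This sketch implements only the "all enemies" variant described next; the color labels are assumptions.

```python
def render_team_color(psychosis_attr, viewer_team, other_team):
    """Choose a character's rendered color from the viewer's point of
    view. While the viewer's psychosis attribute 519 is TRUE, every
    visible character is rendered in enemy colors; otherwise the
    choice follows the team attributes as normal."""
    if psychosis_attr:
        return "enemy_color"
    return "friendly_color" if viewer_team == other_team else "enemy_color"
```

The "all friends" and "all neutral" variants described below would simply return a different constant in the first branch.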

For example, while in the psychosis state (e.g., psychosis attribute is YES or TRUE) the video game may cause all visible characters to appear as enemies of the second character, thereby causing a player controlling the second character to shoot at all visible characters. In such an example, the video game may render all visible characters as having colors, uniforms and/or some other visual appearance or cue (e.g., changing colors to an enemy colored uniform or appearance, scary features, etc.) which make them appear as enemies to the character under the psychosis effect. If the video game provides a radar that distinguishes friend from enemy, the video game may also cause any character within radar range to all appear as an enemy on the radar image. In such a scenario, the second character's team is put at a temporary disadvantage, because the second character may try to attack all visible characters, including his or her own teammates. That is, if the second character attacks another character, the second character may in fact be attacking one of her own teammates, but cannot tell the difference due to the psychosis effect. The appearance of a character may be defined in a character object data file, which may include predefined textures and visual characteristics for the character, and changing the visual appearance of a character in the psychosis state may involve causing a given character object to identify a different data file for its appearance.

As another example, the video game may cause all visible characters to appear as friends of the second character, thereby causing a player controlling the second character to stop firing altogether. In such an example, the video game may render all visible characters as having colors, uniforms and/or some other visual appearance or cue (e.g., changing colors to a friendly-colored uniform or appearance, appealing features, etc.) which make them appear as friends of the character under the psychosis effect. If the video game provides a radar that distinguishes friend from enemy, the video game may also cause the characters within radar range to all appear as friends on the radar image. In such a scenario, the second character's team is put at a temporary disadvantage, because the second character does not know who to attack. If the second character attacks another character, the second character may in fact attack one of her own teammates.

As another example, the video game may cause all visible characters to appear as neutral characters, thereby causing a player controlling the second character to be unsure whether to shoot visible characters or not. In such an example, the video game may render all visible characters as having colors, uniforms and/or some other visual appearance or cue which make them appear neutral to the character under the psychosis effect. If the video game provides a radar that distinguishes friend from enemy, the video game may also cause the characters within radar range to all appear as neutral characters on the radar image.

The duration of the psychosis effect may be any predetermined amount of time (e.g., 5 seconds, 10 seconds, etc.), and the amount of time may depend on how close the weapon was aimed to the second player (e.g., a direct hit, graze, etc.), may depend on the distance of the first character to the second character when the first character fired the weapon, or may depend on other factors. According to an illustrative aspect, if the second character under the psychosis effect kills a member of his or her own team while under the psychosis effect, the kill might be attributed to the first player character that shot or otherwise caused the second character to be under the psychosis effect. A character may or may not be immune from the psychosis effect for some predetermined period of time after the psychosis effect has ended, e.g., by inhibiting toggling of the psychosis attribute 519.

FIG. 13 illustrates a method for performing a psychosis effect according to one or more illustrative aspects. In step 1301 a first player character aims and fires a weapon in a primary, secondary, or tertiary mode that provides a psychosis effect upon hitting another character. In step 1303, video game 401 determines whether the first player character hit the second character. If so, in step 1305 the video game causes the second character to perceive all visible characters as enemies to the second character, e.g., by rendering all visible characters in uniforms or otherwise altering a visual appearance of friendly characters to make them appear as enemies to the second player character. Video game 401 may also cause the second character's radar to display his/her own teammates as enemies.

In step 1307, video game 401 determines whether the second character killed any of his or her own teammates or friends while under the psychosis effect. If so, the video game may attribute such kills to the first player character who shot the second character or otherwise caused the second character to be under the psychosis effect. For example, if a first character object performs a method that fires a weapon object at a second character object, resulting in the death of the second character object, and both the first and second character objects have the same value for their respective team attribute 523, and the psychosis attribute 519 of the first character object is YES or TRUE, the video game 401 attributes the kill to a third character object that caused the psychosis attribute 519 of the first character object to be true.
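The kill-attribution rule of step 1307 can be sketched as follows. Representing characters as dictionaries with name, team, and psychosis keys is an assumed simplification of the character objects described in the patent.

```python
def attribute_kill(shooter, victim, psychosis_causer=None):
    """Credit a kill. A friendly-fire kill made while the shooter is
    under the psychosis effect is attributed to the character that
    caused the effect; every other kill goes to the shooter."""
    friendly_fire = shooter["team"] == victim["team"]
    if friendly_fire and shooter["psychosis"] and psychosis_causer is not None:
        return psychosis_causer["name"]
    return shooter["name"]
```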

The change in visual appearance may be graphically depicted on an output device (e.g., display device 591) corresponding to the player whose character is under the psychosis effect. The change may also visually depict friends as enemies, e.g., by depicting friends as wearing enemy uniforms, taking on the physical appearance of an enemy, or any other visual appearance or cue.

Duplicate Character Image

According to another illustrative aspect, a weapon or device may provide a method (e.g., a primary action, secondary action, or tertiary action) which, when activated, initiates a duplicate image mode that causes a hologram to appear that duplicates the appearance and/or actions of a first player character to other characters in the video game. The hologram of the first player character may appear at a constant offset from the actual first player character, and may maintain a same orientation as the actual image of the first player character (e.g., if the actual player character rotates 90 degrees to the right, the hologram image also rotates 90 degrees to the right; if the actual player moves two steps forward, the hologram image also moves two steps forward). The hologram first player character thereby duplicates the appearance and actions of the actual first player character. Those characters and player characters that view the original and hologram of the first player character will have difficulty visually telling the difference between the two, and thereby do not know which one to shoot. If characters shoot the hologram, the player character might take no damage, thereby providing the first player character a strategic advantage in the video game. However, shooting the hologram of the first player may temporarily disrupt the hologram, e.g., by displaying static in the hologram, or by the hologram image shuddering briefly, thereby indicating to the shooter which image is real and which image is the hologram. In addition, a character might run slower (e.g., a speed attribute of a character object might be reduced) when a hologram is active, thereby balancing any advantage the player might gain by using a holographic duplicate image. According to an alternative aspect, the hologram may move independently of the actual image. 
That is, the player may provide first predetermined input via controller 104 to enter a first mode to control the actual player character, and second predetermined input to enter a second mode to control the hologram image of the player character.
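The constant-offset mirroring described above can be sketched as a pure function of the actual character's state: the hologram keeps a fixed offset and copies the character's heading, so identical moves and turns apply to both images. The 2-D coordinates and the particular offset value are assumptions.

```python
def hologram_position(actual_pos, actual_heading, offset=(3.0, 0.0)):
    """Mirror the actual character at a constant offset: the hologram
    tracks every move and keeps the same orientation, so rotating the
    character 90 degrees rotates the hologram 90 degrees as well."""
    x, y = actual_pos
    dx, dy = offset
    return (x + dx, y + dy), actual_heading
```

Under the alternative aspect, where the hologram moves independently, its position would instead be driven by its own input state rather than derived from the actual character.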

FIG. 14 illustrates a screenshot 1401 of video game 401 from a first person perspective of a second player character 1405 prior to a first player character 1403, identified as SChang1O38, activating a hologram mode. FIG. 15 illustrates a screenshot 1501 of video game 401 from the first person perspective of the second player 1405 subsequent to the first player character 1403 activating the hologram mode. In FIG. 15, character images 1507 and 1509 represent the actual player character and hologram player character, although second player character 1405 cannot distinguish between the two unless the second player character watched as the hologram appeared. The identification of player 1403, SChang1O38, is rendered between the two images 1507, 1509 so as not to inadvertently identify the actual image and the hologram image.

FIG. 16 illustrates a method for creating and controlling a hologram according to one or more illustrative aspects. In step 1601, the video game 401 displays only the actual image of the first player character, e.g., prior to the first player character initiating a hologram or duplicate image mode. The first player character may appear to other characters, e.g., as illustrated in FIG. 14. In step 1603, the video game receives input indicating that a predetermined action associated with creating a hologram image of the first player character has been performed. The predetermined action may include, e.g., the first player character activating a weapon or device that causes a hologram image to appear, or activating an alternate mode of a weapon or device that causes a hologram image to appear.

In one illustrative embodiment, the character object of the first player character may perform a secondary action while Possessing a Machine Gun 3 object. The Machine Gun 3 object performs a secondary trigger pull method, which causes the Machine Gun 3 object to perform the Toggle_Hologram method. The Toggle_Hologram method may check the status of secondary ammo attribute 563 to determine whether any ammunition remains. Here, because the hologram is not a weapon, the secondary ammo may instead refer to an amount of time accumulated or available for use in producing the hologram image. If the secondary ammo attribute 563 indicates that there is hologram time remaining, then the Toggle_Hologram method changes the hologram attribute 564 to ON. Toggle_Hologram method 573 may continue to execute, e.g., running a timer to decrement the secondary ammo attribute 563, based on the length of time the hologram attribute 564 is ON.
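The Toggle_Hologram behavior, with secondary ammo repurposed as a time budget, can be sketched as below. The class layout and tick granularity are assumptions; the secondary ammo and hologram attributes mirror attributes 563 and 564 from the text.

```python
class HologramDevice:
    """Sketch of Toggle_Hologram using secondary ammo as the amount
    of hologram time remaining, as described in the text."""

    def __init__(self, secondary_ammo=100):
        self.secondary_ammo = secondary_ammo  # remaining hologram time
        self.hologram_on = False              # the hologram attribute (ON/OFF)

    def toggle_hologram(self):
        # Turn the hologram ON only if time remains in the budget;
        # toggling while ON turns it back OFF.
        if self.hologram_on:
            self.hologram_on = False
        elif self.secondary_ammo > 0:
            self.hologram_on = True

    def tick(self):
        # Decrement the budget while the hologram attribute is ON,
        # shutting the hologram off once the time is exhausted.
        if self.hologram_on:
            self.secondary_ammo -= 1
            if self.secondary_ammo <= 0:
                self.hologram_on = False
```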

In step 1605, based on the predetermined action, video game 401 displays a hologram image of the first player character as well as the actual image of the first player character, for example, as illustrated in FIG. 15. In step 1607, as video game 401 receives input from the player controlling the first player character, video game 401 moves the hologram image based on the received input. As indicated above, the movement may be identical to or separate from movement of the actual image of the first player character.

In step 1609, video game 401 determines whether the hologram is disrupted, e.g., due to enemy weapon fire, depletion of energy sustaining the hologram, depletion of secondary ammo attribute 563, etc. If the hologram is disrupted, then in step 1611 video game 401 alters the displayed hologram image within the simulated environment of the video game, e.g., by displaying static in the hologram, a brief shudder of the hologram, removing the hologram altogether, etc. There are a number of ways this change in appearance can be implemented. For example, a hologram object routine may access the same visual appearance data for the duplicated character, but may alter color values prior to display to generate the different appearance, the shudder, static, etc.
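One way to "alter color values prior to display," as the paragraph suggests, is to wash the duplicated character's colors toward grey while the hologram is disrupted. The blend-toward-grey scheme and the halfway blend factor are purely illustrative assumptions about how a static-like cue might be produced.

```python
def disrupted_color(rgb, disrupted):
    """Return the display color for a hologram pixel or texture value.
    When disrupted, blend each channel halfway toward the grey average
    of the three channels, washing out the image as a visual cue."""
    if not disrupted:
        return rgb
    r, g, b = rgb
    grey = (r + g + b) // 3
    return tuple((c + grey) // 2 for c in (r, g, b))
```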

According to various illustrative aspects, the player character might not be able to inflict damage using the hologram image, but can use the hologram image to safely scout enemy territory without taking damage. That is, UI module 409 and/or graphics module 411 may render the video display to depict a view of the simulated environment as if viewed from the hologram's location and direction of view. However, while controlling the hologram image, the player may leave the actual character vulnerable to attack from others. In yet another alternative, once the hologram mode is activated, the player can only control the hologram image, and the actual player character remains stationary.

According to another aspect, while the player character cannot take damage as a result of the hologram being hit by enemy fire, the hologram may be destroyed, e.g., because it gets disrupted or destroyed by enemy attack. If the hologram is hit, the player character may take no damage, but the hologram may be destroyed and the player character might be required to wait a predetermined amount of time before initiating or creating another hologram. Or if the hologram had traveled far, the player character might be required to create a new hologram at his or her present location, thereby requiring the new hologram to also travel the same distance to reach the location where the previous hologram was destroyed. The various behavior characteristics of the hologram may be implemented as a separate hologram routine instantiated when the hologram function is activated.

The features described above are preferably encoded in computer software as executable instructions that can be executed on a computing device, such as a personal computer or video game console, to result in the display of the screens shown in the figures. The executable instructions may be stored on a computer-readable medium, such as one or more computer disks, RAMs, CD-ROMs, DVDs, game cartridges, etc. Also, although various features are described above, it is not necessary to practice them all in the same embodiment. Instead, various combinations and subcombinations may be implemented as desired, and the scope of the present invention should only be limited by the claims that follow.

Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims

  1. One or more computer readable storage device storing executable instructions for performing a video game method of representing characters on a radar image displayed on a video output device, said method comprising steps of: determining a first simulated noise level associated with a first object in a simulated environment operating under control of the video game; and displaying on the radar image, said radar image corresponding to a first character, a first radar blip corresponding to the first object, said first radar blip having a first characteristic based on the first simulated noise level associated with the first object; determining a second simulated noise level associated with the first object in the simulated environment operating under control of the video game, wherein said second simulated noise level is determined to be louder than said first simulated noise level; and displaying on the radar image corresponding to the first character, a second radar blip corresponding to the first object, said second radar blip having a first characteristic based on the second simulated noise level associated with the first object, wherein the first characteristic of the first radar blip comprises a first amount of time based on the first simulated noise level, wherein the first characteristic of the second radar blip comprises a second amount of time based on the second simulated noise level, said second amount of time being longer than said first amount of time, wherein displaying the first radar blip comprises displaying the first radar blip for the first amount of time, and wherein displaying the second radar blip comprises displaying the second radar blip for the second amount of time.
  2. The computer readable storage media of claim 1, wherein the first simulated noise level corresponds to a first weapon being used by a second character in the simulated environment, and wherein the second simulated noise level corresponds to a second weapon being used by the second character in the simulated environment.
  3. The computer readable storage media of claim 1, wherein the first simulated noise level corresponds to a first mode of a first weapon being used by a second character in the simulated environment, and wherein the second simulated noise level corresponds to a second mode of the first weapon being used by the second character in the simulated environment.
  4. The computer readable storage media of claim 1, wherein the first simulated noise level corresponds to a weapon in use by a second character in the simulated environment of the video game, and wherein the first radar blip corresponding to the first object corresponds to an approximate current location of the second character.
  5. The computer readable storage media of claim 4, further comprising the step of, while the first radar blip is displayed on the radar image, maintaining the first radar blip to correspond to the approximate current location of the second character as the second character moves in the simulated environment.
  6. The computer readable storage media of claim 1, wherein the first simulated noise level corresponds to a weapon in use by a second character in the simulated environment of the video game, and wherein the first radar blip corresponding to the first object corresponds to a point of impact of a projectile fired from the first weapon.
  7. One or more computer readable storage device storing executable instructions for performing a video game method of representing characters on a radar image displayed on a video output device, said method comprising steps of: providing an acoustic radar system to a first character in a simulated environment of the video game, said acoustic radar system identifying a location of a second character in the simulated environment based on a noise level associated with the second character; receiving first input indicating said second character has fired a first type of projectile in the simulated environment; displaying, on the acoustic radar system of the first character, a first radar blip corresponding to a location of impact of the first type of projectile; receiving second input indicating said second character has fired a second type of projectile in the simulated environment; and displaying, on the acoustic radar system of the first character, a second radar blip corresponding to a location from which the second character fired the second type of projectile, wherein said first type of projectile comprises a decoy bullet, and wherein said second type of projectile comprises a non-decoy bullet, and wherein said decoy bullet is fired by a first operational state of a first weapon and said non-decoy bullet is fired by a second operational state of the first weapon.
  8. The computer readable storage media of claim 7, the method further comprising the step of providing an audible report at the location of impact within the simulated environment of the video game.
  9. The computer readable storage media of claim 7, wherein the receiving and displaying steps occur in time proximity such that a player of the video game perceives the receiving and displaying steps to occur simultaneously.
  10. The computer readable storage media of claim 7, the method further comprising causing a time delay between the receiving and displaying steps.
  11. The computer readable storage media of claim 7, further comprising steps of: attributing a first damage amount based on the first type of projectile; and attributing a second damage amount, higher than said first damage amount, based on the second type of projectile.
  12. One or more computer readable storage device storing executable instructions for performing a video game method of representing characters on a radar image displayed on a video output device, said method comprising steps of: providing a radar system to a first character in a simulated environment of the video game, wherein said first character's radar system, when in a normal operation state, identifies on the radar image a location of a second character in the simulated environment relative to a location of the first character in the simulated environment; receiving first input indicating said second character has performed a predetermined action directed at the first character in the simulated environment; placing the first character's radar system in a jammed state for a predetermined amount of time, wherein when in said jammed state, said radar system does not identify the location of the second character; and while the first character's radar system is in the jammed state, displaying an indication on a non-jammed radar system of another character that the first character's radar system is in the jammed state.
  13. The computer readable storage media of claim 12, wherein said predetermined action comprises the second character firing a predetermined weapon at the first character in the simulated environment of the video game.
  14. The computer readable storage media of claim 13, wherein the predetermined weapon comprises a secondary mode of a rifle, said rifle having a primary mode providing a long-range sniper capability.
  15. The computer readable storage media of claim 12, wherein said predetermined amount of time is approximately 10 seconds.
  16. The computer readable storage media of claim 12, the method further comprising the step of requiring the first character's radar system to be in the normal operation state for a predetermined amount of time prior to entering the jammed state a second time.
