U.S. Pat. No. 7,094,153

VIRTUAL SPACE CONTROL METHOD

Assignee: Sony

Filed: March 6, 2002

Patent Arcade analysis

U.S. Patent No. 7,094,153: Virtual space control method

 

Issued Aug. 22, 2006, to Sony

 

Summary:

The ‘153 patent describes a video game system in which the camera viewpoint changes in response to the character’s movement. Under this invention, the player has full control over his character. Whenever the character completes a certain objective, the camera is programmed to shift to a different viewpoint so that the player has a better view of the action at hand. What changes is not the character’s own first-person view but the camera’s view of the whole world, oriented to follow the character’s head.

Abstract:

A program execution apparatus moves the fixation point in a virtual space in response to a change in the direction of a prescribed part of a virtual character in the virtual space. When the program execution apparatus moves the fixation point in a virtual space in response to a change in the direction of a prescribed part of a virtual character in the virtual space, it causes the occurrence of a prescribed object in the virtual space, thereby achieving a video game having a high level of reality and an improved level of entertainment.

Illustrative Claim:

1. A virtual space control method comprising the steps of: changing an orientation of a prescribed part of a virtual character in a virtual space; changing a screen image in response to the change in orientation of the prescribed part, wherein the screen image represents a virtual field of view of the virtual space defined by a viewpoint other than a viewpoint of the virtual character and including a whole image of the virtual character; moving the virtual character in the virtual space; and detecting an occurrence of a prescribed event, wherein the step of changing the orientation includes a step of changing the orientation of the prescribed part in response to the occurrence of the prescribed event, wherein the step of changing the screen image has a step of changing the screen image in response to the movement of the virtual character and to the change in orientation of the prescribed part, and wherein the prescribed event is selected from a plurality of events occurring in the virtual space.

Illustrative Figure


Description


DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Various embodiments of the present invention will be described with reference to the accompanying drawings. It is to be noted that the same or similar reference numerals are applied to the same or similar parts and elements throughout the drawings, and the description of the same or similar parts and elements will be omitted or simplified.

General Description of a Video Game

In this video game, in response to, for example, operation of a controller or to the occurrence of a prescribed event described further below, one part (element) of a game character is caused to move and, in response to the change in the direction of that part of the character, a fixation point is caused to move, thereby causing a virtual field of view (not of the character, but of the camera) in the game space to move. Details of the configuration and functions of a video game machine and the controller thereof for achieving this video game are presented later.

FIG. 1 through FIG. 3 show the condition in virtual fields of view 200C, 200R, and 200L (with a camera field of view of, for example, 120°) displayed on a monitor screen of a video game machine, in which there are disposed a main game character 201 (character object) operable by a controller, and other objects (for example, house objects 203 and 207, building objects 204 and 206, a tree object 205, and so on). The camera 210 in FIG. 1 through FIG. 3 is a virtual camera that captures the virtual fields of view (camera fields of view) 200C, 200R, and 200L, respectively. The camera 210 is not, it should be noted, displayed within the virtual space of the actual game.

One part of the character 201 (in this example, the head object 202) can be operated independently of the body object by a predetermined actuator on the controller. It will be understood that, in the present invention, the part that is operable by the controller is not limited to the head object 202, and can alternatively be, for example, a hand object, a leg object, a hip object, a finger object, an eye object, a mouth object, or the like, and in fact can be any constituent part of the character's body. The use of the head object herein is only an example for the purpose of describing an embodiment of the present invention.

The angle range of operation of the above-noted head object, that is, the range of angles through which the front of the face of the character 201 can be moved, is established as 60° leftward and rightward and 45° upward and downward with respect to the frontward direction of the body object of the character 201 (the frontward direction of the character before the orientation of the head object is changed). The 60° leftward and rightward and 45° upward and downward serve as operating limits when the orientation of the head object 202 is changed. Therefore, even if there is a large amount of operation of a prescribed actuator, the head object 202 will not be operated so as to exceed these operational limit angles.

The amount of operation of the prescribed actuator of the controller affects the operation speed of the head object 202. For example, in the case in which there is a large operation of the prescribed actuator, the operation speed of the head object is high, but when the amount of operation is small, the head object moves slowly. The prescribed actuator of the controller will be described in further detail below.
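As a rough sketch of the two rules above, the per-frame update below clamps the head object's yaw and pitch to the 60°/45° operational limits and scales rotation speed by the amount of actuator deflection. The function and constant names, and the top speed of 3° per frame, are hypothetical choices for illustration, not taken from the patent.

```python
# Operational limits from the description: 60 deg left/right, 45 deg up/down.
YAW_LIMIT = 60.0
PITCH_LIMIT = 45.0
MAX_DEG_PER_FRAME = 3.0  # assumed top rotation speed at full deflection


def update_head_orientation(head_yaw, head_pitch, stick_x, stick_y):
    """Advance the head object's yaw/pitch by one frame.

    stick_x and stick_y are the analog actuator deflections in
    [-1.0, 1.0]; larger deflections rotate the head faster, as the
    description states.
    """
    head_yaw += stick_x * MAX_DEG_PER_FRAME
    head_pitch += stick_y * MAX_DEG_PER_FRAME
    # Clamp so the head never exceeds the operational limit angles.
    head_yaw = max(-YAW_LIMIT, min(YAW_LIMIT, head_yaw))
    head_pitch = max(-PITCH_LIMIT, min(PITCH_LIMIT, head_pitch))
    return head_yaw, head_pitch
```

A full deflection near the limit simply saturates at the limit angle, matching the patent's statement that a large operation cannot push the head past 60° or 45°.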

When the orientation of the head object 202 (orientation of the front of the face) is changed, the orientation of the neck object, the orientation of the chest object, and the orientation of the hips object are all influenced by the movement of the head object 202. The degree to which the neck, chest, and hips objects are influenced by the movement of the head object 202 is 60% for the neck object, 20% for the chest object, and 20% for the hips object. As a specific example, if the head object 202 (front of the face) is moved so as to be at an angle of 60° either to the left or to the right with respect to the frontward direction of the body object before changing the orientation of the head object 202 (this is called the frontward orientation before change), the hips object has its orientation changed 12° from the frontward direction before the above change, the chest object has its orientation changed a further 12° (for a total of 24° from the frontward direction before the change), and the neck object has its orientation changed a further 36° (for a total of 60° from the frontward direction before the change). In a similar manner, if the head object 202 is moved so as to be at an angle of 45° either upward or downward with respect to the frontward direction before changing the orientation of the head object 202, the hips object has its orientation changed 9° from the frontward orientation before the change, the chest object has its orientation changed a further 9° (for a total of 18° from the frontward orientation before the change), and the neck object has its orientation changed a further 27° (for a total of 45° from the frontward orientation before the change).
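The 20%/20%/60% distribution can be expressed as cumulative angles per joint. The small helper below, a sketch with hypothetical names, reproduces the 12°/24°/60° and 9°/18°/45° figures given above:

```python
# Influence ratios from the description: hips 20%, chest 20%, neck 60%.
RATIOS = {"hips": 0.20, "chest": 0.20, "neck": 0.60}


def joint_angles(head_angle):
    """Return each joint's cumulative angle for a given head angle.

    The hips turn 20% of the head angle, the chest a further 20%
    (40% cumulative), and the neck a further 60% (100% cumulative),
    so the neck ends up at the full head angle.
    """
    hips = RATIOS["hips"] * head_angle
    chest = hips + RATIOS["chest"] * head_angle
    neck = chest + RATIOS["neck"] * head_angle
    return {"hips": hips, "chest": chest, "neck": neck}
```

For a 60° head turn this yields 12°, 24°, and 60°, and for a 45° tilt it yields 9°, 18°, and 45°, exactly as in the specific example.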

Additionally, as described above, if the orientation of the head object 202 (orientation of the face) of the character 201 is changed in response to an operation of the controller, the virtual field of view (not the character field of view, but the camera field of view) is moved in response to the change in the orientation of the head object 202 (face orientation).

FIG. 1 shows an example of a virtual field of view for the case in which the orientation of the head object 202 of the character 201, as shown by the arrow 220C in the drawing, is the same orientation as the frontward orientation of the body object. In the example shown in FIG. 1, the field of view direction of the camera 210, as shown by the arrow 221C in the drawing, is the same as the orientation of the head object 202, this being the frontward direction of the body object.

In FIG. 2, we see the virtual field of view for the case in which the controller is operated by the player, so that the orientation of the head object 202 of the character 201, as shown by the arrow 220R in the drawing, is caused to change toward the right, relative to the frontward orientation of the body object. In the case of FIG. 2, the field of view direction of the camera 210, as indicated by the arrow 221R in the drawing, is the same direction (right side) as the direction of the orientation of the head object 202 (face orientation). In this condition, the image showing the virtual field of view, compared with the case of FIG. 1, shows objects farther to the right side (for example, the building object 206 in FIG. 2), but in this image objects on the left side (for example, part of the house object 203 in FIG. 2) are hidden. That is, the image that shows the virtual field of view of the camera 210 can be thought of as what is seen when the character 201 turns his head toward the right side, so that things farther to the right are shown.

Additionally, FIG. 3 shows an example of the virtual field of view for the case in which the orientation of the head object 202 of the character 201, as shown by the arrow 220L in the drawing, is changed toward the left side with respect to the frontward direction of the body object. In this condition in FIG. 3, the image showing the virtual field of view, compared to the case of FIG. 1, shows objects farther to the left side (for example, the house object 207 in FIG. 3), but in this image objects on the right side (for example, part of the building object 204 in FIG. 3) are hidden. That is, the image that shows the virtual field of view of the camera 210 can be thought of as what is seen when the character 201 turns his head toward the left side, so that things farther to the left are shown.

It should be noted that drawings illustrating the movement of the head object 202 of the character 201 upward and downward have been omitted. For example, in the case in which the head object 202 is moved upward, the image showing the virtual field of view, compared with that shown in FIG. 1, shows the scene farther upward (for example, clouds and the like if outdoors, or the ceiling if indoors). On the other hand, if the head object 202 is moved downward, the image showing the virtual field of view, compared to that of FIG. 1, shows the scene farther downward (for example, the ground if outdoors or the floor if indoors).

As described above, the movement of the virtual field of view in response to a change in the direction of the head object 202 of the character 201 is made by moving the fixation point of the camera 210. That is, the movement of the virtual field of view in the game space is achieved by shifting the fixation point of the camera 210, set with respect to the center of the character, in accordance with the direction of operation of the head object 202. The change of direction of the field of view of the camera can be linked to a change in the degree of perspective in the game screen, or to a change in the movement of another character.
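One plausible way to realize "shifting the fixation point in accordance with the head direction" is to offset the fixation point from the character's center along the head's facing direction. The sketch below assumes a y-up coordinate system and a fixed look-ahead distance; all names and the distance value are hypothetical, not from the patent.

```python
import math


def fixation_point(center, head_yaw_deg, head_pitch_deg, distance=10.0):
    """Shift the camera's fixation point from the character's center
    in the direction the head object is facing.

    center is (x, y, z); yaw rotates about the vertical (y) axis,
    measured from +z, and pitch tilts up/down. distance is an
    assumed look-ahead length.
    """
    yaw = math.radians(head_yaw_deg)
    pitch = math.radians(head_pitch_deg)
    cx, cy, cz = center
    return (cx + distance * math.cos(pitch) * math.sin(yaw),
            cy + distance * math.sin(pitch),
            cz + distance * math.cos(pitch) * math.cos(yaw))
```

With the head facing frontward (yaw and pitch both 0°) the fixation point sits straight ahead of the character, and turning the head 60° to the right swings the fixation point, and hence the camera field of view, to the right, as in FIG. 2.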

The change in the orientation of the head object 202 of the character 201 can further be linked to an operation to change the speed or direction of movement of the character 201 itself, by means of a controller. In the case in which the movement or the like of the character 201 is linked to a change in orientation of the head object 202, it is possible, for example, to achieve a game within a game space in which there exist items and characters that are not possible for a player to discover unless the player operates the controller so as to move the character while changing the orientation of the head object 202 of the character 201.

As described above, a character having a part that is operated is not limited to the above-described main character 201, but can also be a non-player character (NPC) that is not operable by the player. A method similar to that used to move a single part of the main character 201 when it is in the playable condition can be applied to the movement of a part of an NPC.

Additionally, it is possible, for example, to have one part of a character operate at the point in time at which, in accordance with progress in a game or responsive to operation of the controller by the player, a prescribed event occurs. That is, when a prescribed event accompanies progress in the game, for example, the character head object is made to face a prescribed orientation. In the case in which the head object of a character, for example, is caused to face a prescribed orientation in response to the occurrence of a prescribed event, this can be achieved similarly to the case described above, by having the movement of the fixation point of the camera field of view cause the movement of the virtual field of view. A character having a part that is caused to operate in response to the occurrence of a prescribed event can be not only the main character 201, but also an NPC.
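Turning a head object toward a prescribed event reduces to computing the yaw angle from the character to the event's location. The helper below is a minimal sketch of that step, with hypothetical names; a game would then feed the angle through the same limit and speed logic used for player control.

```python
import math


def yaw_toward(character_pos, event_pos):
    """Yaw angle (degrees) that turns a character's head toward the
    position where a prescribed event occurred.

    Positions are (x, z) on the ground plane; 0 degrees means the
    event lies straight ahead along +z, positive angles to the right.
    """
    dx = event_pos[0] - character_pos[0]
    dz = event_pos[1] - character_pos[1]
    return math.degrees(math.atan2(dx, dz))
```

For the soccer example below, the same computation would point an NPC's head object at the main character to represent eye contact.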

The above-noted prescribed event can be, for example, a scene in a soccer game or the like in which there is eye contact between the main character and an NPC. In this manner, if a prescribed event occurs when there is eye contact between the main character and an NPC in the soccer game or the like, and the head object of the NPC or the main character is operated in response thereto, it is possible to achieve an extremely realistic game, in which game operation can be built around the assumption of team play.

As described above, the movement of one part of the character 201 (for example, the orientation of the head object 202) is controllable in response to a controller operation in this video game. Further, by causing the fixation point of the camera 210 to move in correspondence with a change in the orientation of the head object 202, so that the virtual field of view in the game space (not the character's viewpoint, but the camera's field of view) is moved, there is a dramatic improvement in the sense of reality of the game, in comparison to previous games, in which it was merely possible to operate a part of a character by operation of a controller.

This video game not only can separately control the movement of the character 201 and the movement of one part of the character 201, but can also link the movement of the character 201 and the movement of the one part of the character 201. With this video game, therefore, it is possible, for example, to cause the character 201 to look back while running, or to walk while glancing to the left and right.

Additionally, this video game enables game play that is possible only in the case in which the movement of the character 201 is linked to a change in the orientation of the head object 202. This allows, for example, more entertaining game play in which there is an item or the like that cannot be discovered unless the orientation of the head object 202 of the character 201 is caused to change while the character is moved.

Video Game System

FIG. 4 shows the general configuration of a video game system 1 that provides a video game such as described above.

The video game system 1 of FIG. 4 has a video game machine 2, as one example of a program execution apparatus, which not only executes the above-described video game but also can perform playback and the like of a movie, for example; a controller 20, connected to the video game machine 2, which is an operating terminal operable by a user (player); and a television monitor apparatus 10, which displays game content and images or the like and also outputs audio.

General Description of the Video Game Machine

The video game machine 2 has memory card slots 8A and 8B, controller ports 7A and 7B, a disc tray 3, buttons 4 and 9, an IEEE (Institute of Electrical and Electronics Engineers) 1394 connector 6, a USB (Universal Serial Bus) connector 5, and the like. The memory card slots 8A and 8B are configured so as to removably accept a memory card 26. The controller ports 7A and 7B are configured to removably accept a connector 12 of a cable 13 connected to the controller 20. The disc tray 3 is configured to removably load an optical disc such as a DVD-ROM or a CD-ROM. The button 9 is an open/close button for opening and closing the disc tray 3. The button 4 is an on/standby/reset button, for switching the power supply on, placing it in standby, and performing a reset of a game. Although not shown, the video game machine 2 is provided on its rear panel with such elements as a power switch, an audio/video output connector (AV multi-output connector), a PC card slot, an optical digital output connector, an AC line input connector, and the like.

The video game machine 2 executes a game application program (hereinafter referred to simply as a game program) read out from a storage medium such as the above-noted CD-ROM or DVD-ROM optical disc, based upon instructions from a player received via the controller 20. The video game machine 2 can also execute a game program that has been downloaded via various communication lines (transfer media), such as a telephone line, a LAN, a cable TV line, a communication satellite, or the like.

The video game machine shown in FIG. 4 can have two controllers 20 connected at its controller ports 7A and 7B. By connecting two controllers to the video game machine 2, it is possible for two players to play a game. A memory card 26 installed in the memory card slots 8A and 8B of the video game machine 2 has stored therein various game data generated by the execution of video games, thereby enabling a player to restart play of a game in progress by reading out saved game data and using it to continue the game.

Additionally, the video game machine 2 not only executes a video game based on a game program, but also, for example, can be used for playback of audio data stored on a CD, or video and audio data (such as for a movie or the like) stored on a DVD, as well as for other operations, based on a variety of application programs. A driver program for playing back a DVD is stored, for example, in a memory card 26. The video game machine 2, therefore, reads out the driver program for playback of a DVD, and plays back the DVD in accordance with that driver program.

General Description of the Controller

The controller 20 has a left grip 35, a right grip 36, a left operating part 21, a right operating part 22, a left analog operating part 31, a right analog operating part 32, a first left pushbutton 23L, a first right pushbutton 23R and, although not illustrated, a second left pushbutton and a second right pushbutton. The left grip 35 is a part that is gripped and held within the left hand, and the right grip 36 is a part that is gripped and held within the right hand, the player operating each operating part with the thumbs of the left and right hands, respectively. The left analog operating part 31 and the right analog operating part 32 are operated as analog joysticks with the left and right thumbs in the state where the grips 35 and 36 are gripped with the left and right hands of the player. The first left pushbutton 23L and the second left pushbutton (not shown in the drawing) disposed therebelow are operated as pushbuttons by, for example, the index finger and middle finger of the player's left hand, and the first right pushbutton 23R and the second right pushbutton (not shown in the drawing) disposed therebelow are operated as pushbuttons by, for example, the index finger and middle finger of the player's right hand.

The above-noted left operating part 21 is provided with “up”, “down”, “left”, and “right” direction keys, used, for example, when a player moves not a part of a character but rather the entire character on the screen. The “up”, “down”, “left”, and “right” direction keys are used not only to issue up, down, left, and right direction commands, but can also be used for issuing commands for an oblique direction. For example, if the up key and the right key are pressed simultaneously, it is possible to issue a command for the upper-right direction. The same is true of the other direction keys. For example, if the down direction key and the left direction key are pressed simultaneously, a command is given for the lower-left direction.
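The combination of direction keys into oblique commands can be sketched as a simple vector sum, with simultaneous adjacent presses producing a diagonal. The function name and the (dx, dy) convention are hypothetical choices for illustration.

```python
def dpad_direction(up, down, left, right):
    """Combine pressed direction keys into a movement direction.

    Pressing two adjacent keys (e.g. up + right) yields an oblique
    command, as the description notes; opposite keys cancel out.
    Returns (dx, dy) with x positive to the right, y positive upward.
    """
    dx = (1 if right else 0) - (1 if left else 0)
    dy = (1 if up else 0) - (1 if down else 0)
    return (dx, dy)
```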

The right operating part 22 has four command buttons (these buttons being respectively marked by engraved □, Δ, ×, and ∘ marks), to which different functions are assigned by a game application program. For example, a menu item selection function is assigned to the Δ button, a cancel function for canceling a selected item is assigned to the × button, a specification function for establishing selected items is assigned to the ∘ button, and a function for specifying display or non-display of a table of contents or the like is assigned to the □ button. It will be understood that these are merely examples of pushbutton or key assignments, and that various other assignments can be made.

The left analog operating part 31 and the right analog operating part 32 each have a rotational actuator, which can be tilted in an arbitrary direction about a rotational pivot point on an operating axis, and a variable analog output means, which outputs a variable analog value responsive to an operation of the rotational actuator. The rotational actuator is mounted to the end part of an operating shaft mounted so as to be restored to a neutral position by a resilient member, so that when not tilted by a player, it is restored to the upright attitude (the condition in which there is no tilt) and held in that position (reference position). The variable analog output means has, for example, a variable resistance element, the resistance value of which changes in response to operation of the rotational actuator. When the left analog operating part 31 and the right analog operating part 32 are operated so as to impart a tilt thereto, coordinate values on an XY coordinate system are detected in response to the amount of inclination with respect to the reference position and the direction of the inclination, these coordinate values being sent to the video game machine 2 as the operation output.
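The conversion from tilt amount and tilt direction into XY coordinate values is essentially a polar-to-Cartesian mapping. The sketch below adds a small dead zone as an assumed software analogue of the resilient member restoring the stick to neutral; the names and the threshold value are hypothetical.

```python
import math


def stick_to_xy(tilt_amount, tilt_angle_deg, dead_zone=0.08):
    """Convert a rotational actuator's tilt into XY coordinate values.

    tilt_amount is the inclination relative to the reference
    (upright) position, normalized to [0.0, 1.0]; tilt_angle_deg is
    the direction of the inclination. Below dead_zone (an assumed
    threshold) the stick is treated as restored to neutral.
    """
    if tilt_amount < dead_zone:
        return (0.0, 0.0)
    a = math.radians(tilt_angle_deg)
    return (tilt_amount * math.cos(a), tilt_amount * math.sin(a))
```

The resulting pair corresponds to the coordinate values the controller sends to the video game machine 2 as its operation output.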

In this case, the video game machine 2 controls the movement of a game character on a game screen in response to the operation output signal from the left analog operating part 31 and the right analog operating part 32. That is, a prescribed actuator function is assigned to the right analog operating part 32, so that the video game machine 2, in response to an operation output from the right analog operating part 32, causes one part of the character 201 during a game (for example, the head object 202) to move independently of the body object. The left analog operating part 31 has assigned to it a function that is the same as the up, down, left, and right direction keys.

The controller 20 is further provided with a mode selection switch 33, a light indicator 34, a selection button 25, a start button 24, and the like. The mode selection switch 33 selects a mode to activate (analog operating mode) or deactivate (digital operating mode) the functions of the left and right operating parts 21 and 22 or the left and right analog operating parts 31 and 32. The light indicator 34 is an LED (light-emitting diode) or the like for the purpose of notifying a player of the above-noted selected operating mode by a light. The start button 24 is a pushbutton that allows a player to give a command to start a game or playback, or to pause. The selection button 25 is for a player to give a command to cause display of a menu or an operating panel on the screen of the monitor apparatus 10. If the mode selection switch 33 is used to select the analog operating mode, the light indicator 34 is controlled so as to flash, and the left and right analog operating parts 31 and 32 become operative; if the digital operating mode is selected, the light indicator 34 is extinguished and operation of the left and right analog operating parts 31 and 32 is disabled.

When the various buttons and operating parts provided on the controller 20 are operated by the player, the controller 20 generates an operating signal corresponding to the operation, this operating signal being sent to the video game machine 2 via the cable 13, the connector 12, and a controller port 7.

In addition, the controller 20 is provided, within the left and right grips 35 and 36, with a vibration generating mechanism, which generates a vibration, for example, by causing a weight that is eccentrically disposed with respect to a motor to rotate, thereby enabling operation of the vibration generating mechanism in response to a command from the video game machine 2. That is, there is a function that, by causing the vibration generating mechanism to operate, imparts vibration to the hand of the player.

Internal Configuration of the Video Game Machine

Next, an outline of the internal circuitry configuration of the video game machine will be described using FIG. 5.

The video game machine 2 has a main CPU 100, a graphic processor unit (GPU) 101, an I/O processor (IOP) 104, an optical disc playback unit 112, a main memory (RAM) 105, a MASK ROM 106, a sound processor unit (SPU) 108, and the like. The main CPU 100 performs control of signal processing and of the internal constituent elements, in accordance with various application programs which realize the foregoing video games. The graphic processor unit 101 performs image processing. The I/O processor (IOP) 104 performs interface processing between the external and internal devices, and processing to maintain downward compatibility. The main memory 105 has the functions of a working area for the main CPU 100, and of a buffer for temporary storage of data read out from an optical disc. The MASK ROM 106 stores an initialization program for the video game machine 2, and a program (called a handler) for interrupting the processing of the CPU 100 and I/O processor 104 when various switches and buttons are pressed, so as to execute processing responsive to the operation of the switch or button. The SPU 108 performs sound signal processing. The optical disc playback unit 112 performs playback of an optical disc, such as a DVD or CD or the like, onto which an application program or multimedia data is stored. The optical disc playback unit 112 is made up of a spindle motor, an optical pickup, an RF amplifier 113, a slide mechanism, and the like. The spindle motor causes the DVD or other optical disc to rotate. The optical pickup reads a signal recorded on an optical disc. The slide mechanism causes the optical pickup to move in a radial direction with respect to the optical disc. The RF amplifier 113 amplifies an output signal from the optical pickup.

The video game machine 2 further has a CD/DVD digital signal processor 110 (hereinafter referred to simply as the DSP 110), a driver 111, a mechanical controller 109, and a card-type connector 107 (hereinafter referred to simply as the PC card slot 107). The DSP 110 converts the output signal from the RF amplifier 113 of the optical disc playback unit 112 to binary form, and applies error correction processing (CIRC processing) and expansion decoding processing thereto, so as to play back the signal recorded on the optical disc. The driver 111 and the mechanical controller 109 perform rotational control of the spindle motor of the optical disc playback unit 112, and focus and tracking control of the optical pickup. The PC card slot 107 is an interface device for the purpose of making a connection, for example, to a communication card or to an external hard disk drive.

The above-noted constituent elements are mutually connected via bus lines 102, 103, and the like. The main CPU 100 and the GPU 101 are connected by means of a dedicated bus. The connection between the main CPU 100 and the I/O processor 104 is made by an SBUS. The connections between the I/O processor 104 and the DSP 110, the MASK ROM 106, the sound processor unit 108, and the PC card slot 107 are likewise made by the SBUS.

The main CPU 100, by executing an initialization program or an operating system program for the main CPU stored in the MASK ROM 106, controls the overall operation of the video game machine 2. The main CPU 100, by executing various application programs, including the game application program for executing the video game, controls various operations in the video game machine 2. These various application programs are read out from an optical disc such as a CD-ROM or DVD-ROM and loaded into the main memory 105, or downloaded via a communication network.

The I/O processor 104, by executing an operating system program for the I/O processor stored in the MASK ROM 106, performs such tasks as input and output of data at the PAD and memory card connectors 7A, 7B, 8A, and 8B, input and output of data at the USB connector 5, input and output of data at the PC card slot, and data protocol conversion of this data. Note that the MASK ROM 106 stores a device ID of the video game machine 2.

The graphic processor unit 101 functions as a geometry transfer engine, for performing coordinate conversion processing, and as a rendering processor, and performs plotting in accordance with graphic commands from the main CPU 100, storing the plotted graphics in a frame buffer. For example, in the case in which various application programs stored on an optical disc use so-called three-dimensional (3D) graphics, the graphic processor unit 101, serving as a geometry transfer engine, performs geometry calculation processing so as to perform coordinate calculation of polygons for forming three-dimensional objects. Additionally, by performing rendering processing, the graphic processor unit 101 performs various calculations for generating an image obtained by photographing such three-dimensional objects with a virtual camera, that is, perspective conversion for rendering (i.e., calculation of the coordinate values of the vertices of the individual polygons composing a three-dimensional image projected onto a virtual camera screen), with the final calculated image data being written into a frame buffer. The graphic processor unit 101 outputs a video signal responsive to these generated images.
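The perspective conversion described above, projecting a polygon vertex onto the virtual camera screen, can be illustrated with a minimal pinhole-camera sketch. This is a simplified model with hypothetical names: the real hardware would also apply the camera's rotation, clipping, and viewport scaling, which are omitted here.

```python
def project_vertex(vertex, camera_pos, focal_length=1.0):
    """Perspective-project a polygon vertex onto the virtual camera
    screen.

    The camera sits at camera_pos looking down +z; a vertex at depth
    z maps to screen coordinates scaled by focal_length / z, so more
    distant vertices land closer to the screen center.
    """
    x = vertex[0] - camera_pos[0]
    y = vertex[1] - camera_pos[1]
    z = vertex[2] - camera_pos[2]
    if z <= 0:
        return None  # behind the camera; would be clipped
    return (focal_length * x / z, focal_length * y / z)
```

Running this for every vertex of every polygon yields the projected coordinate values that, in the patent's description, are written into the frame buffer after rendering.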

The sound processor unit 108 has functions such as an ADPCM (adaptive differential pulse code modulation) decoding function and an audio signal playback function. The ADPCM decoding function decodes audio data by adaptive prediction. The audio playback function reads waveform data stored in a sound buffer (not shown in the drawing) that is built into or externally connected to the SPU 108, so as to play back and output audio signals such as sound effects. The waveform data stored in the sound buffer is modulated so as to generate various sound waveform data. That is, the SPU 108, based on instructions from the main CPU 100, plays back audio signals of music and sound effects from waveform data stored in the sound buffer, operating as a so-called sampling sound source.

In an entertainment system 2 having the above-noted configuration, when, for example, the power supply is switched on, an initialization program for the main CPU 100 and an operating system program for the I/O processor 104 are each read out of the MASK ROM 106, and the main CPU 100 and the I/O processor 104 execute these respective programs. By doing this, the main CPU 100 performs overall management of the various parts of the video game machine 2, and the I/O processor 104 controls input and output with the memory card 26. When the main CPU 100 executes the operating system program, after performing initialization processing such as verification of operating conditions, it controls the optical disc playback section 112 so as to read out an application program stored on an optical disc, loads it into the main memory 105, and executes it. By executing the application program, the main CPU 100, in response to instructions from the player through the controller 20 via the I/O processor 104, controls the graphic processor unit 101 and the sound processor unit 108, so as to control image display and the playback of sound effects and music. The case in which a movie or the like stored on an optical disc is played back is similar: the main CPU 100 controls the graphic processor unit 101 and the sound processor unit 108 in accordance with instructions (commands) from the player through the controller 20 via the I/O processor 104, so as to control the image display of the movie and the playback of sound effects, music, and the like.

General Configuration of a Game Program

A game program for creating a video game in which it is at least possible to move one part of a character, and also to move the camera field of view, in response to operation of the controller 20 is described below.

The game program is recorded on a recording medium such as a DVD-ROM, CD-ROM, or other optical disc, or in a semiconductor memory, and can be downloaded via a transfer medium such as a communication network. It has a data structure such as that shown, for example, in FIG. 6. The data structure shown in FIG. 6 merely presents conceptually the main parts of the program section and data section included in a game program, and does not indicate the configuration of an actual program.

As shown in FIG. 6, a game program 330 can be divided into a program section 340, for execution by the main CPU 100 of the video game machine 2, and a data section 360, containing various data used in the execution of the video game.

The data section 360 minimally has polygon and texture data 361 and sound source data 362 as the various data used in executing the video game.

The polygon and texture data 361 is data for generating the polygons and textures that form main characters, NPC objects, background images, and other objects during a video game. The sound source data 362 is waveform data used when the sound processor unit 108 of the video game machine 2 shown in FIG. 5 generates game sounds, music, and sound effects.

The program section 340 is a program for executing the video game, and minimally has a progress control program 341, a disc control program 342, a controller management program 343, an image control program 344, a sound control program 345, a fixation point control program 346, a character control program 347, and a saved data management program 349.

The progress control program 341 controls the progress of the video game. The disc control program 342 controls data readout from the optical disc or from an HD drive in response to the start and progress of a video game. The controller management program 343 manages input signals responsive to the pushing of buttons or operation of the left and right analog operating parts of the controller 20 by a player, and manages the controller 20 operating mode, vibration generation, and the like. The image control program 344 generates a game image and displays it on a television monitor. The sound control program 345 generates and outputs sounds and music during a game.

The fixation point control program 346 controls the fixation point when the virtual field of view (camera field of view) is caused to move. The character control program 347 controls the operation and the like of game characters; it encompasses a part control program 348, which is a control program for moving one part of a character, for example the head object 202 of the character 201 as described earlier. In addition, the saved data management program 349 causes saving to and readout from the memory card 26 of data such as game point counts and intermediate game data generated by execution of the video game, and manages this saved data.
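The division described above can be summarized as a conceptual sketch of the FIG. 6 layout. The identifier names below merely paraphrase the text; this mirrors only the conceptual program/data split, not an actual file format.

```python
# Conceptual layout of game program 330: a program section 340 and a
# data section 360, per the description of FIG. 6.

GAME_PROGRAM_330 = {
    "program_section_340": [
        "progress_control_341",        # controls game progress
        "disc_control_342",            # optical disc / HD drive readout
        "controller_management_343",   # button and analog-stick input
        "image_control_344",           # game image generation and display
        "sound_control_345",           # sounds and music
        "fixation_point_control_346",  # camera fixation point movement
        "character_control_347",       # character operation (includes part control 348)
        "saved_data_management_349",   # memory card save/load
    ],
    "data_section_360": [
        "polygon_and_texture_data_361",
        "sound_source_data_362",
    ],
}
```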

Flow of Game Program Execution

The flow of the main processing in the game program shown in FIG. 6 is described below in relation to the flowcharts shown in FIG. 7 and FIG. 8. The flow in the flowcharts described below is achieved by running the various program sections of the game program on the main CPU 100 within the video game machine 2. In the description to follow, the operation of one part of a character (for example the head object 202) and the movement of the camera field of view are controlled in response to operation of the right analog operating part 32 of the controller 20, this being a feature of the present invention.

Flow of Control of One Part of a Character and of the Camera Field of View

FIG. 7 shows the flow of processing when the game program 330, in response to operation of a prescribed actuator (the right analog operating part 32 of the controller 20), moves the head object 202 of the character 201 and also causes movement of the camera field of view.

In FIG. 7, as the processing at step S1, the controller management program 343 monitors the operating condition of the various buttons and operating parts of the controller 20. If a judgment is made at step S1 that the right analog operating part 32 of the controller 20 has been operated, the controller management program 343, as the processing of step S2, makes a judgment as to the direction and amount of operation of the right analog operating part 32.

Next, as the processing of step S3, the character control program 347, based on information indicating the direction and amount of operation of the right analog operating part 32, makes a judgment as to whether the current angle of the head object 202 of the character 201 is within a pre-set operable range or has reached the operating limit angle. If the judgment at step S3 is that the current angle of the head object 202 is within the operable angle range, game program processing proceeds to step S6. If, however, the judgment at step S3 is that the current angle of the head object 202 is at the operating limit angle, the processing of the game program proceeds to step S4.

In the case of proceeding to step S6, the character control program 347 and the part control program 348, in response to the operation direction of the right analog operating part 32, cause operation of the head object 202, and of the neck, chest, and hips objects, the operation of which is influenced by the operation of the head object 202, in respective pre-established proportions. Specifically, the programs 347 and 348 cause operation of the head object 202 and, in response to that operation, cause 60% operation of the neck object, 20% operation of the chest object, and 20% operation of the hips object. Simultaneously, the programs 347 and 348 also control the speed of operation of the head object 202 of the character 201 in response to the amount of operation (analog value) of the right analog operating part 32.
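The pre-established proportions described above can be sketched as follows. This is an illustration, not the patented implementation: the rotation applied to the head object is propagated to the neck, chest, and hips objects in fixed ratios (0.6, 0.2, 0.2, taken from the example in the text), and the function and ratio names are assumptions.

```python
# Ratios by which the head rotation is propagated to linked parts,
# per the 60%/20%/20% example in the text.
PART_RATIOS = {"neck": 0.6, "chest": 0.2, "hips": 0.2}

def distribute_rotation(head_angle_deg):
    """Return the rotation each linked part receives for a head rotation."""
    return {part: head_angle_deg * ratio for part, ratio in PART_RATIOS.items()}
```

For a 30-degree head turn, the neck would turn roughly 18 degrees and the chest and hips roughly 6 degrees each, producing the natural-looking cascaded body twist the proportions are meant to achieve.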

At step S6, the fixation point control program 346, in response to the operating direction of the head object 202 of the character 201, performs processing so as to move the virtual field of view in the game space (that is, so as to move the fixation point of the camera 210). By doing this, movement of the virtual field of view occurs within the game space. After the processing at step S6, the flow of processing returns to step S1.

If the judgment is made at step S3, however, that the angle has reached the operating angle limit, so that the flow of processing jumps to step S4, the character control program 347 makes a judgment as to whether or not the direction of operation of the right analog operating part 32 would cause the operating angle limit of the head object 202 of the character 201 to be exceeded.

At this point, in the judgment processing performed at step S4, if the judgment is made that the direction of operation of the right analog operating part 32 would not exceed the operating angle limit of the head object 202, that is, if it indicates a direction opposite from the direction exceeding the operating angle limit, the processing of the game program proceeds to step S6. On the other hand, if the judgment processing at step S4 indicates that the direction of operation of the right analog operating part 32 is such as to exceed the operating angle limit of the head object 202, processing of the game program proceeds to step S5.

When the processing of step S5 is reached, the character control program 347 and the part control program 348 cause the operation of the head object 202 to stop at the position of the operating angle limit, and the fixation point control program 346 stops the movement of the virtual field of view (movement of the fixation point of the camera 210) in the game space, whereupon return is made to step S1.
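The limit handling in steps S3 through S5 amounts to a clamp on the head angle, sketched below under assumed names and an assumed limit value: the head angle is updated only while the result stays inside the operable range, input that would push past the operating angle limit is discarded, and input in the opposite direction is still honored.

```python
OPERATING_LIMIT_DEG = 60.0  # hypothetical operating angle limit

def update_head_angle(current_deg, delta_deg, limit_deg=OPERATING_LIMIT_DEG):
    """Apply a stick-driven head rotation, clamped to the operating limit.

    At +limit_deg, a positive delta has no effect (the head stops at the
    limit), while a negative delta moves the head back - matching the
    direction check performed at step S4.
    """
    return max(-limit_deg, min(limit_deg, current_deg + delta_deg))
```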

Flow of Character Operation Processing in Response to the Occurrence of an Event

FIG. 8 shows the flow of processing in the case when, upon the occurrence of a prescribed event accompanying progress in the game, the game program 330 moves, for example, one part (the head object) of a character in a prescribed direction. The character in the example of FIG. 8 is not restricted to being a main character, but can also be an NPC.

Referring to FIG. 8, when the progress control program 341 detects the occurrence of a prescribed event with the progress of the game at step S11, it performs, as the processing of step S12, a judgment as to whether or not the character operation control set for the event forcibly moves the head object of the character. If at step S12 the judgment is made that forced operation is to be performed, the processing of the game program proceeds to step S13. If at step S12 the judgment is made, however, that the event does not cause forced movement, the flow of processing proceeds to step S15.

In the case in which processing proceeds to step S13, the character control program 347 and the part control program 348 establish the direction of operation of the head object of the character, after which, as the processing of step S14, the head object of the character is forcibly bent in this operating direction up to the operating angle limit.

If at step S12 the judgment is made, however, that forcible movement is not to be done, so that processing proceeds to step S15, the programs 347 and 348 establish the target coordinates for movement of the head object of the character. The target coordinates in this case can be envisioned as being, for example, desired coordinates on a map of the game space, or the coordinates at which another character exists, in the case in which the orientation of the head object is changed so as to bring the other character into view. In the case in which a character changes the orientation of its head object so as to view another character, the other character that comes into view is not restricted to being an NPC as seen by a main character, but can also be a main character as seen from an NPC.

After the coordinates are established as noted above, the programs 347 and 348, as the processing at step S16, bend the head object of the NPC within its operating angle limit as described above, so that the front of the face of the character is oriented toward the above-noted established coordinates.
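The event-driven branch of FIG. 8 can be sketched as follows, with hypothetical names and an assumed limit value: a "forced" event (steps S13 and S14) bends the head to the operating angle limit in a preset direction, while other events (steps S15 and S16) orient the head toward target coordinates, clamped to the same limit.

```python
import math

OPERATING_LIMIT_DEG = 60.0  # hypothetical operating angle limit

def head_angle_for_event(forced, preset_direction_deg=0.0,
                         character_pos=None, target_pos=None,
                         limit_deg=OPERATING_LIMIT_DEG):
    """Return the head angle resulting from a prescribed event."""
    if forced:
        # Steps S13/S14: bend fully to the limit in the preset direction.
        return math.copysign(limit_deg, preset_direction_deg)
    # Steps S15/S16: face the target coordinates, within the limit.
    dx = target_pos[0] - character_pos[0]
    dy = target_pos[1] - character_pos[1]
    angle = math.degrees(math.atan2(dy, dx))
    return max(-limit_deg, min(limit_deg, angle))
```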

As described above, the video game machine 2, in response to operation of the right analog operating part 32 of the controller 20 or the occurrence of an event, controls the orientation of the head object 202 of the character 201, and also moves the fixation point of the camera 210 so as to cause the virtual field of view (camera field of view) to move in the game space, thereby achieving a game with a much higher level of reality compared with video games of the past.

Because the video game machine 2 links the movement of the character 201 with the orientation of the head object 202, it is possible to achieve a high level of entertainment.

Finally, it should be noted that the foregoing embodiment is merely an exemplary form of the present invention, to which the present invention is not restricted, and that the present invention can take various other forms, different from the embodiments described above, within the scope of its technical concept.

Claims

  1. A virtual space control method comprising the steps of: changing an orientation of a prescribed part of a virtual character in a virtual space; changing a screen image in response to the change in orientation of the prescribed part, wherein the screen image represents a virtual field of view of the virtual space defined by a viewpoint other than a viewpoint of the virtual character and including a whole image of the virtual character; moving the virtual character in the virtual space; and detecting an occurrence of a prescribed event, wherein the step of changing the orientation includes a step of changing the orientation of the prescribed part in response to the occurrence of the prescribed event, wherein the step of changing the screen image has a step of changing the screen image in response to the movement of the virtual character and to the change in orientation of the prescribed part, and wherein the prescribed event is selected from a plurality of events occurring in the virtual space.
  2. The virtual space control method according to claim 1, wherein the step of changing the orientation of the prescribed part includes the step of changing the orientation of a head of the virtual character as the change in orientation of the prescribed part, and the step of changing the screen image includes the step of changing the viewpoint defining the virtual field of view in response to the change in orientation of the head of the virtual character.
  3. The virtual space control method according to claim 1, further comprising the step of: receiving an operation command input from the virtual character, wherein the step of changing the orientation includes a step of changing the orientation of the prescribed part in response to an operation command input.
  4. The virtual space control method according to claim 1, further comprising the step of: generating a prescribed object in the virtual space only when the movement of the virtual character occurs, and the orientation of the prescribed part is changed in a prescribed manner.
  5. The virtual space control method according to claim 1, further comprising the step of: setting target coordinates in the virtual space, wherein the step of changing the orientation includes a step of changing the orientation of the prescribed part of the virtual character toward the target coordinates.
  6. The virtual space control method according to claim 1, further comprising the step of: setting a limit to an orientation changeable range of the prescribed part of the virtual character.
  7. The virtual space control method according to claim 1, further comprising the step of: causing a change in orientation of another part of the virtual character influenced by the change in orientation of the prescribed part, the change in orientation of said another part being made in a pre-established prescribed proportion to the change in orientation of the prescribed part.
  8. The virtual space control method according to claim 1, wherein the virtual character comprises the prescribed part, a first part, and a second part, wherein the prescribed part is connected to the first part, wherein the first part is connected to the second part, wherein, when the prescribed part moves at a first angle, the first part moves at a second angle, and the second part moves at a third angle, and wherein the first angle is not less than the sum of the second angle and the third angle.
  9. The virtual space control method according to claim 8, wherein a ratio of the second angle to the first angle is established.
  10. The virtual space control method according to claim 8, wherein a ratio of the third angle to the first angle is established.
  11. A computer-readable recording medium having recorded therein a virtual space control program to be executed on a computer, the virtual space control program being configured to execute the steps of: changing an orientation of a prescribed part of a virtual character in a virtual space; moving the virtual character in the virtual space; changing a screen image in response to the change in orientation of the prescribed part and the movement of the virtual character in the virtual space, wherein the screen image represents a virtual field of view defined by a viewpoint other than a viewpoint of the virtual character and includes a whole image of the virtual character; and detecting occurrence of a prescribed event, wherein the step of changing the orientation of the prescribed part includes a step of changing the orientation of the prescribed part in response to the occurrence of the prescribed event, and wherein the prescribed event is selected from a plurality of events occurring in the virtual space.
  12. The computer-readable recording medium having recorded therein the virtual space control program to be executed on a computer according to claim 11, wherein the step of changing the orientation of the prescribed part includes the step of changing the orientation of a head of the virtual character as the change in orientation of the prescribed part, and the step of changing the screen image includes the step of changing the viewpoint defining the virtual field of view in response to the change in orientation of the head of the virtual character.
  13. The computer-readable recording medium having recorded therein the virtual space control program to be executed on a computer according to claim 11, the virtual space control program being further configured to execute the step of: receiving an operation command input from the virtual character, wherein the step of changing the orientation of the prescribed part includes a step of changing the orientation of the prescribed part in response to the operation command input.
  14. The computer-readable recording medium having recorded therein the virtual space control program to be executed on a computer according to claim 11, the virtual space control program being further configured to execute the step of: generating a prescribed object in the virtual space only when the movement of the virtual character occurs, and the orientation of the prescribed part is changed in a prescribed manner.
  15. The computer-readable recording medium having recorded therein the virtual space control program to be executed on a computer according to claim 11, the virtual space control program being further configured to execute the step of: setting target coordinates in the virtual space, wherein the step of changing the orientation includes a step of changing the orientation of the prescribed part of the virtual character toward the target coordinates.
  16. The computer-readable recording medium having recorded therein the virtual space control program to be executed on a computer according to claim 11, the virtual space control program being further configured to execute the step of: setting a limit to an orientation changeable range of the prescribed part of the virtual character.
  17. The computer-readable recording medium having recorded therein the virtual space control program to be executed on a computer according to claim 11, the virtual space control program being further configured to execute the step of: causing a change in orientation of another part of the virtual character influenced by the change in orientation of the prescribed part, the change in orientation of said another part being made in a pre-established prescribed proportion to the change in orientation of the prescribed part.
  18. A program execution apparatus that executes a virtual space control program, the virtual space control program being configured to perform the steps of: changing an orientation of a prescribed part of a virtual character in a virtual space; changing a screen image in response to the change in orientation of the prescribed part and the movement of the virtual character in the virtual space, wherein the screen image represents a virtual field of view defined by a viewpoint other than a viewpoint of the virtual character and includes a whole image of the virtual character; and detecting an occurrence of a prescribed event, wherein the step of changing the orientation includes a step of changing the orientation of the prescribed part in response to the occurrence of the prescribed event, and wherein the prescribed event is selected from a plurality of events occurring in the virtual space.
  19. The program execution apparatus according to claim 18, wherein the step of changing the orientation of the prescribed part includes the step of changing the orientation of a head of the virtual character as the change in orientation of the prescribed part, and the step of changing the screen image includes the step of changing the viewpoint defining the virtual field of view in response to the change in orientation of the head of the virtual character.
  20. The program execution apparatus according to claim 18, the virtual space control program being further configured to perform the step of: receiving an operation command input from the virtual character, wherein the step of changing the orientation includes a step of changing the orientation of the prescribed part in response to an operation command input.
  21. The program execution apparatus according to claim 18, the virtual space control program being further configured to perform the step of: generating a prescribed object in the virtual space only when the movement of the virtual character occurs, and the orientation of the prescribed part is changed in a prescribed manner.
  22. The program execution apparatus according to claim 18, the virtual space control program being further configured to perform the step of: setting target coordinates in the virtual space, wherein the step of changing the orientation includes a step of changing the orientation of the prescribed part of the virtual character toward the target coordinates.
  23. The program execution apparatus according to claim 18, the virtual space control program being further configured to perform the step of: setting a limit to an orientation changeable range of the prescribed part of the virtual character.
  24. The program execution apparatus according to claim 18, the virtual space control program being further configured to perform the step of: causing a change in orientation of another part of the virtual character influenced by the change in orientation of the prescribed part, the change in orientation of said another part being made in a pre-established prescribed proportion to the change in orientation of the prescribed part.
  25. A computer that executes a virtual space control program, the virtual space control program being configured to perform the steps of: changing an orientation of a prescribed part of a virtual character in a virtual space; moving the virtual character in the virtual space; changing a screen image in response to the change in orientation of the prescribed part and the movement of the virtual character in the virtual space, wherein the screen image represents a virtual field of view defined by a viewpoint other than a viewpoint of the virtual character and includes a whole image of the virtual character; and detecting an occurrence of a prescribed event, wherein the step of changing the orientation includes a step of changing the orientation of the prescribed part in response to the occurrence of the prescribed event, and wherein the prescribed event is selected from a plurality of events occurring in the virtual space.
