U.S. Pat. No. 8,753,207

GAME SYSTEM, GAME PROCESSING METHOD, RECORDING MEDIUM STORING GAME PROGRAM, AND GAME DEVICE

Assignee: Nintendo Co., Ltd.

Issue Date: January 23, 2012

Abstract

In an example game device, a second character is caused to rotate based on an input operation performed on a stick of a terminal device and on a change in an attitude of the terminal device. When an enemy character is present behind the second character, a rotation angle based on the input operation performed on the stick is adjusted so that the second character is easily caused to face the enemy character. Likewise, when the orientation of the second character is changed based on the change in the attitude of the terminal device, a rotation angle based on that change in attitude is adjusted so that the second character is easily caused to face the enemy character. The second character is then caused to rotate based on the adjusted rotation angle.

Description

DETAILED DESCRIPTION OF NON-LIMITING EXAMPLE EMBODIMENTS

[1. General Configuration of Game System]

A game system 1 will now be described with reference to the drawings. FIG. 1 is a non-limiting example external view of the game system 1. In FIG. 1, the game system 1 includes a non-portable display device (hereinafter referred to as a “television”) 2 such as a television receiver, a console-type game device 3, an optical disc 4, a controller 5, a marker device 6, and a terminal device 7. In the game system 1, the game device 3 performs a game process based on a game operation performed using the controller 5, and displays a game image obtained through the game process on the television 2 and/or the terminal device 7.

The optical disc 4, which typifies an interchangeable information storage medium used for the game device 3, is removably inserted into the game device 3. An information processing program (a game program, for example) to be executed by the game device 3 is stored on the optical disc 4. The game device 3 has, on a front surface thereof, an insertion opening for the optical disc 4. The game device 3 reads and executes the information processing program stored on the optical disc 4 which has been inserted through the insertion opening, to perform a game process.

The television 2 is connected to the game device 3 by a connecting cord. A game image obtained as a result of a game process performed by the game device 3 is displayed on the television 2. The television 2 includes a speaker 2a (see FIG. 2) which outputs a game sound obtained as a result of the game process. In alternative embodiments, the game device 3 and the non-portable display device may be an integral unit. Also, the communication between the game device 3 and the television 2 may be wireless communication.

The marker device 6 is provided along the periphery of the screen (on the upper side of the screen in FIG. 1) of the television 2. The user (player) can perform a game operation by moving the controller 5, details of which will be described later. The marker device 6 is used by the game device 3 for calculating a movement, a position, an attitude, etc., of the controller 5. The marker device 6 includes two markers 6R and 6L at opposite ends thereof. Specifically, the marker 6R (as well as the marker 6L) includes one or more infrared light emitting diodes (LEDs), and emits infrared light in a forward direction of the television 2. The marker device 6 is connected to the game device 3 via either a wired or wireless connection, and the game device 3 is able to control the lighting of each infrared LED of the marker device 6. Note that the marker device 6 is movable, and the user can place the marker device 6 at any position. While FIG. 1 shows an embodiment in which the marker device 6 is placed on top of the television 2, the position and direction of the marker device 6 are not limited to this particular arrangement.

The controller 5 provides the game device 3 with operation data representing the content of an operation performed on the controller itself. The controller 5 and the game device 3 can communicate with each other via wireless communication. In the present embodiment, the controller 5 and the game device 3 use, for example, Bluetooth (Registered Trademark) technology for the wireless communication therebetween. In other embodiments, the controller 5 and the game device 3 may be connected via a wired connection. While only one controller is included in the game system 1 in the present embodiment, a plurality of controllers may be included in the game system 1. In other words, the game device 3 can communicate with a plurality of controllers. Multiple players can play a game by using a predetermined number of controllers at the same time. The detailed configuration of the controller 5 will be described below.

The terminal device 7 is sized to be grasped by the user's hand or hands. The user can hold and move the terminal device 7, or can place and use the terminal device 7 at an arbitrary position. The terminal device 7, whose detailed configuration will be described below, includes a liquid crystal display (LCD) 51 as a display, and input mechanisms (e.g., a touch panel 52, a gyroscopic sensor 74, etc., to be described later). The terminal device 7 and the game device 3 can communicate with each other via a wireless connection (or via a wired connection). The terminal device 7 receives from the game device 3 data of an image (e.g., a game image) generated by the game device 3, and displays the image on the LCD 51. While an LCD is used as the display device in the present embodiment, the terminal device 7 may include any other display device, such as a display device utilizing electroluminescence (EL), for example. The terminal device 7 transmits operation data representing the content of an operation performed on the terminal device itself to the game device 3.

[2. Internal Configuration of Game Device 3]

Next, an internal configuration of the game device 3 will be described with reference to FIG. 2. FIG. 2 is a non-limiting example block diagram showing the internal configuration of the game device 3. The game device 3 includes a CPU 10, a system LSI 11, an external main memory 12, a ROM/RTC 13, a disc drive 14, and an AV-IC 15.

The CPU 10 performs a game process by executing a game program stored on the optical disc 4, and functions as a game processor. The CPU 10 is connected to the system LSI 11. The external main memory 12, the ROM/RTC 13, the disc drive 14, and the AV-IC 15, as well as the CPU 10, are connected to the system LSI 11. The system LSI 11 performs the following processes: controlling data transmission between the components connected thereto; generating an image to be displayed; acquiring data from an external device(s); and the like. The internal configuration of the system LSI 11 will be described below. The external main memory 12, which is of a volatile type, stores a program such as a game program read from the optical disc 4, a game program read from a flash memory 17, or the like, and various data. The external main memory 12 is used as a work area and a buffer area for the CPU 10. The ROM/RTC 13 includes a ROM (a so-called boot ROM) containing a boot program for the game device 3, and a clock circuit (real time clock (RTC)) for counting time. The disc drive 14 reads program data, texture data, and the like from the optical disc 4, and writes the read data into an internal main memory 11e (to be described below) or the external main memory 12.

The system LSI 11 includes an input/output processor (I/O processor) 11a, a graphics processor unit (GPU) 11b, a digital signal processor (DSP) 11c, a video RAM (VRAM) 11d, and the internal main memory 11e. Although not shown in the figures, these components 11a to 11e are connected to each other through an internal bus.

The GPU 11b, which forms a part of a rendering mechanism, generates an image in accordance with a graphics command (rendering command) from the CPU 10. The VRAM 11d stores data (such as polygon data and texture data) required by the GPU 11b to execute graphics commands. When an image is generated, the GPU 11b generates image data using data stored in the VRAM 11d. In the present embodiment, the game device 3 generates both a game image to be displayed on the television 2 and a game image to be displayed on the terminal device 7. The game image to be displayed on the television 2 may also be hereinafter referred to as a “television game image,” and the game image to be displayed on the terminal device 7 may also be hereinafter referred to as a “terminal game image.”

The DSP 11c, which functions as an audio processor, generates audio data using sound data and sound waveform (e.g., tone quality) data stored in one or both of the internal main memory 11e and the external main memory 12. In the present embodiment, game audio is output both from the speaker of the television 2 and from the speaker of the terminal device 7.

As described above, of the images and audio generated in the game device 3, the data of the image and audio to be output from the television 2 is read out by the AV-IC 15. The AV-IC 15 outputs the read image data to the television 2 via an AV connector 16, and outputs the read audio data to the speaker 2a provided in the television 2. Thus, images are displayed on the television 2, and sound is output from the speaker 2a.

Of the images and audio generated in the game device 3, the data of the image and audio to be output from the terminal device 7 is transmitted to the terminal device 7 by the input/output processor 11a, etc. The data transmission to the terminal device 7 by the input/output processor 11a, or the like, will be described below.

The input/output processor 11a exchanges data with components connected thereto, and downloads data from an external device(s). The input/output processor 11a is connected to the flash memory 17, a network communication module 18, a controller communication module 19, an extension connector 20, a memory card connector 21, and a codec LSI 27. An antenna 22 is connected to the network communication module 18. An antenna 23 is connected to the controller communication module 19. The codec LSI 27 is connected to a terminal communication module 28, and an antenna 29 is connected to the terminal communication module 28.

The game device 3 can be connected to a network such as the Internet to communicate with external information processing devices (e.g., other game devices, various servers, computers, etc.). That is, the input/output processor 11a can be connected to a network such as the Internet via the network communication module 18 and the antenna 22 to communicate with an external information processing device(s) connected to the network. The input/output processor 11a regularly accesses the flash memory 17 to detect the presence or absence of any data which needs to be transmitted to the network, and when there is such data, transmits it to the network via the network communication module 18 and the antenna 22. The input/output processor 11a also receives data transmitted from an external information processing device and data downloaded from a download server via the network, the antenna 22, and the network communication module 18, and stores the received data in the flash memory 17. The CPU 10 executes a game program so as to read data stored in the flash memory 17 and use the data in the game program. The flash memory 17 may store saved game data (e.g., data representing game results or data representing intermediate game results) of a game played using the game device 3, in addition to data exchanged between the game device 3 and an external information processing device. The flash memory 17 may also store a game program(s).
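
The regular polling of the flash memory described above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation; the interfaces pop_outbound() and send_to_network(), and the polling interval, are hypothetical stand-ins for the flash memory 17 and the network communication module 18 path.

```python
import time

def network_poll_loop(flash_storage, send_to_network, interval_s=1.0):
    """Regularly check storage for queued outbound data and transmit it.

    flash_storage is assumed to expose pop_outbound(), returning None
    when nothing is queued; send_to_network() stands in for the
    module 18 / antenna 22 transmission path.
    """
    while True:
        data = flash_storage.pop_outbound()
        if data is not None:
            send_to_network(data)     # transmit queued data to the network
        time.sleep(interval_s)        # "regularly accesses" the flash memory
```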

The game device 3 can receive operation data from the controller 5. That is, the input/output processor 11a receives operation data transmitted from the controller 5 via the antenna 23 and the controller communication module 19, and stores (temporarily) the data in a buffer area of the internal main memory 11e or the external main memory 12.

The game device 3 can exchange data such as images and audio with the terminal device 7. When transmitting a game image (terminal game image) to the terminal device 7, the input/output processor 11a outputs data of the game image generated by the GPU 11b to the codec LSI 27. The codec LSI 27 performs a predetermined compression process on the image data from the input/output processor 11a. The terminal communication module 28 wirelessly communicates with the terminal device 7; the image data compressed by the codec LSI 27 is therefore transmitted by the terminal communication module 28 to the terminal device 7 via the antenna 29. In the present embodiment, the image data transmitted from the game device 3 to the terminal device 7 is image data used in a game, and the playability of the game can be adversely influenced if there is a delay in displaying an image. It is therefore preferred to eliminate delay as much as possible in the transmission of image data from the game device 3 to the terminal device 7. For this reason, in the present embodiment, the codec LSI 27 compresses image data using a highly efficient compression technique such as the H.264 standard, for example. Other compression techniques may be used, and image data may be transmitted uncompressed if the communication speed is sufficient. The terminal communication module 28 is, for example, a Wi-Fi certified communication module, and may perform high-speed wireless communication with the terminal device 7 using, for example, a multiple-input multiple-output (MIMO) technique employed in the IEEE 802.11n standard, or other communication schemes.
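
The compress-then-transmit step can be sketched as below. zlib is used only as a runnable stand-in for the H.264 codec named above (a video codec would be used in practice), and the length-prefixed socket framing is an assumption, not the patent's wireless protocol.

```python
import socket
import struct
import zlib

def send_terminal_image(sock: socket.socket, frame: bytes) -> None:
    """Compress one frame and send it with a 4-byte length prefix.

    The fast compression level stands in for the low-latency
    requirement the paragraph above describes for the codec LSI 27.
    """
    payload = zlib.compress(frame, level=1)  # fast setting to limit latency
    sock.sendall(struct.pack("!I", len(payload)) + payload)
```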

The game device 3 transmits audio data to the terminal device 7 in addition to image data. That is, the input/output processor 11a outputs audio data generated by the DSP 11c to the terminal communication module 28 via the codec LSI 27. The codec LSI 27 performs a compression process on the audio data, as it does on image data. While any compression scheme may be used for audio data, a scheme with a high compression ratio and little audio degradation is preferred. In other embodiments, the audio data may be transmitted uncompressed. The terminal communication module 28 transmits the compressed image data and audio data to the terminal device 7 via the antenna 29.

The game device 3 can receive various data from the terminal device 7. In the present embodiment, the terminal device 7 transmits operation data, image data, and audio data, details of which will be described below. These pieces of data transmitted from the terminal device 7 are received by the terminal communication module 28 via the antenna 29. The image data and the audio data transmitted from the terminal device 7 have been subjected to a compression process similar to that performed on the image data and audio data transmitted from the game device 3 to the terminal device 7. Therefore, the compressed image data and audio data are sent from the terminal communication module 28 to the codec LSI 27, which in turn decompresses the data and outputs it to the input/output processor 11a. On the other hand, the operation data from the terminal device 7 need not be subjected to a compression process, since the amount of that data is small compared with images and audio; it may or may not be encrypted, as necessary. After being received by the terminal communication module 28, the operation data is output to the input/output processor 11a via the codec LSI 27. The input/output processor 11a stores (temporarily) the data received from the terminal device 7 in a buffer area of the internal main memory 11e or the external main memory 12.

The game device 3 can be connected to another device or an external storage medium. That is, the input/output processor 11a is connected to the extension connector 20 and the memory card connector 21. The extension connector 20 is a connector for an interface such as USB or SCSI. The extension connector 20 can receive a medium such as an external storage medium, a peripheral device such as another controller, or a wired communication connector which enables communication with a network in place of the network communication module 18. The memory card connector 21 is a connector for connecting an external storage medium such as a memory card to the game device 3. For example, the input/output processor 11a can access an external storage medium via the extension connector 20 or the memory card connector 21 to store data in the external storage medium or read data from it.

The game device 3 includes a power button 24, a reset button 25, and an eject button 26. The power button 24 and the reset button 25 are connected to the system LSI 11. When the power button 24 is turned on, power is supplied to the components of the game device 3 from an external power supply through an AC adaptor (not shown). When the reset button 25 is pressed, the system LSI 11 restarts the boot program of the game device 3. The eject button 26 is connected to the disc drive 14. When the eject button 26 is pressed, the optical disc 4 is ejected from the disc drive 14.

In other embodiments, some of the components of the game device 3 may be provided as extension devices separate from the game device 3. In this case, an extension device may be connected to the game device 3 via the extension connector 20, for example. Specifically, an extension device may include components such as the codec LSI 27, the terminal communication module 28, and the antenna 29, for example, and can be attached to and detached from the extension connector 20. In this case, by connecting the extension device to a game device which does not include the above components, that game device can be made capable of communicating with the terminal device 7.

[3. Configuration of Controller 5]

Next, the controller 5 will be described with reference to FIGS. 3 and 4. FIG. 3 is a non-limiting example perspective view showing an external configuration of the controller 5. FIG. 4 is a non-limiting example block diagram showing an internal configuration of the controller 5. The perspective view of FIG. 3 shows the controller 5 as viewed from the top and the rear.

As shown in FIGS. 3 and 4, the controller 5 has a housing 31 formed by, for example, plastic molding. The housing 31 has a generally parallelepiped shape extending in a longitudinal (front-rear) direction (the Z1-axis direction shown in FIG. 3), and is sized to be grasped by one hand of an adult or a child. The user can perform game operations by pressing buttons provided on the controller 5, and by moving the controller 5 itself to change its position and attitude (tilt).

The housing 31 has a plurality of operation buttons. As shown in FIG. 3, on a top surface of the housing 31, a cross button 32a, a first button 32b, a second button 32c, an “A” button 32d, a minus button 32e, a home button 32f, a plus button 32g, and a power button 32h are provided. A recessed portion is formed on a bottom surface of the housing 31, and a “B” button 32i is provided on a rear, sloped surface of the recessed portion. The operation buttons 32a to 32i are assigned, as necessary, their respective functions in accordance with the game program executed by the game device 3. The power button 32h is used to remotely turn on and off the game device 3.

On a rear surface of the housing 31, a connector 33 is provided. The connector 33 is used to connect other devices (e.g., a sub-controller having an analog stick, other sensor units, etc.) to the controller 5.

In a rear portion of the top surface of the housing 31, a plurality (four in FIG. 3) of LEDs 34a to 34d are provided. The controller 5 is assigned a controller type (number) so as to be distinguishable from other controllers.

The controller 5 also has an image capturing/processing section 35 (FIG. 4), and a light incident surface 35a of the image capturing/processing section 35 is provided on a front surface of the housing 31. The light incident surface 35a is made of a material which transmits at least the infrared light emitted from the markers 6R and 6L.

On the top surface of the housing 31, sound holes 31a, through which sound from a speaker provided in the controller 5 is emitted, are provided between the first button 32b and the home button 32f.

Note that the shape of the controller 5, the shapes of the operation buttons, etc., are only for illustrative purposes. Other shapes, numbers, and positions are possible.

FIG. 4 is a non-limiting example block diagram showing an internal configuration of the controller 5. The controller 5 includes an operation section 32 (the operation buttons 32a to 32i), the image capturing/processing section 35, a communication section 36, an acceleration sensor 37, and a gyroscopic sensor 48. The controller 5 transmits data representing the content of an operation performed on the controller itself, as operation data, to the game device 3. The operation data transmitted by the controller 5 may also be hereinafter referred to as “controller operation data,” and the operation data transmitted by the terminal device 7 may also be hereinafter referred to as “terminal operation data.”

The operation section 32 includes the operation buttons 32a to 32i described above, and outputs, to a microcomputer 42 of the communication section 36, operation button data indicating the input states of the operation buttons 32a to 32i (e.g., whether or not each button is pressed).

The image capturing/processing section 35 includes an infrared filter 38, a lens 39, an image capturing element 40, and an image processing circuit 41. The infrared filter 38 transmits only the infrared light contained in the light incident on the front surface of the controller 5. The lens 39 collects the infrared light transmitted through the infrared filter 38 so that the light is incident on the image capturing element 40. The image capturing element 40 is a solid-state image capturing device, such as, for example, a CMOS sensor or a CCD sensor, which receives the infrared light collected by the lens 39 and outputs an image signal. The marker section 55 of the terminal device 7 and the marker device 6, of which images are to be captured, are formed by markers which output infrared light. Therefore, the infrared filter 38 enables the image capturing element 40 to receive only the infrared light transmitted through the filter and generate image data, whereby an image of an object to be imaged (the marker section 55 and/or the marker device 6) can be captured more accurately. The image data generated by the image capturing element 40 is processed by the image processing circuit 41. The image processing circuit 41 calculates a position of the object to be imaged within the captured image, and outputs coordinates of the calculated position to the microcomputer 42 of the communication section 36. The data representing the coordinates is transmitted as operation data to the game device 3 by the microcomputer 42. These coordinates are hereinafter referred to as “marker coordinates.” The marker coordinates change depending on the orientation (tilt angle) and/or the position of the controller 5 itself, and therefore, the game device 3 can calculate the orientation and position of the controller 5 using the marker coordinates.
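
One simple way to compute a marker position of the kind attributed to the image processing circuit 41 is a brightness centroid over the infrared image; the patent does not disclose the circuit's actual method, and the grayscale list-of-rows image representation and threshold value below are assumptions.

```python
def marker_coordinates(image, threshold=200):
    """Return the centroid of bright (infrared) pixels in a grayscale image.

    Because the IR filter leaves the markers as the brightest pixels,
    their position in the captured image can be estimated as a centroid.
    `image` is a list of rows of 0-255 intensity values.
    """
    xs, ys, n = 0.0, 0.0, 0
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if value >= threshold:
                xs += x
                ys += y
                n += 1
    return (xs / n, ys / n) if n else None  # None when no marker is visible
```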

The acceleration sensor 37 detects accelerations (including the gravitational acceleration) of the controller 5. While the acceleration sensor 37 is assumed to be an electrostatic capacitance type micro-electromechanical system (MEMS) acceleration sensor, other types of acceleration sensors may be used.

In the present embodiment, the acceleration sensor 37 detects a linear acceleration in each of three axial directions, i.e., the up-down direction (the Y1-axis direction shown in FIG. 3), the left-right direction (the X1-axis direction shown in FIG. 3), and the front-rear direction (the Z1-axis direction shown in FIG. 3) of the controller 5.

Data (acceleration data) representing the accelerations detected by the acceleration sensor 37 is output to the communication section 36. The acceleration detected by the acceleration sensor 37 changes depending on the orientation (tilt angle) and the movement of the controller 5 itself, and therefore, the game device 3 is capable of calculating the orientation (attitude) and the movement of the controller 5 using the obtained acceleration data.

One skilled in the art will readily understand from the description herein that additional information relating to the controller 5 can be estimated or calculated (determined) through a process by a computer, such as a processor (for example, the CPU 10) of the game device 3 or a processor (for example, the microcomputer 42) of the controller 5, based on the acceleration signal output from the acceleration sensor 37 (this applies also to an acceleration sensor 73 to be described later). For example, suppose the computer performs a process on the premise that the controller 5 including the acceleration sensor 37 is in a static state (that is, on the premise that the acceleration detected by the acceleration sensor contains only the gravitational acceleration). When the controller 5 is actually in the static state, it is then possible to determine whether or not, and by how much, the controller 5 is tilted relative to the direction of gravity, based on the detected acceleration. Specifically, when the state in which the detection axis of the acceleration sensor 37 faces vertically downward is used as a reference, whether or not the controller 5 is tilted relative to the reference can be determined based on whether or not 1 G (the gravitational acceleration) is present along that axis, and the degree of tilt can be determined from the magnitude of the detected acceleration. In addition, the multi-axis acceleration sensor 37 can more precisely determine the degree of tilt of the controller 5 relative to the direction of gravity through a process performed on the acceleration signals of the respective axes. In this case, the processor may calculate, based on the output from the acceleration sensor 37, the tilt angle of the controller 5, or the tilt direction of the controller 5 without calculating the tilt angle. Thus, by using the acceleration sensor 37 in combination with the processor, it is possible to determine the tilt angle or the attitude of the controller 5.
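
A minimal sketch of this static-state tilt calculation follows, under the paragraph's own premise that the reading contains only gravity; the choice of the Z1 axis as the downward-facing reference axis and the G-unit scaling are assumptions for illustration.

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Estimate how far the controller is tilted from the reference attitude.

    Assumes a static reading (ax, ay, az) in G units, so the vector is
    gravity alone; the angle between it and the Z1 axis (taken here as
    the vertically-downward reference) gives the degree of tilt.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)  # should be near 1 G at rest
    if g == 0.0:
        raise ValueError("no acceleration measured")
    cos_tilt = max(-1.0, min(1.0, az / g))      # clamp for floating-point safety
    return math.degrees(math.acos(cos_tilt))
```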

On the other hand, when it is assumed that the controller 5 is in a dynamic state (in which the controller 5 is being moved), the acceleration sensor 37 detects the acceleration based on the movement of the controller 5 in addition to the gravitational acceleration, and it is therefore possible to determine the movement direction of the controller 5 by removing the gravitational acceleration component from the detected acceleration through a predetermined process. Even when it is assumed that the controller 5 is in the dynamic state, it is possible to determine the tilt of the controller 5 relative to the direction of gravity by removing the acceleration component based on the movement of the acceleration sensor from the detected acceleration through a predetermined process. In other embodiments, the acceleration sensor 37 may include an embedded processor or another type of dedicated processor for performing a predetermined process on an acceleration signal detected by a built-in acceleration detector before the acceleration signal is output to the microcomputer 42. For example, when the acceleration sensor 37 is used to detect a static acceleration (for example, the gravitational acceleration), the embedded or dedicated processor may convert the acceleration signal to a tilt angle (or another preferred parameter).
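
One plausible form of the "predetermined process" mentioned above for separating gravity from motion is an exponential low-pass filter, sketched below; the patent does not specify the method, and the filter constant and initial estimate are illustrative.

```python
class GravityFilter:
    """Split raw acceleration samples into gravity and motion components.

    A standard exponential low-pass filter: the slowly varying part of
    the signal is taken as gravity, and the remainder as motion.
    `alpha` controls how quickly the gravity estimate tracks slow
    attitude changes.
    """

    def __init__(self, alpha=0.05):
        self.alpha = alpha
        self.gravity = (0.0, 0.0, 1.0)  # initial guess: 1 G on the Z axis

    def update(self, accel):
        a = self.alpha
        self.gravity = tuple(g + a * (v - g) for g, v in zip(self.gravity, accel))
        motion = tuple(v - g for v, g in zip(accel, self.gravity))
        return self.gravity, motion
```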

The gyroscopic sensor 48 detects angular velocities about three axes (the X1-, Y1-, and Z1-axes in the present embodiment). In the present specification, with respect to the image capturing direction (the Z1-axis positive direction) of the controller 5, a rotation direction about the X1-axis is referred to as a pitch direction, a rotation direction about the Y1-axis as a yaw direction, and a rotation direction about the Z1-axis as a roll direction. The number and combination of gyroscopic sensors to be used are not limited to any particular number and combination as long as the gyroscopic sensor 48 can detect angular velocities about three axes. For example, the gyroscopic sensor 48 may be a 3-axis gyroscopic sensor, or angular velocities about three axes may be detected by a combination of a 2-axis gyroscopic sensor and a 1-axis gyroscopic sensor. Data representing the angular velocities detected by the gyroscopic sensor 48 is output to the communication section 36. Alternatively, the gyroscopic sensor 48 may be a gyroscopic sensor that detects an angular velocity or velocities about one axis or two axes.
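
The pitch/yaw/roll convention above maps directly onto integrating the gyro's angular velocities over time. Below is a first-order Euler sketch of that integration; a production attitude tracker would typically use quaternions to avoid gimbal problems, and the sample format is an assumption.

```python
def integrate_gyro(attitude, angular_velocity, dt):
    """Advance a (pitch, yaw, roll) estimate by one gyro sample.

    angular_velocity is (wx, wy, wz) in radians per second about the
    X1, Y1, and Z1 axes, matching the pitch, yaw, and roll directions
    defined in the text; dt is the sample interval in seconds.
    """
    pitch, yaw, roll = attitude
    wx, wy, wz = angular_velocity
    return (pitch + wx * dt, yaw + wy * dt, roll + wz * dt)
```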

The communication section 36 includes the microcomputer 42, a memory 43, a wireless module 44, and an antenna 45. The microcomputer 42 controls the wireless module 44 for wirelessly transmitting, to the game device 3, the data acquired by the microcomputer 42, while using the memory 43 as a storage area during the process.

Pieces of data output from the operation section 32, the image capturing/processing section 35, the acceleration sensor 37, and the gyroscopic sensor 48 to the microcomputer 42 are temporarily stored in the memory 43. These pieces of data are then transmitted as the operation data (controller operation data) to the game device 3.

As described above, as operation data representing an operation performed on the controller itself, the controller 5 can transmit marker coordinate data, acceleration data, angular velocity data, and operation button data. The game device 3 performs a game process using the operation data as a game input. Therefore, by using the controller 5, the user can perform a game operation of moving the controller 5 itself, in addition to the conventional, typical game operation of pressing the operation buttons. Examples of the game operation of moving the controller 5 itself include an operation of tilting the controller 5 to an intended attitude, an operation of specifying an intended position on the screen with the controller 5, etc.

While the controller 5 does not include a display for displaying a game image in the present embodiment, it may include a display for displaying, for example, an image representing a battery level, etc.

[4. Configuration of Terminal Device 7]

Next, a configuration of the terminal device 7 will be described with reference to FIGS. 5 to 7. FIG. 5 is a non-limiting example plan view showing an external configuration of the terminal device 7. In FIG. 5, (a) is a front view of the terminal device 7, (b) is a top view thereof, (c) is a right side view thereof, and (d) is a bottom view thereof. FIG. 6 is a non-limiting example diagram showing a user holding the terminal device 7 in a landscape position.

As shown in FIG. 5, the terminal device 7 includes a housing 50 generally in a horizontally-elongated rectangular plate shape. That is, it can also be said that the terminal device 7 is a tablet-type information processing device. The housing 50 is sized to be grasped by the user.

The terminal device 7 includes an LCD 51 on a front surface (front side) of the housing 50. The LCD 51 is provided near the center of the front surface of the housing 50. Therefore, the user can hold and move the terminal device 7 while viewing the screen of the LCD 51, by holding portions of the housing 50 on opposite sides of the LCD 51, as shown in FIG. 6. While FIG. 6 shows an example in which the user holds the terminal device 7 in a landscape position (i.e., with the device wider than it is tall) by holding portions of the housing 50 on the left and right sides of the LCD 51, the user can also hold the terminal device 7 in a portrait position (i.e., with the device taller than it is wide).

As shown in (a) of FIG. 5, the terminal device 7 includes a touch panel 52 on the screen of the LCD 51 as an operation mechanism. The touch panel 52 may be of a single-touch type or a multi-touch type. While a touch pen 60 is usually used for performing an input operation on the touch panel 52, the present exemplary embodiment is not limited to using the touch pen 60, and an input operation may be performed on the touch panel 52 with a finger of the user. The housing 50 is provided with a hole 60a for accommodating the touch pen 60 used for performing an input operation on the touch panel 52 (see (b) of FIG. 5).

As shown in FIG. 5, the terminal device 7 includes two analog sticks 53A and 53B and a plurality of buttons (keys) 54A to 54M, as operation mechanisms (operation sections). The analog sticks 53A and 53B are each a direction-selection device. The analog sticks 53A and 53B are each configured so that a movable member (stick portion) operated with a finger of the user can be slid in any direction (at any angle in the up, down, left, right, and diagonal directions) with respect to the front surface of the housing 50. That is, the analog sticks 53A and 53B are each a direction input device which is also called a slide pad. Alternatively, the movable member of each of the analog sticks 53A and 53B may be of a type that is tilted in any direction with respect to the front surface of the housing 50. Since the present embodiment uses analog sticks of a type whose movable member is slidable, the user can operate the analog sticks 53A and 53B without significantly moving the thumbs, and can therefore hold the housing 50 more firmly during operation.

The left analog stick 53A is provided on the left side of the screen of the LCD 51, and the right analog stick 53B is provided on the right side of the screen of the LCD 51. As shown in FIG. 6, the analog sticks 53A and 53B are provided at positions that allow the user to operate them while holding the left and right portions of the terminal device 7 (on the left and right sides of the LCD 51); therefore, the user can easily operate the analog sticks 53A and 53B even while holding and moving the terminal device 7.

The buttons 54A to 54L are operation mechanisms (operation sections) for making predetermined inputs, and are keys that can be pressed down. As will be discussed below, the buttons 54A to 54L are provided at positions that allow the user to operate them while holding the left and right portions of the terminal device 7 (see FIG. 6).

As shown in (a) of FIG. 5, the cross button (direction-input button) 54A and the buttons 54B to 54H and 54M, of the operation buttons 54A to 54M, are provided on the front surface of the housing 50.

The cross button 54A is provided on the left side of the LCD 51 and under the left analog stick 53A. The cross button 54A has a cross shape, and can be used to select at least the up, down, left, and right directions.

The buttons 54B to 54D are provided on the lower side of the LCD 51. The terminal device 7 includes the power button 54M for turning on and off the terminal device 7. The game device 3 can also be remotely turned on and off by operating the power button 54M. The four buttons 54E to 54H are provided on the right side of the LCD 51 and under the right analog stick 53B. Moreover, the four buttons 54E to 54H are arranged on the upper, lower, left, and right sides of the center position between them. Therefore, with the terminal device 7, the four buttons 54E to 54H can also serve as buttons with which the user selects the up, down, left, and right directions.

In the present embodiment, a projecting portion (an eaves portion 59) is provided on the back side of the housing 50 (the side opposite to the front surface where the LCD 51 is provided) (see (c) of FIG. 5). As shown in (c) of FIG. 5, the eaves portion 59 is a mountain-shaped member which projects from the back surface of the generally plate-shaped housing 50. The projecting portion has a height (thickness) that allows the fingers of the user holding the back surface of the housing 50 to rest thereon.

As shown in (a), (b), and (c) of FIG. 5, a first L button 54I and a first R button 54J are provided on the left and right sides, respectively, of the upper surface of the housing 50. In the present embodiment, the first L button 54I and the first R button 54J are provided on diagonally upper portions (a left upper portion and a right upper portion) of the housing 50.

As shown in (c) of FIG. 5, a second L button 54K and a second R button 54L are provided on the projecting portion (the eaves portion 59). The second L button 54K is provided in the vicinity of the left end of the eaves portion 59. The second R button 54L is provided in the vicinity of the right end of the eaves portion 59.

The buttons 54A to 54L are each assigned a function in accordance with the game program. For example, the cross button 54A and the buttons 54E to 54H may be used for a direction-selection operation, a selection operation, etc., and the buttons 54B to 54E may be used for a decision operation, a cancel operation, etc. The terminal device 7 may also include a button for turning on and off the LCD 51, and a button for performing a connection setting (pairing) with the game device 3.

As shown in (a) of FIG. 5, the terminal device 7 includes a marker section 55, including a marker 55A and a marker 55B, on the front surface of the housing 50. The marker section 55 is provided on the upper side of the LCD 51. The markers 55A and 55B are each formed by one or more infrared LEDs, as are the markers 6R and 6L of the marker device 6. The infrared LEDs of the markers 55A and 55B are provided behind, or further inside than, a window portion that is transmissive to infrared light. The marker section 55 is used by the game device 3 to calculate the movement, etc., of the controller 5, as is the marker device 6 described above. The game device 3 can control the lighting of the infrared LEDs of the marker section 55.

The terminal device 7 includes a camera 56 as an image capturing mechanism. The camera 56 includes an image capturing element (e.g., a CCD image sensor, a CMOS image sensor, or the like) having a predetermined resolution, and a lens.

The terminal device 7 includes a microphone 79 as an audio input mechanism. A microphone hole 50c is provided in the front surface of the housing 50. The microphone 79 is provided inside the housing 50 behind the microphone hole 50c, and detects sound around the terminal device 7, such as the voice of the user.

The terminal device 7 includes a speaker 77 as an audio output mechanism. As shown in (a) of FIG. 5, speaker holes 57 are provided in a lower portion of the front surface of the housing 50. Sound from the speaker 77 is output through the speaker holes 57. In the present embodiment, the terminal device 7 includes two speakers, and the speaker holes 57 are provided at the respective positions of the left and right speakers. The terminal device 7 also includes a knob 64 for adjusting the sound volume of the speaker 77, and an audio output terminal 62 for connecting an audio output section such as an earphone thereto.

The housing 50 includes a window 63 through which an infrared signal from an infrared communication module 82 is emitted from the terminal device 7.

The terminal device 7 includes an extension connector 58 for connecting another device (additional device) to the terminal device 7. The extension connector 58 is a communication terminal for exchanging data (information) with another device connected to the terminal device 7.

In addition to the extension connector 58, the terminal device 7 includes a charging terminal 66 for obtaining power from an additional device. In the present embodiment, the charging terminal 66 is provided on a lower side surface of the housing 50. Therefore, when the terminal device 7 and an additional device are connected to each other, it is possible to supply power from one to the other, in addition to exchanging information therebetween, via the extension connector 58. The terminal device 7 includes a charging connector, and the housing 50 includes a cover portion 61 for protecting the charging connector. Although the charging connector (the cover portion 61) is provided on an upper side surface of the housing 50 in the present embodiment, it may be provided on a left, right, or lower side surface of the housing 50.

The housing 50 of the terminal device 7 includes holes 65a and 65b through which a strap cord can be tied to the terminal device 7.

With the terminal device 7 shown in FIG. 5, the shape of each operation button, the shape of the housing 50, the number and positions of the components, etc., are merely illustrative, and the present exemplary embodiment can be implemented with other shapes, numbers, and positions.

Next, an internal configuration of the terminal device 7 will be described with reference to FIG. 7. FIG. 7 is a non-limiting example block diagram showing the internal configuration of the terminal device 7. As shown in FIG. 7, the terminal device 7 includes, in addition to the components shown in FIG. 5, a touch panel controller 71, a magnetic sensor 72, an acceleration sensor 73, a gyroscopic sensor 74, a user interface controller (UI controller) 75, a codec LSI 76, the speaker 77, a sound IC 78, the microphone 79, a wireless module 80, an antenna 81, the infrared communication module 82, a flash memory 83, a power supply IC 84, a battery 85, and a vibrator 89. These electronic components are mounted on an electronic circuit board and accommodated in the housing 50.

The UI controller 75 is a circuit for controlling the input/output of data to/from various input/output sections. The UI controller 75 is connected to the touch panel controller 71, an analog stick 53 (the analog sticks 53A and 53B), an operation button 54 (the operation buttons 54A to 54L), the marker section 55, the magnetic sensor 72, the acceleration sensor 73, the gyroscopic sensor 74, and the vibrator 89. The UI controller 75 is also connected to the codec LSI 76 and the extension connector 58. The power supply IC 84 is connected to the UI controller 75, and power is supplied to each section via the UI controller 75. The built-in battery 85 is connected to the power supply IC 84 to supply power. A charger 86 or a cable with which power can be obtained from an external power source can be connected to the power supply IC 84 via a charging connector, and the terminal device 7 can receive power from, or be charged by, an external power source using the charger 86 or the cable. The terminal device 7 may also be charged by attaching the terminal device 7 to a cradle (not shown) having a charging function.

The touch panel controller 71 is a circuit which is connected to the touch panel 52 and controls the touch panel 52. The touch panel controller 71 generates touch position data in a predetermined format based on a signal from the touch panel 52, and outputs the data to the UI controller 75. The touch position data represents, for example, the coordinates of a position on the input surface of the touch panel 52 at which an input operation is performed.

The analog stick 53 outputs, to the UI controller 75, stick data representing the direction and the amount in which the stick portion operated with a finger of the user has been slid (or tilted). The operation button 54 outputs, to the UI controller 75, operation button data representing the input state of each of the operation buttons 54A to 54L (e.g., whether the button is pressed).

The magnetic sensor 72 detects an azimuth by sensing the magnitude and direction of the magnetic field. Azimuth data representing the detected azimuth is output to the UI controller 75. The UI controller 75 outputs a control instruction for the magnetic sensor 72 to the magnetic sensor 72. While there are sensors using a magnetic impedance (MI) element, a fluxgate sensor, a Hall element, a giant magneto-resistive (GMR) element, a tunnel magneto-resistance (TMR) element, an anisotropic magneto-resistive (AMR) element, etc., the magnetic sensor 72 may be any sensor as long as it can detect the azimuth. Strictly speaking, in a place where there is a magnetic field other than the geomagnetic field, the obtained azimuth data does not represent the true azimuth. Nevertheless, if the terminal device 7 moves, the azimuth data changes, and it is therefore still possible to calculate a change in the attitude of the terminal device 7.
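
The basic azimuth calculation from a magnetic field reading can be sketched as below. This assumes the terminal is held level; a real implementation would also tilt-compensate using the accelerometer, which is omitted here for brevity.

```python
import math

def azimuth_degrees(mx, my):
    """Compute a heading (0-360 degrees) from the horizontal field components.

    With the device level, the angle of the magnetic field vector in the
    horizontal plane gives the azimuth; mx and my are the field readings
    along the device's two horizontal axes.
    """
    return math.degrees(math.atan2(my, mx)) % 360.0
```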

The acceleration sensor 73 is provided inside the housing 50 and detects the magnitude of the linear acceleration along each of three axial directions (the X-, Y-, and Z-axes shown in (a) of FIG. 5). Specifically, the acceleration sensor 73 detects the magnitude of the linear acceleration along each axis, where the X-axis lies in the longitudinal direction of the housing 50, the Y-axis lies in the width direction of the housing 50, and the Z-axis lies in the direction perpendicular to the front surface of the housing 50. Acceleration data representing the detected accelerations is output to the UI controller 75. The UI controller 75 outputs a control instruction for the acceleration sensor 73 to the acceleration sensor 73. While the acceleration sensor 73 is assumed to be a capacitive MEMS acceleration sensor in the present embodiment, other types of acceleration sensors may be employed in other embodiments. The acceleration sensor 73 may also be an acceleration sensor which detects an acceleration or accelerations along one or two axial directions.

The gyroscopic sensor 74 is provided inside the housing 50 and detects angular velocities about the three axes, i.e., the X-, Y-, and Z-axes. Angular velocity data representing the detected angular velocities is output to the UI controller 75. The UI controller 75 outputs a control instruction for the gyroscopic sensor 74 to the gyroscopic sensor 74. The number and combination of gyroscopic sensors used for detecting angular velocities about the three axes may be any number and combination, and the gyroscopic sensor 74 may be formed by a 2-axis gyroscopic sensor and a 1-axis gyroscopic sensor, as is the gyroscopic sensor 48. Alternatively, the gyroscopic sensor 74 may be a gyroscopic sensor which detects an angular velocity or velocities about one or two axes.

The UI controller 75 outputs, to the codec LSI 76, operation data including the touch position data, stick data, operation button data, azimuth data, acceleration data, and angular velocity data received from the components described above. If another device is connected to the terminal device 7 via the extension connector 58, data representing an operation performed on that device may be further included in the operation data.

The codec LSI 76 is a circuit for performing a compression process on data to be transmitted to the game device 3, and a decompression process on data transmitted from the game device 3. The LCD 51, the camera 56, the sound IC 78, the wireless module 80, the flash memory 83, and the infrared communication module 82 are connected to the codec LSI 76. The codec LSI 76 includes a CPU 87 and an internal memory 88. While the terminal device 7 does not perform game processes itself, it executes programs for its own management and communication. When the terminal device 7 is turned on, a program stored in the flash memory 83 is read out to the internal memory 88 and executed by the CPU 87, whereby the terminal device 7 starts up. Some area of the internal memory 88 is used as a VRAM for the LCD 51.

The camera 56 captures an image and outputs the captured image data to the codec LSI 76 in accordance with an instruction from the game device 3. Control instructions for the camera 56, such as an image capturing instruction, are output from the codec LSI 76 to the camera 56.

The sound IC 78 is a circuit which is connected to the speaker 77 and the microphone 79 and controls the input/output of audio data to/from the speaker 77 and the microphone 79. That is, when audio data is received from the codec LSI 76, the sound IC 78 outputs an audio signal obtained by performing D/A conversion on the audio data to the speaker 77, which in turn outputs sound. The microphone 79 detects sound entering the terminal device 7 (the voice of the user, etc.), and outputs an audio signal representing the sound to the sound IC 78. The sound IC 78 performs A/D conversion on the audio signal from the microphone 79, and outputs audio data in a predetermined format to the codec LSI 76.

The codec LSI 76 transmits image data from the camera 56, audio data from the microphone 79, and operation data (terminal operation data) from the UI controller 75 to the game device 3 via the wireless module 80. In the present embodiment, the codec LSI 76 performs a compression process similar to that of the codec LSI 27 on the image data and the audio data. The terminal operation data and the compressed image data and audio data are output, as transmit data, to the wireless module 80. The antenna 81 is connected to the wireless module 80, and the wireless module 80 transmits the transmit data to the game device 3 via the antenna 81. The wireless module 80 has a function similar to that of the terminal communication module 28 of the game device 3. That is, the wireless module 80 has a function of connecting to a wireless LAN by a scheme in conformity with the IEEE 802.11n standard, for example. The transmitted data may or may not be encrypted, as necessary.

As described above, the transmit data transmitted from the terminal device 7 to the game device 3 includes operation data (terminal operation data), image data, and audio data. When another device is connected to the terminal device 7 via the extension connector 58, data received from that device may also be contained in the transmit data. The infrared communication module 82 establishes infrared communication with another device in conformity with the IrDA standard, for example. The codec LSI 76 may transmit, to the game device 3, data received via infrared communication, with that data contained in the transmit data as necessary.

As described above, compressed image data and audio data are transmitted from the game device 3 to the terminal device 7. These pieces of data are received by the codec LSI 76 via the antenna 81 and the wireless module 80. The codec LSI 76 decompresses the received image data and audio data. The decompressed image data is output to the LCD 51, which in turn displays the image. That is, the codec LSI 76 (the CPU 87) displays the received image data on the display section. The decompressed audio data is output to the sound IC 78, which in turn causes the speaker 77 to emit sound.

[5. General Description of Game Process]

Next, a game process executed in the game system 1 of the present embodiment will be generally described. The game in the present embodiment is played by a plurality of players. In the present embodiment, one terminal device 7 and a plurality of controllers 5 are connected to the game device 3 via wireless communication. In the game of the present embodiment, the maximum number of controllers 5 which are allowed to connect to the game device 3 is three.

In the description that follows, the game of the present embodiment is assumed to be played by four players: three first players (first players A to C) who operate the controllers 5 (controllers 5a to 5c), and one second player who operates the terminal device 7.

FIG. 8 is a non-limiting example diagram showing an example television game image displayed on the television 2. FIG. 9 is a non-limiting example diagram showing an example terminal game image displayed on the LCD 51 of the terminal device 7.

As shown in FIG. 8, the screen of the television 2 is divided into four equal regions, in which images 90a, 90b, 90c, and 90d are displayed. As shown in FIG. 8, the television 2 displays first characters 91a, 91b, and 91c, and a second character 92. A plurality of enemy characters (93a to 93c) are also displayed on the television 2.

The first character 91a is a virtual character which is provided in a game space (a three-dimensional (or two-dimensional) virtual world) and is operated by the first player A. The first character 91a holds a sword object 95a and attacks the enemy character 93 using the sword object 95a. The first character 91b is a virtual character which is provided in the game space and is operated by the first player B. The first character 91b holds a sword object 95b and attacks the enemy character 93 using the sword object 95b. The first character 91c is a virtual character which is provided in the game space and is operated by the first player C. The first character 91c holds a sword object 95c and attacks the enemy character 93 using the sword object 95c. The second character 92 is a virtual character which is provided in the game space and is operated by the second player. The second character 92 holds a bow object 96 and an arrow object 97, and attacks the enemy character 93 by shooting the arrow object 97 in the game space. The enemy character 93 is a virtual character which is controlled by the game device 3.

In the game of the present embodiment, the first players A to C and the second player move in the game space while cooperating with each other to kill or beat the enemy character 93. Specifically, the player characters (91a to 91c and 92) move from a game start position to a game end position in the game space while killing or beating the enemy character 93.

As shown in FIG. 8, the television 2 displays the images 90a to 90d in the four equal regions (upper left, lower left, upper right, and lower right regions) into which the screen is divided. Specifically, the upper left region of the screen shows the image 90a, which is an image of the game space as viewed from directly behind the first character 91a operated by the first player A using the controller 5a. The image 90a of the game space is captured by a first virtual camera A which is set based on a position and an orientation in the game space of the first character 91a. A shooting direction of the first virtual camera A is the same as the orientation in the game space of the first character 91a. The upper right region of the screen shows the image 90b, which is an image of the game space as viewed from directly behind the first character 91b operated by the first player B using the controller 5b. The image 90b of the game space is captured by a first virtual camera B which is set based on a position and an orientation in the game space of the first character 91b. A shooting direction of the first virtual camera B is the same as the orientation in the game space of the first character 91b. The lower left region of the screen shows the image 90c, which is an image of the game space as viewed from directly behind the first character 91c operated by the first player C using the controller 5c. The image 90c of the game space is captured by a first virtual camera C which is set based on a position and an orientation in the game space of the first character 91c. A shooting direction of the first virtual camera C is the same as the orientation in the game space of the first character 91c. The lower right region of the screen shows the image 90d, which is an image of the game space as viewed from diagonally behind the second character 92 operated by the second player using the terminal device 7. The image 90d of the game space is captured by a second virtual camera which is set based on a position and an orientation in the game space of the second character 92. The second virtual camera is located at a predetermined position to the right rear of the second character 92 (above the vicinity of the ground of the game space). Therefore, the image 90d captured by the second virtual camera is an image of the game space containing the second character 92 as viewed diagonally from behind and above the second character.
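
The camera placements described above can be illustrated with a short sketch. The distance, height, and side-offset values are hypothetical (the patent gives no numbers), and the y-up, xz-ground convention follows the coordinate system described later in this section.

```python
import math

def camera_behind(char_pos, char_yaw, distance=4.0, height=1.5, side=0.0):
    """Place a chase camera relative to a character standing on the xz-plane.

    side=0 corresponds to the 'directly behind' first virtual cameras
    A to C; a nonzero side offset and a larger height correspond to the
    second virtual camera placed diagonally behind and above.
    """
    fx, fz = math.sin(char_yaw), math.cos(char_yaw)   # forward on the xz-plane
    sx, sz = fz, -fx                                  # sideways unit vector
    x = char_pos[0] - distance * fx + side * sx
    y = char_pos[1] + height
    z = char_pos[2] - distance * fz + side * sz
    return (x, y, z)   # the camera then shoots along the character's orientation
```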

In the present embodiment, the first virtual camera A is set directly behind the first character 91a, and therefore, the first character 91a is translucent in the image 90a. As a result, the player can visually recognize a character(s) located deeper in the depth direction of the screen than the first character 91a in the image 90a. This holds true for the other images 90b-90d, etc. The positions of the first virtual cameras A-C may be set at the viewpoints of the first characters 91a-91c.

On the other hand, as shown in FIG. 9, the LCD 51 of the terminal device 7 displays an image 90e of the game space as viewed from the rear of the second character 92. The image 90e is an image of the game space captured by a third virtual camera which is set based on a position and an orientation in the game space of the second character 92. The third virtual camera is located behind the second character 92 (here, also offset slightly rightward from the center line in the front-rear direction of the second character 92). An attitude (shooting direction) of the third virtual camera is set based on the orientation in the game space of the second character 92. As described below, the orientation of the second character 92 is changed based on an input operation performed on the left analog stick 53A and the attitude of the terminal device 7. Therefore, the attitude of the third virtual camera is changed based on the input operation performed on the left analog stick 53A and the attitude of the terminal device 7.

A position in the game space is represented by coordinate values along the axes of a rectangular coordinate system (xyz coordinate system) which is fixed to the game space. The y-axis extends upward along a direction perpendicular to the ground of the game space, and the x- and z-axes extend in parallel to the ground of the game space. The first characters 91a-91c and the second character 92 move on the ground of the game space (xz-plane) while changing their orientations (directions parallel to the xz-plane). The first character 91 automatically moves under a predetermined rule. The second character 92 moves on the ground of the game space while changing its orientation in accordance with an operation performed on the terminal device 7. A control for the position and orientation of the second character 92 will be described below.

Next, the movement (changes in the orientation and position) of the first character 91 will be described. The first character 91 automatically moves on a path which is previously set in the game space. FIG. 10 is a non-limiting example diagram showing movement paths of the first characters 91a-91c. FIG. 10 simply shows the game space as viewed from above. As shown in FIG. 10, the first characters 91a-91c and the enemy characters 93a and 93b are present in the game space. It is assumed that the game start position is in a lower portion of FIG. 10 and the game end position is in an upper portion of FIG. 10. As shown in FIG. 10, paths 98a, 98b, and 98c indicated by dash-dot lines are previously set in the game space. The paths 98a-98c are movement paths of the characters which are not actually displayed on the screen and are internally set in the game device 3.

Specifically, the first character 91a normally automatically moves on the path 98a. The first character 91b normally automatically moves on the path 98b. The first character 91c normally automatically moves on the path 98c. Here, if an enemy character 93 is located within a predetermined range (distance) from the first character 91, the first character 91 leaves the path 98 and approaches or moves toward the enemy character 93 which is present within the predetermined range. For example, as shown in FIG. 10, if a distance between the first character 91a and the enemy character 93a is greater than a predetermined value, the orientation of the first character 91a is set to a direction along the path 98a, and the position of the first character 91a changes with time so that the first character 91a is positioned on the path 98a (time t=t0). In other words, if the distance between the first character 91a and the enemy character 93a is greater than the predetermined value, the first character 91a moves on the path 98a while changing its orientation. If a predetermined period of time has elapsed since time t=t0 (i.e., at time t=t1), the distance between the first character 91a and the enemy character 93a (and 93b) is smaller than or equal to the predetermined value. In this case, the first character 91a begins to move toward the enemy character 93a. That is, the orientation of the first character 91a is changed to a direction from the position of the first character 91a to the position of the enemy character 93a, and the first character 91a moves toward the enemy character 93a. Similarly, since the distance between the first character 91b and the enemy character 93a is smaller than or equal to the predetermined value, the first character 91b also begins to move toward the enemy character 93a. On the other hand, since the distance between the first character 91c and the enemy character 93a is greater than the predetermined value, the first character 91c does not move toward the enemy character 93a and moves on the path 98c.

FIG. 11 is a non-limiting example diagram showing details of the movement of the first character 91a. As shown in FIG. 11, a guide object 94a which moves on the path 98a is provided in the game space. A guide object 94 is provided for each first character 91, and is internally set in the game device 3. The guide object 94 is not actually displayed on the screen. The guide object 94a is used to control the movement of the first character 91a, and automatically moves on the path 98a. If no enemy characters 93 are present around the first character 91a, the first character 91a moves, following the guide object 94a. Specifically, the orientation of the first character 91a is set to a direction from the position of the first character 91a toward the position of the guide object 94a, and the position of the first character 91a is changed to be closer to the position of the guide object 94a. On the other hand, if an enemy character 93 is present around the first character 91a, the first character 91a approaches or moves toward the enemy character 93. In other words, if an enemy character 93 is present around the first character 91a, the first character 91a moves toward the enemy character 93. If the enemy character 93 is killed or beaten, so that no enemy characters 93 are present around the first character 91a, the first character 91a moves again, following the guide object 94a. Specifically, as shown in FIG. 11, at time t=t1, the guide object 94a is present on the path 98a, and the first character 91a is also located on the path 98a. Here, at time t=t1, if the distance between the first character 91a and an enemy character 93 is smaller than or equal to the predetermined value, the first character 91a begins to move toward the enemy character 93. If a predetermined period of time has elapsed since time t=t1 (i.e., at time t=t2), the first character 91a leaves the path 98a. Thereafter, if another predetermined period of time has elapsed, the first character 91a moves to a position in the vicinity of the enemy character 93, and fights with the enemy character 93. During this fighting, the guide object 94a moves on the path 98a while the distance between the first character 91a and the guide object 94a is prevented from becoming greater than or equal to a predetermined value. If the fighting between the first character 91a and the enemy character 93 has continued for a long period of time, the guide object 94a stops. If the first character 91a kills or beats the enemy character 93 at time t=t3, so that there are no enemy characters 93 around the first character 91a, the first character 91a resumes moving, following the guide object 94a (toward the guide object 94a).

Thus, each first character 91 normally automatically moves in the game space, following the corresponding guide object 94, and when an enemy character 93 is present within the predetermined range, moves toward the enemy character 93.
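
The follow-or-engage rule described above can be summarized in a short sketch. The following Python fragment is illustrative only; ENGAGE_DISTANCE (standing in for the "predetermined range"), the speed parameter, and all function names are hypothetical, and the patent does not disclose an implementation:

import math

ENGAGE_DISTANCE = 10.0  # hypothetical stand-in for the "predetermined range"

def update_first_character(char_pos, guide_pos, enemy_positions, speed=0.5):
    """Follow-or-engage rule for a first character 91: head toward an enemy
    character 93 within ENGAGE_DISTANCE if one exists; otherwise follow the
    guide object 94 along its preset path."""
    target = guide_pos
    for enemy_pos in enemy_positions:
        if math.dist(char_pos, enemy_pos) <= ENGAGE_DISTANCE:
            target = enemy_pos
            break
    dx, dz = target[0] - char_pos[0], target[1] - char_pos[1]
    length = math.hypot(dx, dz)
    if length == 0.0:
        return char_pos, None  # already at the target; keep the position
    facing = (dx / length, dz / length)          # new orientation (unit vector)
    new_pos = (char_pos[0] + facing[0] * speed,  # step toward the target
               char_pos[1] + facing[1] * speed)
    return new_pos, facing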

FIG. 12 is a non-limiting example diagram showing the image 90a which is displayed in the upper left region of the television 2 when the first character 91a begins to move toward a plurality of enemy characters 93. As shown in FIG. 12, when the distance between the first character 91a and an enemy character 93 is smaller than or equal to the predetermined value, then if a plurality of enemy characters 93 are present, the image 90a shows a selection object 99a. The selection object 99a is displayed above the head of an enemy character 93a which is to be attacked by the first character 91a. In other words, the selection object 99a indicates a target to be attacked by the first character 91a. The first character 91a automatically approaches or moves toward the enemy character 93a selected by the selection object 99a (without the first player A specifying a direction). When the first player A operates the cross button 32a of the controller 5, the position of the selection object 99a is changed so that the selection object 99a is displayed above the head of another enemy character 93b. As a result, the first player A switches the attack target from one enemy character 93 to another.

As shown in FIG. 10, if the distance between the first character 91b and the enemy character 93 is smaller than or equal to the predetermined value, the first character 91b also moves toward the enemy character 93. Although not shown, similar to FIG. 12, the image 90b displayed in the upper right region of the television 2 shows a selection object 99b in addition to the first character 91b and the enemy characters 93a and 93b. In this case, the image 90a also shows the selection object 99b indicating a target to be attacked by the first character 91b. The selection object 99b is displayed in a display form different from that of the selection object 99a. For example, if the first character 91a is displayed in red, the selection object 99a is displayed in red, and if the first character 91b is displayed in blue, the selection object 99b is displayed in blue. As a result, by viewing the image 90a, the first player A can recognize the attack target of the first character 91a operated by himself or herself and the attack target of the first character 91b operated by the first player B. That is, by viewing the images 90a-90c, each player operating the controller 5 can simultaneously recognize which of the enemy characters 93 is a target to be attacked by himself or herself and which of the enemy characters 93 is a target to be attacked by the other players.

The first character 91 and the second character 92 attack the enemy character 93 as follows. That is, the first character 91a attacks the enemy character 93 using the sword object 95a. When the first player A swings the controller 5a, the first character 91a performs a motion of swinging the sword object 95a. Specifically, the attitude in the game space of the sword object 95a is changed, corresponding to a change in the attitude in the real space of the controller 5a. For example, when the first player A swings the controller 5a from left to right, the first character 91a performs a motion of swinging the sword object 95a from left to right. When the sword object 95a is swung, then if an enemy character 93 is present within a short distance (a distance corresponding to the length of the sword object 95a) in front of the first character 91a, the sword object 95a hits the enemy character 93, i.e., the attack is successful. If a predetermined number of attacks on the enemy character 93 are successful, the enemy character 93 is killed or beaten. Similarly, the first characters 91b and 91c attack the enemy character 93 using the sword objects 95b and 95c, respectively.

On the other hand, the second character 92 shoots the arrow object 97 in the game space to attack the enemy character 93. For example, when the second player slides the right analog stick 53B of the terminal device 7 in a predetermined direction (e.g., the down direction) using his or her finger, a circular sight is displayed on the LCD 51 of the terminal device 7. In this case, when the second player releases the right analog stick 53B, the right analog stick 53B returns to the original position (the center position). As a result, the arrow object 97 is shot in the game space from the position of the second character 92 toward the center of the circular sight displayed on the LCD 51. Thus, the second character 92 can attack the enemy character 93 at a long distance from the second character 92 by shooting the arrow object 97. As shown in FIG. 8, the number of remaining arrow objects 97 is displayed on the television 2 (in FIG. 8, the number of remaining arrow objects 97 is four). When the second player performs a predetermined operation on the terminal device 7 (e.g., the back surface of the terminal device 7 is caused to face toward the ground), an arrow object 97 is reloaded, and the number of remaining arrow objects 97 becomes a predetermined value.

The enemy character 93 attacks the player characters (the first character 91 and the second character 92). If one of the player characters (here, four characters) is killed or beaten by a predetermined number of attacks from the enemy character 93, the game is over. Therefore, the players enjoy playing the game by cooperating with each other to kill or beat the enemy character 93 so that none of the player characters is killed or beaten by the enemy character 93.

Next, a control of the position and orientation of the second character 92 will be described. The second character 92 moves based on an input operation performed on the left analog stick 53A of the terminal device 7. FIG. 13 is a non-limiting example diagram showing a movement and a rotation of the second character 92 based on an input direction of the left analog stick 53A. In FIG. 13, an AY-axis direction indicates the up direction of the left analog stick 53A (the Y-axis direction in (a) of FIG. 5), and an AX-axis direction indicates the right direction of the left analog stick 53A (the X-axis negative direction in (a) of FIG. 5). Specifically, when the left analog stick 53A is slid in the up direction, the second character 92 moves forward (i.e., moves in the depth direction of the screen of FIG. 9, away from the player). When the left analog stick 53A is slid in the down direction, the second character 92 retreats or moves backward without changing its orientation (i.e., moves toward the player in the depth direction of the screen of FIG. 9 while still facing away from the player). Thus, when the up direction is input using the left analog stick 53A, the second character 92 moves forward, and when the down direction is input using the left analog stick 53A, the second character 92 retreats or moves backward.

The orientation of the second character 92 is changed based on a first input operation and a second input operation. The first input operation is performed on the left analog stick 53A. Specifically, as shown in FIG. 13, when the left analog stick 53A is slid in the right direction, the second character 92 turns clockwise. That is, when the left analog stick 53A is slid in the right direction, the second character 92 rotates clockwise as viewed from above in the game space (in this case, only the orientation of the second character 92 is changed, and the position of the second character 92 is not changed). When the left analog stick 53A is slid in the left direction, the second character 92 turns counterclockwise. When the left analog stick 53A is slid in a diagonal direction, the second character 92 moves while turning clockwise or counterclockwise. For example, when the left analog stick 53A is slid diagonally upward and to the right (in an upper right direction), the second character 92 moves forward while turning clockwise.

The second input operation is performed by changing the attitude in the real space of the terminal device 7. That is, the orientation of the second character 92 is changed based on a change in the attitude of the terminal device 7. FIG. 14 is a non-limiting example diagram showing the terminal device 7 as viewed from above in the real space, indicating a change in the attitude in the real space of the terminal device 7. As shown in FIG. 14, when the terminal device 7 is rotated about the Y-axis (rotated about the axis of the gravity direction from the attitude of FIG. 6), the orientation in the game space of the second character 92 is changed based on the amount of the rotation. For example, when the terminal device 7 is rotated clockwise by a predetermined angle as viewed from above in the real space, the second character 92 is also rotated clockwise by the predetermined angle as viewed from above in the game space. For example, when the second player causes the back surface of the terminal device 7 (the surface opposite to the surface on which the LCD 51 is provided) to face in a right direction, the second character 92 faces in a right direction. On the other hand, when the back surface of the terminal device 7 is caused to face in an up direction (the terminal device 7 is rotated about the X-axis), the orientation of the second character 92 is not changed. In other words, the orientation of the second character 92 is set to be parallel to the ground (xz-plane) of the game space, and therefore, even when the back surface of the terminal device 7 is caused to face in an up direction, the second character 92 does not face in an up direction in the game space. In another embodiment, the orientation of the second character 92 may be changed along an up-down direction (a direction parallel to the y-axis) in the game space.

Thus, the second character 92 is caused to move based on the up and down directions input using the left analog stick 53A (a sliding operation in the up and down directions). The orientation of the second character 92 is changed based on the left and right directions input using the left analog stick 53A and a change in the attitude of the terminal device 7.

The attitudes of the second and third virtual cameras are changed based on a change in the orientation of the second character 92. Specifically, the shooting direction vector of the second virtual camera is set to have a fixed angle with respect to the orientation (front direction vector) of the second character 92. As a result, the television 2 displays the image 90d, which is an image of the game space containing the second character 92, captured from a position at the right rear of and above the second character 92. The orientation in the xz-plane of the shooting direction vector of the third virtual camera (the orientation of a vector obtained by projecting the shooting direction vector onto the xz-plane of the game space) is set to be the same as the orientation of the second character 92. The orientation in the up-down direction (the direction parallel to the y-axis) of the shooting direction vector of the third virtual camera is set based on the attitude of the terminal device 7. For example, when the back surface of the terminal device 7 is caused to face upward in the real space, the shooting direction vector of the third virtual camera is also set to face upward in the game space. Therefore, when the second player rotates the terminal device 7 clockwise as shown in FIG. 14, the third virtual camera is also rotated clockwise, so that an image of a portion of the game space that was to the right before the rotation is displayed on the LCD 51 of the terminal device 7. When the second player causes the back surface of the terminal device 7 to face upward in the real space, an image of an upper region of the game space is displayed on the LCD 51 of the terminal device 7.

Here, as shown in FIG. 15, when the enemy character 93 is present directly behind the second character 92, the second player tries to cause the second character 92 to turn to face in the opposite direction from the original orientation in order to attack the enemy character 93. FIG. 15 is a non-limiting example diagram showing the image 90d displayed in the lower right region of the television 2 when the enemy character 93 is present directly behind the second character 92. In order to cause the second character 92 to turn to face in the opposite direction, the second player operates the left analog stick 53A of the terminal device 7 while viewing the screen of the television 2 or the screen of the terminal device 7. In this case, the enemy character 93 is displayed below the second character 92 on the screen of the television 2, and therefore, the second player slides the left analog stick 53A in the down direction. As described above, when an input operation performed on the left analog stick 53A is in the down direction, the second character 92 retreats or moves backward (the position of the second character 92 moves backward while the orientation of the second character 92 is not changed). Alternatively, to cause the second character 92 to turn to face in the opposite direction, the second player must rotate the terminal device 7 to a large extent (about the Y-axis). Thus, when the second character 92 is caused to turn to face in the opposite direction, the difficulty of the operation may increase.

Therefore, in the present embodiment, when the enemy character 93 is present behind the second character 92, the following two processes (a first process and a second process) are performed in order to more easily cause the second character 92 to face the enemy character 93. That is, in the first process, if the enemy character 93 is present behind the second character 92, then when the down direction is input using the left analog stick 53A, the second character 92 is caused to turn. Specifically, a back direction degree BR of the enemy character 93 with respect to the second character 92 is calculated, and based on the back direction degree BR, the second character 92 is caused to turn. Here, the term "back direction degree BR" refers to a value which indicates to what degree the enemy character 93 is present behind the second character 92 (how close the enemy character 93 is to a position directly behind the second character 92), and changes depending on an angle determined by the second character 92 and the enemy character 93. Specifically, the back direction degree BR increases as the enemy character 93 comes closer to a position directly behind the second character 92. For example, when the enemy character 93 is located directly behind the second character 92 (the enemy character 93 is located in a direction of 180 degrees, where the front direction of the second character 92 is zero degrees), the back direction degree BR is one. When the enemy character 93 is located to the right or left of the second character 92 (the enemy character 93 is located in a direction of 90 or −90 degrees, where the front direction of the second character 92 is zero degrees), the back direction degree BR is zero. When the enemy character 93 is located in a direction of 90 to 180 degrees (or −90 to −180 degrees), the back direction degree BR ranges from zero to one. Although details will be described below, the amount of turn (the amount of rotation) of the second character 92 increases with an increase in the back direction degree BR.

In the second process, when the enemy character 93 is present behind the second character 92, then if the terminal device 7 is rotated in a direction which causes the second character 92 to turn toward the enemy character 93, the second character 92 is rotated by a larger rotation amount than the actual rotation amount of the terminal device 7. For example, when the enemy character 93 is located directly behind the second character 92, the second character 92 may be rotated by a rotation amount which is 1.5 times as large as the rotation amount about the Y-axis of the terminal device 7. In that case, when the enemy character 93 is present directly behind the second character 92, then if the terminal device 7 is rotated clockwise by an angle (e.g., 120 degrees) smaller than 180 degrees, the second character 92 is caused to face the enemy character 93. Thereafter, if the terminal device 7 is rotated to the right by a predetermined angle (e.g., 30 degrees), the second character 92 is caused to turn in a direction away from the enemy character 93, and therefore, the second character 92 turns by exactly the predetermined angle (e.g., 30 degrees).

As described above, in the present embodiment, when the enemy character 93 is present behind the second character 92, a process is performed that makes it easier to cause the second character 92 to face the enemy character 93.

[6. Details of Game Process]

Next, the game process executed in the game system will be described in detail. Firstly, various data items used in the game process will be described. FIG. 16 is a non-limiting example diagram showing data items used in the game process. FIG. 16 shows main data items stored in a main memory (an external main memory 12 or an internal main memory 11e) of the game device 3. As shown in FIG. 16, the main memory of the game device 3 stores a game program 100, controller operation data 110, terminal operation data 120, and processing data 130. The main memory also stores, in addition to the data items of FIG. 16, data items required for the game, such as image data of objects appearing in the game, audio data used in the game, etc.

The whole or a part of the game program 100 is read from the optical disc 4 into the main memory with appropriate timing after the game device 3 is turned on. The game program 100 may be obtained from the flash memory 17 or a device external to the game device 3 (via, for example, the Internet) instead of from the optical disc 4. A part of the game program 100 (e.g., a program for calculating the attitudes of the controller 5 and/or the terminal device 7) may be previously stored in the game device 3.

The controller operation data 110 represents an operation performed on the controller 5. The controller operation data 110 is output (transmitted) from the controller 5 based on an operation performed on the controller 5. The controller operation data 110 is transmitted by the controller 5, received by the game device 3, and stored in the main memory. The controller operation data 110 includes angular velocity data 111, operation button data 112, and acceleration data 113. The game device 3 obtains operation data from a plurality of controllers 5 (specifically, the controllers 5a-5c), and stores the controller operation data 110 transmitted from each of the controllers 5 into the main memory. A predetermined number of the most recent (latest) pieces of controller operation data 110 may be stored in the form of time series data for each controller 5.

The angular velocity data 111 represents an angular velocity detected by the gyroscopic sensor 48 of the controller 5. Here, the angular velocity data 111 represents an angular velocity about each axis of an X1Y1Z1 coordinate system (see FIG. 3) specific to the controller 5. Thus, in the present embodiment, the controller 5 includes the gyroscopic sensor 48, and the controller operation data 110 includes the angular velocity data 111 as a physical quantity for calculating the attitude of the controller 5. Therefore, the game device 3 can accurately calculate the attitude of the controller 5 based on the angular velocity. Specifically, the game device 3 calculates the attitude (rotation angles about the axes of the X1Y1Z1 coordinate system from the initial attitude) of the controller 5 by integrating, with respect to time, the angular velocity about each of the X1-, Y1-, and Z1-axes detected by the gyroscopic sensor 48.

The operation button data 112 represents input states of the operation buttons 32a-32i provided on the controller 5. Specifically, the operation button data 112 represents whether or not each of the operation buttons 32a-32i has been pressed down.

The acceleration data 113 represents an acceleration detected by the acceleration sensor 37 of the controller 5. Here, the acceleration data 113 represents an acceleration along each axis of the X1Y1Z1 coordinate system specific to the controller 5.

The terminal operation data 120 represents an operation performed on the terminal device 7. The terminal operation data 120 is output (transmitted) from the terminal device 7 based on an operation performed on the terminal device 7. The terminal operation data 120 is transmitted by the terminal device 7, received by the game device 3, and stored in the main memory. The terminal operation data 120 includes angular velocity data 121, left stick data 122, right stick data 123, and acceleration data 124. The terminal operation data 120 also includes, in addition to these data items, operation data of each button, and azimuth data representing an azimuth detected by the magnetic sensor 72 of the terminal device 7.

The angular velocity data 121 represents an angular velocity detected by the gyroscopic sensor 74 of the terminal device 7. Here, the angular velocity data 121 represents an angular velocity about each axis of the XYZ coordinate system (see FIG. 5) specific to the terminal device 7.

The left stick data 122 represents input information of the left analog stick 53A. Specifically, the left stick data 122 is represented by a two-dimensional input vector (InX, InY) which indicates an input direction of the left analog stick 53A. InX is a value along the AX-axis of FIG. 13, and InY is a value along the AY-axis. Here, InX and InY range from −1 to 1, and the maximum length of the input vector is 1. For example, when the up direction is input using the left analog stick 53A, the input vector is (0, 1), and when the down direction is input, the input vector is (0, −1). When the right direction is input using the left analog stick 53A, the input vector is (1, 0), and when the left direction is input, the input vector is (−1, 0). The CPU 10 calculates the input vector (InX, InY) based on the operation information of the left analog stick 53A contained in the operation data transmitted from the terminal device 7, and stores the input vector (InX, InY) as the left stick data 122 in the main memory.

The right stick data 123 represents input information of the right analog stick 53B. Similar to the left stick data 122, the right stick data 123 is represented by a two-dimensional vector.

The processing data 130 is used in the game process described below (FIG. 17). The processing data 130 includes first character data 131, second character data 132, enemy character data 133, terminal attitude data 134, first virtual camera data 135, second virtual camera data 136, and third virtual camera data 137. The processing data 130 also includes, in addition to the data items of FIG. 16, various data items used in the game process, such as data representing the attitude of the controller 5, data representing parameters set for objects appearing in the game, etc.

The first character data 131 represents various information items relating to the first character 91, including data representing the position and orientation (attitude) in the game space of each of the first characters 91a-91c. The first character data 131 also includes data representing the vitality of each first character, data representing the attitudes of the sword objects 95a-95c, data representing the positions of the guide objects 94a-94c corresponding to the respective first characters, and data representing a target to be attacked by each first character.

The second character data 132 represents various information items relating to the second character 92, including data representing the position and orientation (attitude) in the game space of the second character 92. The second character data 132 also includes data representing the number of remaining arrow objects 97, data representing the vitality of the second character 92, etc.

The enemy character data 133 represents various information items relating to the enemy characters 93, including the position and orientation in the game space of each enemy character 93. The enemy character data 133 also includes data representing the vitality of each enemy character 93, etc.

The terminal attitude data 134 represents the attitude of the terminal device 7. The attitude of the terminal device 7 may, for example, be represented by a rotation matrix indicating a rotation from a basic attitude (e.g., the attitude of FIG. 6) to the current attitude, or by rotation angles about the X-, Y-, and Z-axes. The terminal attitude data 134 is calculated based on the angular velocity data 121 contained in the terminal operation data 120 from the terminal device 7. Specifically, the terminal attitude data 134 is calculated by integrating, with respect to time, the angular velocity about each of the X-, Y-, and Z-axes detected by the gyroscopic sensor 74.
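
As an illustration of this integration, the following Python sketch accumulates per-frame rotation angles from gyroscope samples. The frame time and all names are hypothetical; an actual implementation would typically integrate all three axes into a rotation matrix and may apply the acceleration-based correction mentioned in step S32 below:

FRAME_TIME = 1.0 / 60.0  # one frame time, matching the process loop described below

def integrate_attitude(angles_deg, angular_velocity_deg_per_s):
    """Accumulate rotation angles about the X-, Y-, and Z-axes from one
    gyroscope sample (a simplified stand-in for a rotation matrix)."""
    return tuple(a + w * FRAME_TIME
                 for a, w in zip(angles_deg, angular_velocity_deg_per_s))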

The first virtual camera data 135 represents the positions and attitudes in the game space of the first virtual cameras A-C which are set (fixed) behind the first characters 91a-91c, respectively. As described above, the first virtual camera A is set behind the first character 91a, and the shooting direction of that virtual camera is set to be the same as the orientation of the first character 91a. The first virtual camera B is set behind the first character 91b, and the shooting direction of that virtual camera is set to be the same as the orientation of the first character 91b. The first virtual camera C is set behind the first character 91c, and the shooting direction of that virtual camera is set to be the same as the orientation of the first character 91c.

The second virtual camera data 136 represents the position and attitude in the game space of the second virtual camera which is set (fixed) at the right rear of the second character 92.

The third virtual camera data 137 represents the position and attitude in the game space of the third virtual camera which is set (fixed) behind the second character 92.

(Description of Flowchart)

Next, the game process executed in the game device 3 will be described in detail with reference to FIGS. 17-21. FIG. 17 is a non-limiting example main flowchart showing a flow of the game process executed in the game device 3. When the game device 3 is turned on, the CPU 10 of the game device 3 executes a boot program stored in a boot ROM (not shown) to initialize units such as the main memory. The game program stored in the optical disc 4 is read into the main memory, and the CPU 10 begins to execute the game program. The process of the flowchart of FIG. 17 is executed after the above process has been completed. In the game device 3, the game program may be executed immediately after the game device 3 is turned on, or alternatively, a built-in program for displaying a predetermined menu screen may be initially executed after the game device 3 is turned on, and thereafter, the game program may be executed in response to, for example, an instruction to start the game which is issued by a user's selection operation on the menu screen.

The steps of the flowcharts of FIGS. 17-21 are merely illustrative, and the order in which the steps are performed may be changed as long as similar advantages are obtained. The values of variables, constants, etc., are also merely illustrative, and other values may be optionally used. In the present embodiment, it is assumed that the steps of the flowcharts are executed by the CPU 10. Alternatively, a part of the steps may be executed by a processor or a dedicated circuit other than the CPU 10.

In step S1, the CPU 10 executes an initial process. The initial process is used to construct a virtual game space, arrange objects appearing in the game space (the first and second characters, the enemy characters, the virtual cameras, other objects put in the game space, etc.) at initial positions, and set various parameters used in the game process to initial values. For example, the CPU 10 initializes the front direction vector indicating the orientation of the second character 92 contained in the second character data 132, a rotation angle to be input using the left analog stick 53A, and a rotation angle to be input using the attitude of the terminal device 7. The CPU 10 also initializes the back direction degree BR (described in detail below). After step S1, the CPU 10 executes step S2. Thereafter, a loop of steps S2-S9 is repeatedly executed at a rate of once per predetermined period of time (one frame time, e.g., 1/60 sec).

In step S2, the CPU 10 obtains operation data which has been transmitted from the terminal device 7 and the three controllers 5 and stored in the main memory. The terminal device 7 and the controllers 5 repeatedly transmit operation data (terminal operation data and controller operation data) to the game device 3. In the game device 3, the terminal transmission module 28 sequentially receives terminal operation data, which is then sequentially stored into the main memory by the input/output processor 11a. The controller communication module 19 sequentially receives controller operation data, which is then sequentially stored into the main memory by the input/output processor 11a. A transmission/reception interval between the controller 5 and the game device 3 and a transmission/reception interval between the terminal device 7 and the game device 3 are preferably shorter than a game processing time (one frame time), and are one two-hundredth of a second, for example. In step S2, the CPU 10 reads the latest controller operation data 110 and the latest terminal operation data 120 from the main memory. After step S2, step S3 is executed.

In step S3, the CPU 10 executes a rotation process. The rotation process rotates, in the game space, the second character 92 which is operated using the terminal device 7. The rotation process will be described in detail hereinafter with reference to FIG. 18.

FIG. 18 is a non-limiting example flowchart showing a detailed flow of the rotation process (step S3) of FIG. 17.

In step S11, the CPU 10 finds or selects the enemy character 93 which is located within a predetermined distance from the second character 92 and whose distance from the second character 92 is smallest. Specifically, the CPU 10 calculates a distance between the second character 92 and each enemy character 93 by referencing the second character data 132 and the enemy character data 133. Thereafter, the CPU 10 finds or selects the enemy character 93 whose distance from the second character 92 is smallest and is smaller than or equal to the predetermined value. Even when an enemy character 93 has the smallest distance from the second character 92, if that distance exceeds the predetermined value, the CPU 10 does not select the enemy character 93. Next, the CPU 10 executes step S12.
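
A minimal sketch of this selection (step S11) in Python, with hypothetical names and a hypothetical threshold SELECT_DISTANCE standing in for the "predetermined value":

import math

SELECT_DISTANCE = 20.0  # hypothetical stand-in for the "predetermined value"

def select_enemy(second_char_pos, enemy_positions):
    """Step S11: return the index of the enemy character nearest to the
    second character, or None if no enemy is within SELECT_DISTANCE."""
    best_index, best_dist = None, SELECT_DISTANCE
    for i, enemy_pos in enumerate(enemy_positions):
        d = math.dist(second_char_pos, enemy_pos)
        if d <= best_dist:
            best_index, best_dist = i, d
    return best_index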

In step S12, the CPU 10 determines whether or not an enemy character 93 has been selected. If the determination result is positive (i.e., an enemy character 93 has been selected in step S11), the CPU 10 next executes step S13. On the other hand, if the determination result is negative, the CPU 10 next executes step S14.

In step S13, the CPU 10 calculates the back direction degree BR. The back direction degree BR changes depending on an angle determined by the second character 92 and the enemy character 93. FIG. 22 is a non-limiting example diagram showing the second character 92 and the enemy character 93 as viewed from above in the game space, and the angle determined by the second character 92 and the enemy character 93. Specifically, the CPU 10 calculates an angle DEG formed by a vector (front direction vector) indicating the orientation of the second character 92 and a vector from the second character 92 toward the enemy character 93. When the absolute value of the angle DEG determined by the second character 92 and the enemy character 93 is greater than 90 degrees, the CPU 10 calculates the back direction degree BR by:
the back direction degree BR = (|DEG| − 90) / 90  (1)

When the absolute value of the angle DEG is smaller than or equal to 90 degrees, the CPU 10 sets the back direction degree BR to zero. That is, the back direction degree BR is one when the enemy character 93 is located directly behind the second character 92 (180 degrees), and zero when the enemy character 93 is located in front of a line extending through the second character 92 in the left-right direction. The CPU 10 stores the calculated back direction degree BR into the main memory, and next executes step S15.
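
Expression (1) together with the 90-degree cutoff can be written compactly; the following Python sketch is illustrative, with a hypothetical function name:

def back_direction_degree(deg):
    """Expression (1): DEG is the signed angle (in degrees, -180 to 180)
    between the second character's front direction vector and the vector
    toward the enemy character. BR is 0 when the enemy is at or in front
    of the character's left-right line, rising to 1 directly behind."""
    a = abs(deg)
    return (a - 90.0) / 90.0 if a > 90.0 else 0.0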

On the other hand, in step S14, since no enemy character 93 is present around the second character 92 (within the predetermined range), the CPU 10 sets the back direction degree BR to zero. The CPU 10 next executes step S15.

In step S15, the CPU 10 calculates a rotation angle which is input using the stick. In step S15, the CPU 10 calculates a rotation angle of the second character 92 based on an input operation (the first input operation) which has been performed using the left analog stick 53A. The stick rotation angle calculation process will be described in detail hereinafter with reference to FIG. 19.

FIG. 19 is a non-limiting example flowchart showing a detailed flow of the stick rotation angle calculation process (step S15) of FIG. 18.

In step S21, the CPU 10 obtains the input vector (InX, InY) (the left stick data 122) indicating an input operation performed on the left analog stick 53A by referencing the main memory. Next, the CPU 10 executes step S22.

In step S22, the CPU 10 calculates an angle Di and a length Len of the input vector. The angle Di of the input vector is measured with reference to the up direction of the left analog stick 53A. Specifically, the CPU 10 calculates the angle Di by:
Di = arccos(InY / Len)  (2)

If InX is negative, the CPU 10 multiplies Di calculated by expression (2) by −1 and stores the result as the angle Di into the main memory. If Len is zero, Di = 0. Next, the CPU 10 executes step S23.
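
Steps S21 and S22 amount to converting the stick's input vector into polar form. A short Python sketch under those definitions (names are hypothetical):

import math

def stick_angle_and_length(in_x, in_y):
    """Expression (2): angle Di (degrees) of the input vector measured
    from the stick's up direction, signed negative when InX is negative,
    together with the vector's length Len."""
    length = math.hypot(in_x, in_y)
    if length == 0.0:
        return 0.0, 0.0  # Di = 0 when there is no input
    di = math.degrees(math.acos(in_y / length))
    return (-di if in_x < 0 else di), length

For the up input (0, 1) this yields Di = 0, and for the right input (1, 0) it yields Di = 90, consistent with the definitions above.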

In step S23, the CPU 10 calculates a value X based on the angle Di and the length Len of the input vector. Specifically, the CPU 10 calculates the value X by:
X = (Di / 90) × Len  (3)
Here, the CPU 10 sets the value X to 1 if X > 1 and to −1 if X < −1. Next, the CPU 10 executes step S24. In step S24, the CPU 10 calculates a value OutX by adjusting the value X based on the back direction degree BR (expression (4)). If the enemy character 93 is located directly behind the second character 92, then when the down direction is input using the left analog stick 53A, X > InX according to expression (3), and therefore OutX > InX. That is, if the enemy character 93 is located directly behind the second character 92, the second character 92 is caused to turn by a larger rotation angle than an actual rotation angle corresponding to an input operation performed on the left analog stick 53A. The back direction degree BR increases as the angle determined by the second character 92 and the enemy character 93 becomes closer to 180 degrees. Therefore, as the enemy character 93 comes closer to a position directly behind the second character 92, OutX increases, and accordingly the angle by which the second character 92 is rotated increases.

If the enemy character 93 is located directly behind the second character 92, then when the upper right direction is input using the left analog stick 53A, OutX = X according to expression (4). Here, if the angle Di of the input vector is 30 degrees and the length Len is 1, InX = ½. On the other hand, if the angle Di = 30 degrees and the length Len = 1, X = ⅓ according to expression (3). That is, if the enemy character 93 is located directly behind the second character 92, then when the upper right direction is input using the left analog stick 53A, the value OutX may be smaller than the value InX. This means that if the enemy character 93 is located directly behind the second character 92, then when the second player inputs the upper right direction using the left analog stick 53A, the clockwise turning of the second character 92 is weakened. That is, in this case, the second character 92 is rotated clockwise by a smaller rotation angle than an actual input rotation angle, and therefore, the second character 92 is more easily caused to move forward. If the enemy character 93 is located directly behind the second character 92, then when the up direction is input using the left analog stick 53A, it is considered that the second player is deliberately operating the stick in order to cause the second character 92 to escape from the enemy character 93. Therefore, in this case, by facilitating forward movement of the second character 92, the second player can cause the second character 92 to move in his or her intended manner.

Thus, the actual input value InX of the left analog stick 53A is adjusted based on the relative positional relationship between the second character 92 and the enemy character 93 to calculate OutX. After step S24, the CPU 10 executes step S25.

In step S25, the CPU 10 calculates a rotation angle degS of the left analog stick 53A. Specifically, the CPU 10 calculates the rotation angle degS by:
degS = −OutX × constant S  (5)
where the constant S is a predetermined value which relates to the rotation speed of the second character 92 when it is rotated by an operation performed on the left analog stick 53A. The sign of OutX is inverted in expression (5) in order to match the rotation direction calculated by the gyroscopic sensor rotation angle calculation process described below. After step S25, the CPU 10 ends the stick rotation angle calculation process of FIG. 19.
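
Putting steps S23-S25 together, a Python sketch of the stick rotation angle follows. Because expression (4) is not reproduced in full here, the adjustment from X to OutX is supplied by the caller; CONSTANT_S and all names are hypothetical:

CONSTANT_S = 3.0  # hypothetical stand-in for the constant S of expression (5)

def stick_rotation_angle(di, length, adjust_out_x):
    """Steps S23-S25: compute X by expression (3), clamp it to [-1, 1],
    adjust it to OutX (expression (4), supplied by the caller), and
    convert OutX to the rotation angle degS by expression (5)."""
    x = (di / 90.0) * length          # expression (3)
    x = max(-1.0, min(1.0, x))        # clamping performed in step S23
    out_x = adjust_out_x(x)           # expression (4): depends on BR
    return -out_x * CONSTANT_S        # expression (5)

For instance, stick_rotation_angle(di, length, lambda x: x) applies no adjustment, which per the description above corresponds to the case OutX = X.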

Referring back to FIG. 18, the CPU 10 next executes step S16. In the gyroscopic sensor rotation angle calculation process of step S16, the CPU 10 calculates a rotation angle of the second character 92 based on an input operation (the second input operation) performed by changing the attitude of the terminal device 7. The gyroscopic sensor rotation angle calculation process will be described in detail hereinafter with reference to FIG. 20.

FIG. 20 is a non-limiting example flowchart showing a detailed flow of the gyroscopic sensor rotation angle calculation process (step S16) of FIG. 18.

In step S31, by referencing the angular velocity data 121 in the main memory, the CPU 10 obtains detection values (values indicating angular velocities about the X-, Y-, and Z-axes) of the gyroscopic sensor 74 of the terminal device 7. Next, the CPU 10 executes step S32.

In step S32, the CPU 10 calculates the attitude of the terminal device 7. Specifically, the CPU 10 calculates (obtains) the attitude of the terminal device 7 based on the detection values obtained in step S31. The CPU 10 can calculate rotation angles about the X-, Y-, and Z-axes from the initial attitude by integrating, with respect to time, the angular velocity about each of the X-, Y-, and Z-axes detected by the gyroscopic sensor 74. The CPU 10 stores the calculated data representing the attitude of the terminal device 7 as the terminal attitude data 134 in the main memory. In step S32, the CPU 10 may correct the attitude of the terminal device 7 based on the acceleration data 124 (an acceleration detected by the acceleration sensor 73 of the terminal device 7). Next, the CPU 10 executes step S33.

In step S33, the CPU 10 calculates a rotation angle degG in the yaw direction (about the Y-axis) during one frame, based on the attitude of the terminal device 7 calculated in the previous frame (the attitude calculated in step S32 in the previous process loop) and the attitude of the terminal device 7 calculated in the current frame. Next, the CPU 10 executes step S34.

In step S34, the CPU 10 determines whether or not the rotation direction indicated by the rotation angle degG is a direction in which the second character 92 comes to face the enemy character 93. Here, the CPU 10 determines whether or not the second character 92 is rotated to face the enemy character 93 when the second character 92 is rotated by the rotation angle degG calculated in step S33. For example, if the angle determined by the second character 92 and the enemy character 93 is 150 degrees, then when the rotation angle degG calculated in step S33 is 10 degrees (clockwise rotation by 10 degrees), the CPU 10 determines that the rotation direction indicated by the rotation angle degG is a direction in which the second character 92 faces the enemy character 93. On the other hand, if the angle determined by the second character 92 and the enemy character 93 is 150 degrees, then when the rotation angle degG calculated in step S33 is −10 degrees (counterclockwise rotation by 10 degrees), the CPU 10 does not determine that the rotation direction indicated by the rotation angle degG is a direction in which the second character 92 faces the enemy character 93. If the determination result is positive, the CPU 10 next executes step S35. If the determination result is negative, the CPU 10 ends the gyroscopic sensor rotation angle calculation process of FIG. 20.

In step S35, the CPU 10 corrects the rotation angle degG so that degG increases. Specifically, the CPU 10 corrects the rotation angle degG by:
degG = degG × (1 + BR × constant G)  (6)
where the constant G is a predetermined positive value which relates to the rotation speed of the second character 92 when it is rotated based on a change in the attitude of the terminal device 7. As can be seen from expression (6), the corrected rotation angle degG increases with an increase in the back direction degree BR.
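
A Python sketch of steps S34-S35 under those definitions (names hypothetical). Note that a constant G of 0.5 at BR = 1 reproduces the 1.5-times amplification given as an example earlier:

CONSTANT_G = 0.5  # hypothetical value of constant G; 0.5 gives 1.5x at BR = 1

def corrected_gyro_angle(deg_g, br, turns_toward_enemy):
    """Steps S34-S35: amplify the per-frame yaw rotation degG by
    expression (6) only when rotating by degG would turn the second
    character toward the enemy character."""
    if turns_toward_enemy:
        deg_g *= 1.0 + br * CONSTANT_G  # expression (6)
    return deg_g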

Thus, when the second character 92 is rotated by rotating the terminal device 7 so that the second character 92 faces the enemy character 93, the rotation angle degG is greater than the actual rotation amount of the terminal device 7. Therefore, the second character 92 is rotated in the game space by a larger rotation amount than the actual rotation amount of the terminal device 7. As a result, when the player rotates the terminal device 7 so that the second character 92 faces the enemy character 93, the player does not need to rotate the terminal device 7 by a large rotation amount. For example, when the enemy character 93 is located directly behind the second character 92, the second character 92 can be rotated by a large angle by rotating the terminal device 7 only slightly. After step S35, the CPU 10 ends the gyroscopic sensor rotation angle calculation process of FIG. 20.

Referring back to FIG. 18, the CPU 10 next executes step S17. In step S17, the CPU 10 causes the second character 92 to rotate based on the stick rotation angle degS calculated in step S15 and the gyroscopic sensor rotation angle degG calculated in step S16. Specifically, the CPU 10 calculates the sum of the rotation angle degS and the rotation angle degG, and rotates the front direction vector of the second character 92 about the y-axis (an axis extending vertically upward from the ground) in the game space by the calculated angle. After step S17, the CPU 10 ends the rotation process of FIG. 18.
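
Step S17 is a plane rotation of the front direction vector. A Python sketch follows, assuming the front direction vector is kept in the xz-plane; the sign convention (which rotation direction a positive angle produces) is an assumption, as the patent does not fix it here:

import math

def rotate_front_vector(front_xz, deg_s, deg_g):
    """Step S17: rotate the (x, z) front direction vector about the
    vertical y-axis by the sum of the stick rotation angle degS and the
    gyroscopic sensor rotation angle degG."""
    theta = math.radians(deg_s + deg_g)
    fx, fz = front_xz
    return (fx * math.cos(theta) - fz * math.sin(theta),
            fx * math.sin(theta) + fz * math.cos(theta))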

Referring back to FIG. 17, the CPU 10 next executes step S4. In step S4, the CPU 10 executes a movement process. In the movement process, the second character 92 and the first character 91 are caused to move in the game space. The movement process will be described in detail hereinafter with reference to FIG. 21.

FIG. 21 is a non-limiting example flowchart showing a detailed flow of the movement process (step S4) of FIG. 17.

In step S41, the CPU 10 determines whether or not the movement direction of the second character 92 is the front direction. Specifically, the CPU 10 determines whether or not the value InY of the input vector is greater than or equal to zero. If the determination result is negative, the CPU 10 next executes step S42. On the other hand, if the determination result is positive, the CPU 10 next executes step S43.

In step S42, the CPU 10 calculates a movement amount of the second character 92 based on the back direction degree BR and the value InY of the stick input vector. Specifically, the CPU 10 calculates the movement amount of the second character 92 so that the movement amount increases with a decrease in the back direction degree BR. More specifically, if the back direction degree BR is one, the movement amount of the second character 92 is zero. If the back direction degree BR is zero, the movement amount of the second character 92 is a value corresponding to the value InY of the input vector. The CPU 10 also calculates the movement amount of the second character 92 so that the movement amount increases with an increase in the absolute value of the value InY of the input vector. After step S42, the CPU 10 executes step S44.

In step S43, the CPU 10 calculates the movement amount of the second character 92 based on the value InY of the input vector. For example, the CPU 10 calculates the product of InY and a predetermined constant as the movement amount of the second character 92. Next, the CPU 10 executes step S44.

In step S44, the CPU 10 causes the second character 92 to move based on the movement amount calculated in step S42 or S43 and the orientation of the second character 92 rotated in step S3. For example, the CPU 10 multiplies the calculated movement amount by a unit vector indicating the orientation of the second character 92 to calculate a movement vector. Thereafter, the CPU 10 adds the movement vector to a position vector indicating the current position of the second character 92, thereby updating the position of the second character 92.
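
A Python sketch of steps S41-S44 under stated assumptions: the backward movement amount is scaled linearly by (1 - BR), one simple curve consistent with the endpoints described in step S42 (zero at BR = 1, full amount at BR = 0); the speed constant and all names are hypothetical:

def move_second_character(pos_xz, front_xz, in_y, br, speed=0.1):
    """Steps S41-S44: forward input (InY >= 0) moves at the full rate;
    backward input is attenuated as the enemy approaches a position
    directly behind the character (BR -> 1)."""
    amount = in_y * speed if in_y >= 0.0 else in_y * speed * (1.0 - br)
    return (pos_xz[0] + front_xz[0] * amount,
            pos_xz[1] + front_xz[1] * amount)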

As described above, the second character 92 is caused to rotate (step S3), and the movement amount of the second character 92 is calculated (step S42 or S43). By controlling the movement of the second character 92 in this way, the second character 92 performs a motion as follows. For example, if the enemy character 93 is located directly behind the second character 92, then when the player inputs the down direction using the left analog stick 53A, the second character 92 does not retreat or move backward (the movement amount is zero) and instead turns. Moreover, if the player continues to input the down direction of the left analog stick 53A, the enemy character 93 comes to be located diagonally behind the second character 92 rather than directly behind it. In this case, the back direction degree BR is smaller than one, and the movement amount of the second character 92 is greater than zero. Therefore, the second character 92 retreats or moves backward while turning. That is, if the enemy character 93 is located directly behind the second character 92, then when the player continues to input the down direction of the left analog stick 53A, the second character 92 initially only turns, and then moves while turning. As a result, the second character 92 rotates while moving and keeping a distance from the enemy character 93. That is, if the enemy character 93 is located behind the second character 92, the player can cause the second character 92 to face the enemy character 93 while keeping a distance from the enemy character 93. This motion can be said to be suitable for an attack using a bow. In other words, when a weapon such as a bow is used to attack a distant enemy, then if the distance to the attack target is excessively short, it is difficult to attack the target. In some instances, it may be better to keep a distance from the attack target.

Next, the CPU 10 executes step S45. By executing steps S45 and S46, the first character 91 is caused to move in the game space.

In step S45, the CPU 10 causes the guide objects 94a-94c corresponding to the first characters 91a-91c to move along the predetermined paths. That is, the CPU 10 causes the guide objects 94a-94c to move along the previously set paths 98a-98c. The CPU 10 controls the position of each guide object 94 so that the distance between the first character 91 and the corresponding guide object 94 does not become greater than or equal to a predetermined value. Next, the CPU 10 executes step S46.

In step S46, the CPU 10 causes the first character 91 to move based on the positions of the guide object 94 and the enemy character 93. Specifically, if no enemy character 93 is present within a predetermined range around the first character 91a, the CPU 10 causes the first character 91a to move, following the guide object 94a (i.e., updates the position and orientation of the first character 91a). When an enemy character 93 is present within the predetermined range, the CPU 10 causes the first character 91a to move toward the enemy character 93 (updates the position and orientation of the first character 91a, and stores the updated values into the main memory). If the first character 91a is located close to the enemy character 93 and is fighting with the enemy character 93, the CPU 10 does not cause the first character 91a to move until the first character 91a kills or beats the enemy character 93. The CPU 10 similarly updates the positions and orientations of the other first characters 91. After step S46, the CPU 10 ends the movement process of FIG. 21.

Referring back to FIG. 17, the CPU 10 next executes step S5. In step S5, the CPU 10 executes a game process. In the game process, a process of causing each character to attack and a process based on the attack result (e.g., a process of decreasing the vitality of the attacked enemy character 93, etc.) are performed. Specifically, the attitude of the controller 5 is calculated based on the angular velocity data 111 (data representing an angular velocity detected by the gyroscopic sensor 48 of the controller 5), and based on this attitude, the attitude of the sword object 95 is determined. Thereafter, based on a positional relationship between the sword object 95 and the enemy character 93, it is determined whether or not the attack on the enemy character 93 is successful (i.e., whether the sword object 95 has hit the enemy character 93). In this case, the CPU 10 may determine whether or not the controller 5 has been swung by referencing the acceleration data 113 (data representing an acceleration detected by the acceleration sensor 37 of the controller 5). The CPU 10 also determines, by referencing the right stick data 123, whether or not to shoot the arrow object 97 in response to an input operation performed on the right analog stick 53B, and based on the determination result, shoots the arrow object 97 in the game space. In this case, for example, the arrow object 97 is shot from the position of the second character 92 toward a position in the game space corresponding to the center of the screen of the LCD 51. The arrow object 97 thus shot is caused to move in the game space on a path which is determined taking the influence of gravity and the like into consideration. Thereafter, it is determined whether or not the moving arrow object 97 has contacted another object (the enemy character 93 or another obstacle), and if so, the arrow object 97 is stopped. Also, a process of causing the enemy character 93 to move in the game space is performed. When the cross button 32a of each controller 5 is pressed, the selection object 99 is moved so that the attack target is changed to another enemy character 93. The CPU 10 also updates the position and orientation of the virtual camera set for each player character (91, 92) based on the position and orientation of each player character (or the attitude of the terminal device 7). The CPU 10 next executes step S6.

In step S6, the CPU 10 executes a process of generating a television game image. In step S6, the images 90a, 90b, 90c, and 90d to be displayed on the television 2 are generated. Specifically, the CPU 10 obtains an image by shooting the game space using the first virtual camera A set behind the first character 91a. Here, if a plurality of enemy characters 93 are present within a predetermined range around the first character 91a, the CPU 10 superimposes the image of the selection object 99a on the obtained image to generate the image 90a to be displayed in the upper left region of the television 2. Similarly, the CPU 10 shoots the game space using the first virtual camera B set behind the first character 91b to generate the image 90b. The CPU 10 shoots the game space using the first virtual camera C set behind the first character 91c to generate the image 90c. The CPU 10 shoots the game space using the second virtual camera set at the right rear of the second character 92 to generate the image 90d. Thereafter, the CPU 10 combines the four generated images 90a-90d to generate a television game image. The image 90a is put in the upper left region of the television game image, the image 90b in the upper right region, the image 90c in the lower left region, and the image 90d in the lower right region. The CPU 10 next executes step S7.
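
The step S6 compositing is a straightforward quadrant layout; a sketch assuming each generated image is an h x w pixel buffer (the function and variable names are illustrative):

    def compose_tv_image(img_a, img_b, img_c, img_d, w, h):
        # Build a (2h x 2w) television image with image 90a upper left,
        # 90b upper right, 90c lower left, and 90d lower right.
        tv = [[None] * (2 * w) for _ in range(2 * h)]
        for y in range(h):
            for x in range(w):
                tv[y][x] = img_a[y][x]
                tv[y][x + w] = img_b[y][x]
                tv[y + h][x] = img_c[y][x]
                tv[y + h][x + w] = img_d[y][x]
        return tv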

In step S7, the CPU 10 executes a process of generating a terminal game image. Specifically, the CPU 10 shoots the game space using the third virtual camera set behind the second character 92 to generate the image 90e (the terminal game image). The CPU 10 next executes step S8.

In step S8, the CPU 10 executes a display process (a process of outputting a game image). Here, the television game image generated in step S6 is output to the television 2, and the terminal game image generated in step S7 is output (transmitted) to the terminal device 7. As a result, the television 2 displays an image as shown in FIG. 11, and the LCD 51 of the terminal device 7 displays an image as shown in FIG. 12. In step S8, audio data is output along with the game image to the television 2 and/or the terminal device 7, and game audio is output from the speaker 2a of the television 2 and/or the speaker 77 of the terminal device 7. The CPU 10 next executes step S9.

In step S9, the CPU 10 determines whether or not to end the game. The determination in step S9 is performed based on, for example, whether or not the game is over, whether or not the user issues an instruction to stop the game, or the like. If any of the first characters 91a-91c and the second character 92 is killed or beaten by the enemy character 93 (attacked a predetermined number of times), the game is over. If the determination result of step S9 is negative, step S2 is executed again. On the other hand, if the determination result of step S9 is positive, the CPU 10 ends the game process of FIG. 20.

As described above, in the game of the present embodiment, at least one player who operates the controller 5 and a player who operates the terminal device 7 cooperate with each other to play the game. The first character 91 operated using the controller 5 automatically moves along the predetermined path in the game space. The second character 92 operated using the terminal device 7 moves based on an input operation performed on the terminal device 7 (an input operation performed on the left analog stick 53A, and an input operation performed by changing the attitude of the terminal device 7). That is, the movement direction and movement amount of the first character 91 are automatically controlled, and the movement direction and movement amount of the second character 92 are controlled by a player's input. Thus, the first character 91 moves with a first degree of freedom (specifically, it moves along the predetermined path), and the second character 92 moves with a second degree of freedom higher than the first degree of freedom (specifically, it moves to arbitrary positions on a plane based on an operation performed on the terminal device 7). Thus, the movement of the first character 91 operated by the controller 5 is limited, and the second character 92 operated by the terminal device 7 is allowed to move freely, whereby a novel game played by a plurality of players can be provided. That is, in the game, the first player who operates the first character 91 plays the game under a predetermined limit (with the first degree of freedom) while viewing the screen of the television 2, and the second player who operates the second character 92 causes the second character 92 to move freely (with the second degree of freedom) while viewing the screens of the television 2 and the terminal device 7. Since the second player can cause the second character 92 to move freely, the second player can play the game under conditions more advantageous than those for the first player. On the other hand, the first player plays the game under the predetermined limit; however, since the first character 91 moves automatically, the first player does not need to perform an operation for moving the first character 91 and can play the game by performing a simple operation (an operation of swinging the controller 5 in order to attack the enemy character 93). Therefore, for example, a player who is good at playing the game may use the terminal device 7 to operate the second character 92, and a player or players who are not good at playing the game may use the controller 5 to operate the first characters 91, whereby a plurality of players can cooperate with each other to play the game irrespective of their levels of skill. For example, when the first character 91 which is operated by a player who is not good at playing the game is likely to be attacked by the enemy character 93, a player who is good at playing the game can freely operate the second character 92 to kill or beat the enemy character 93. Since the second character 92 can attack using the arrow object 97, the second character 92 can attack an enemy character 93 which is located far away from the second character 92. Therefore, the second player can be said to have an advantage over the first player, and thus to be suited to playing the game while helping the first player. Conversely, a player who is good at playing the game may operate the first character 91, and a player who is not good at playing the game may operate the second character 92.
In some games in which a plurality of players play under the same conditions (with the same degree of freedom), each player may perform an arbitrary operation and thereby interfere with the course of the game. However, in the game of the present embodiment, the movement of the first character 91 is limited, and therefore each player is prevented from arbitrarily operating the corresponding player character and thereby interfering with the course of the game. In the present embodiment, each player can play the game in his or her role with his or her degree of freedom.

In the present embodiment, the first virtual cameras corresponding to the respective first characters 91 are set in the game space, and the second virtual camera corresponding to the second character 92 is set in the game space. Images captured by the first virtual cameras and the second virtual camera are displayed on the screen of the television 2. Moreover, the third virtual camera corresponding to the second character 92 is set in the game space, and an image captured by the third virtual camera is displayed on the LCD 51 of the terminal device 7. As a result, each first player can recognize the situations of the other first players and the second player by viewing the screen of the television 2. The second player can view the two screens, thereby playing the game based on a greater amount of information than that available to the first players.

In the present embodiment, if the enemy character 93 is present behind the second character 92 (at the rear of a line extending through the second character 92 in the left-right direction, i.e., in a direction of 90 to 180 degrees or −180 to −90 degrees), the second character 92 is more easily caused to face the enemy character 93. Specifically, if the enemy character 93 is located behind the second character 92, then even when the down direction is input using the left analog stick 53A, the second character 92 turns without (or while) retreating or moving backward. Also, if the enemy character 93 is located behind the second character 92, then when the terminal device 7 is rotated in a direction which causes the second character 92 to face the enemy character 93, the second character 92 is rotated by a larger rotation amount than that of the terminal device 7. Thus, the second character 92 is more easily caused to face the enemy character 93 based on the position relationship between the second character 92 and the enemy character 93. Therefore, even if the enemy character 93 is located behind the second character 92, the second player can easily cause the second character 92 to face the enemy character 93.
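
A sketch of this turn assist (the signed-angle convention, the turns-toward test, and the gain ASSIST_GAIN are assumptions for illustration, not values from this disclosure):

    ASSIST_GAIN = 1.5   # assumed amplification applied when the enemy is behind

    def enemy_is_behind(angle_to_enemy_deg):
        # "Behind" per the text: 90 to 180 degrees or -180 to -90 degrees,
        # measured from the second character's front direction.
        a = angle_to_enemy_deg
        return (90.0 <= a <= 180.0) or (-180.0 <= a <= -90.0)

    def adjusted_device_rotation(device_rotation_deg, angle_to_enemy_deg):
        # Amplify the rotation derived from the terminal device's attitude
        # change when it turns the second character toward an enemy behind it.
        turns_toward_enemy = device_rotation_deg * angle_to_enemy_deg > 0.0
        if enemy_is_behind(angle_to_enemy_deg) and turns_toward_enemy:
            return device_rotation_deg * ASSIST_GAIN
        return device_rotation_deg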

[7. Variations]

The above embodiment is merely illustrative. In other embodiments, for example, configurations described below may be used.

For example, in the present embodiment, three of four players operate the controller 5, and one player operates the terminal device 7. In another embodiment, a plurality of terminal devices 7 may be connected to the game device 3 (by a wireless connection), and a plurality of players may operate the respective corresponding terminal devices 7. In another embodiment, there may be four or more players. Alternatively, two players may play the game, where one player operates the controller 5, and the other player operates the terminal device 7.

In the present embodiment, a plurality of players cooperate with each other to play the game. In another embodiment, a game may be provided in which a plurality of players fight with each other. The present exemplary embodiment may be applicable to any game which is played by a plurality of players.

In the present embodiment, the second and third virtual cameras corresponding to the second character 92 are set in the game space, an image captured by the second virtual camera is displayed on the television 2, and an image captured by the third virtual camera is displayed on the terminal device 7. In another embodiment, an image captured by the second virtual camera (an image corresponding to the second character 92) may not be displayed on the television 2. In this case, an image of the game space as viewed from behind the second character 92 is not displayed on the television 2, and it is therefore difficult for the first player to recognize the situation of the second character 92; the game is then played without the first players recognizing the situation of the second character 92.

In the present embodiment, an image captured by the third virtual camera is displayed on the terminal device 7. In another embodiment, an image corresponding to a motion of the second character 92 may be displayed on the terminal device 7. Here, the image corresponding to a motion of the second character 92 may be an image which is changed based on the motion of the second character 92, and may or may not be an image of the game space captured by a virtual camera. For example, a map image showing the position and orientation in the game space of each character (91, 92, 93) may be displayed on the terminal device 7.

In the present embodiment, the first character 91 automatically moves along a predetermined path (the orientation and position of the first character 91 change automatically), and the second character 92 moves on a two-dimensional plane in the game space in a direction which is input by the second player. In another embodiment, the first character 91 may move, in response to an operation performed by the first player, based on a predetermined path. For example, when an operation portion corresponding to the up direction of the cross button 32a of the controller 5 is pressed, the first character 91 may move forward on the predetermined path. Alternatively, for example, the first character 91 may move within a predetermined range containing the predetermined path, automatically or by a player's operation. Alternatively, for example, only the orientation of the first character 91 may be controlled by a player, and the movement amount of the first character 91 may be automatically controlled. Alternatively, only the movement amount of the first character 91 may be controlled by a player, and the orientation of the first character 91 may be automatically controlled. In another embodiment, the second character 92 may move, based on the second player's operation, within a limited range (larger than the movement range of the first character 91) on a two-dimensional plane. That is, the first character 91 moves with the first degree of freedom, and the second character 92 moves with the second degree of freedom higher than the first degree of freedom. Here, the term "degree of freedom" indicates to what degree a player can freely control a character. For example, the degree of freedom may indicate to what degree a player can freely control the position of a character, or to what degree a player can freely control the orientation of a character. For example, the degree of freedom may indicate a range (position) in the game space within which a character is allowed to move in response to a player's operation, or to what degree the orientation in the game space of a character can be changed in response to a player's operation. The degree of freedom may also indicate to what degree a character can perform a motion freely in response to a player's operation. For example, since the first character 91 normally moves automatically on a predetermined path (the position and orientation are automatically determined by the game device 3), the range in the game space within which the first character 91 is allowed to move can be said to be smaller than that of the second character 92. Since the orientation of the first character 91 is automatically determined by the game device 3, the degree of freedom of the orientation of the first character 91 can be said to be lower than that of the second character 92. Therefore, the degree of freedom of the first character 91 is lower than that of the second character 92.

In the present embodiment, if the enemy character 93 is not located behind the second character 92, then when the down direction is input using the left analog stick 53A, the second character 92 is caused to retreat or move backward rather than turning. In another embodiment, in a similar case, the second character 92 may be caused to turn while moving.

In the present embodiment, if the enemy character 93 is not located behind the second character 92, then when the down direction is input using the left analog stick 53A, the second character 92 is caused to retreat or move backward. If the enemy character 93 is located behind the second character 92, then when the down direction is input using the left analog stick 53A, the second character 92 is caused to turn rather than retreating or moving backward. In another embodiment, if the enemy character 93 is located behind the second character 92, then when the down direction is input using the left analog stick 53A, the second character 92 may be caused to turn while a backward movement is limited (the movement amount is set to zero or is reduced). In another embodiment, if the enemy character 93 is located behind the second character 92, the amount of turn of the second character 92 may be set to be greater than when the enemy character 93 is not located behind the second character 92.

That is, in another embodiment, if the enemy character 93 is located behind the second character 92, then when the down direction is input using the left analog stick 53A, the stick rotation angle (control data representing a rotation direction and a rotation amount) may be adjusted so that the second character 92 is more easily caused to face the enemy character 93.
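
Under those variations, the stick handling might be sketched as follows (stick_y < 0 is assumed to mean the down direction; TURN_RATE, MOVE_SPEED, and the zeroed backward movement are illustrative choices, not values from this disclosure):

    TURN_RATE = 4.0    # assumed degrees of turn per frame at full deflection
    MOVE_SPEED = 0.5   # assumed movement amount per frame at full deflection

    def apply_left_stick(stick_x, stick_y, enemy_behind):
        # Normally the stick both turns and moves the second character; with
        # an enemy behind, a down input turns the character while the backward
        # movement amount is limited (here, set to zero).
        if enemy_behind and stick_y < 0.0:
            return {"turn": TURN_RATE, "move": 0.0}
        return {"turn": stick_x * TURN_RATE, "move": stick_y * MOVE_SPEED}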

In the present embodiment, if the enemy character 93 is located behind the second character 92, the rotation amount of the second character 92 caused by the rotation of the terminal device 7 is set to be greater than when the enemy character 93 is not located behind the second character 92. That is, if the enemy character 93 is located behind the second character 92, the gyroscopic sensor rotation angle (control data representing a rotation direction and a rotation amount) is adjusted so that the second character 92 is more easily caused to face the enemy character 93.

As described above, if the enemy character 93 is located behind the second character 92, control data representing a rotation direction and a rotation amount of the second character 92 may be adjusted so that the second character 92 is more easily caused to face the enemy character 93.

In the present embodiment, if the enemy character 93 is located behind the second character 92, the control data is adjusted so that the second character 92 is more easily caused to face the enemy character 93. In another embodiment, if the enemy character 93 is located behind the second character 92, the control data may be adjusted so that the second character 92 is more easily caused to face in a direction opposite to the enemy character 93. For example, the control data may be adjusted so that the second character 92 is more easily caused to escape from a strong enemy. In another embodiment, if the enemy character 93 is located in front of the second character 92, the control data may be adjusted so that the second character 92 is more easily caused to face in a direction opposite to the enemy character 93.

That is, if there is a predetermined position relationship between a player object and another predetermined object, control data representing a rotation direction and a rotation amount of the player object may be adjusted so that the player object is easily caused to face in a predetermined direction.

In another embodiment, a front direction degree indicating to what degree the enemy character 93 is located in front of the second character 92 (how close the enemy character 93 is to a position directly in front of the second character 92) may be calculated instead of the back direction degree BR, and the control data may be adjusted based on the front direction degree. In this case, the front direction degree increases with a decrease in the angle between the front direction of the second character 92 and the direction from the second character 92 to the enemy character 93. Alternatively, a right direction degree may be calculated instead of the back direction degree BR, and the control data may be adjusted based on the right direction degree. That is, a degree corresponding to an angle of the enemy character 93 with reference to a specific direction of the second character 92 is calculated, and the control data may be adjusted based on the degree. The degree indicates to what degree the enemy character 93 is located in a specific direction (e.g., the rear direction) with reference to the second character 92 (how close the enemy character 93 is to a position in the specific direction). The degree increases as the enemy character 93 comes to be located in a direction closer to the specific direction with reference to the second character 92. More specifically, the degree indicates a degree of a match between a specific direction (e.g., a rear direction, a right direction, a front direction, etc.) as viewed from the second character 92 and the direction from the position of the second character 92 to the position of the enemy character 93.
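
One natural way to compute such a degree is the cosine match between the chosen specific direction and the unit vector toward the enemy; a sketch under that assumption (this is one plausible formulation, not necessarily the disclosure's own BR computation):

    import math

    def direction_degree(char_pos, specific_dir, enemy_pos):
        # specific_dir is assumed to be a unit-length 2D vector. Returns 1.0
        # when the enemy lies exactly in specific_dir as viewed from the
        # character, falling to 0.0 at 90 degrees or beyond.
        dx, dy = enemy_pos[0] - char_pos[0], enemy_pos[1] - char_pos[1]
        d = math.hypot(dx, dy)
        if d == 0.0:
            return 0.0
        cos_angle = (specific_dir[0] * dx + specific_dir[1] * dy) / d
        return max(0.0, cos_angle)

    # Back direction degree: pass the rear direction, e.g. (-fx, -fy) for a
    # character facing (fx, fy); front or right direction degrees follow by
    # passing the corresponding direction vector instead.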

In the present embodiment, the position and orientation of the second character 92 are controlled based on an input operation performed on the left analog stick 53A of the terminal device 7. In another embodiment, the position and orientation of the second character 92 may be controlled based on an input operation performed on other buttons (an operation portion for inputting a direction, such as the cross button 54A).

In the present embodiment, an arrow object is used to attack the enemy character 93. In another embodiment, any weapon may be used, including, for example, a spherical object (a ball, etc.), a bullet, a cannonball, a spear, a boomerang, etc.

In the present embodiment, the game device 3 generates and transmits a terminal game image to the terminal device 7 via wireless communication, and the terminal device 7 displays the terminal game image. In another embodiment, the terminal device 7 may generate and display a terminal game image on a display section (the LCD 51) of the terminal device 7. In this case, the game device 3 transmits information (a position, an attitude, etc.) about a character or a virtual camera in the game space to the terminal device 7, which in turn generates a terminal game image based on the information.

In the present embodiment, the attitude of the terminal device 7 is calculated based on an angular velocity detected by a gyroscopic sensor. In another embodiment, the attitude of the terminal device 7 calculated based on the angular velocity detected by the gyroscopic sensor may be corrected based on an acceleration detected by an acceleration sensor, or the attitude of the terminal device 7 may be calculated based on the acceleration detected by the acceleration sensor. That is, the attitude of the terminal device 7 may be calculated using one or more inertial sensors (an acceleration sensor, a gyroscopic sensor). In another embodiment, the attitude of the terminal device 7 may be calculated based on an azimuth detected by a magnetic sensor (a direction indicated by the geomagnetism detected by the magnetic sensor). In another embodiment, the attitude of the terminal device 7 may be calculated using an image of the terminal device 7 captured using a camera, etc. Alternatively, the terminal device 7 may output attitude data (data representing a detection value of an inertial sensor, data representing a detection value of a magnetic sensor, image data changing depending on the attitude of the terminal device 7, etc.) based on the attitude of the terminal device 7, and based on the attitude data, the game device 3 may obtain (calculate) the attitude of the terminal device 7.
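
The gyro-plus-accelerometer variation is commonly realized as a complementary-filter-style correction; a single-axis sketch under assumed conventions (the blend weight ALPHA and the axis layout are illustrative, not from this disclosure):

    import math

    ALPHA = 0.98   # assumed blend weight favoring the integrated gyro value

    def update_pitch(pitch, gyro_rate, accel, dt):
        # Integrate the gyroscopic sensor's angular velocity, then nudge the
        # result toward the pitch implied by gravity as measured by the
        # acceleration sensor (accel = (ax, ay, az), device roughly at rest).
        gyro_pitch = pitch + gyro_rate * dt
        accel_pitch = math.atan2(accel[1], accel[2])
        return ALPHA * gyro_pitch + (1.0 - ALPHA) * accel_pitch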

In another embodiment, a part of the game process executed by the game device 3 may be executed by the terminal device 7. For example, a position, an attitude, a motion, etc. of an object in the game space operated using the terminal device 7 may be determined in the terminal device 7, and the determined information may be transmitted to the game device 3. Alternatively, the terminal device 7 may calculate its own attitude and transmit information about the attitude to the game device 3. The rest of the game process may then be executed by the game device 3 based on the received information.

In the present embodiment, the game program is executed by the game device 3. In another embodiment, the game program may be executed in a general information processing device (a personal computer, a smartphone, etc.) instead of a game-specialized device. That is, in another embodiment, a general information processing device may function as a game device by executing the game program.

The game program may be stored in a storage medium such as a magnetic disk, a non-volatile memory, etc., instead of an optical disc. The game program may also be stored in a computer-readable storage medium, such as a RAM or a magnetic disk, on a server connected to a network, and may be provided via the network. The game program may be read as source code into an information processing device, and may be compiled and executed when the program is run.

In the above embodiment, the process of the flowchart is performed by the CPU 10 of the game device 3 executing the game program. In another embodiment, the whole or a part of the process may be performed using a dedicated circuit included in the game device 3 or a general-purpose processor. At least one processor may operate as a "programmed logic circuit" for executing the process.

In another embodiment, in a game system having a plurality of information processing devices which can communicate with each other, the plurality of information processing devices may share the load of the game process executed by the game device 3. For example, the game system may include a plurality of information processing devices connected to a network such as the Internet. In this case, for example, a player performs a game operation on an operation device or a portable display device connectable to the network (e.g., the terminal device 7 of the above embodiment, or an operation device having a display device, such as a tablet computer, a smartphone, etc.). Operation information corresponding to the game operation is transmitted to another information processing device via the network, and the other information processing device executes a game process based on the received operation information and transmits the execution result to the operation device or the portable display device.

The systems, devices and apparatuses described herein may include one or more processors, which may be located in one place or distributed in a variety of places communicating via one or more networks. Such processor(s) can, for example, use conventional 3D graphics transformations, virtual camera and other techniques to provide appropriate images for display. By way of example and without limitation, the processors can be any of: a processor that is part of or is a separate component co-located with the stationary display and which communicates remotely (e.g., wirelessly) with the movable display; or a processor that is part of or is a separate component co-located with the movable display and communicates remotely (e.g., wirelessly) with the stationary display or associated equipment; or a distributed processing arrangement some of which is contained within the movable display housing and some of which is co-located with the stationary display, the distributed portions communicating together via a connection such as a wireless or wired network; or a processor(s) located remotely (e.g., in the cloud) from both the stationary and movable displays and communicating with each of them via one or more network connections; or any combination or variation of the above. The processors can be implemented using one or more general-purpose processors, one or more specialized graphics processors, or combinations of these. These may be supplemented by specifically-designed ASICs (application specific integrated circuits) and/or logic circuitry. In the case of a distributed processor architecture or arrangement, appropriate data exchange and transmission protocols are used to provide low latency and maintain interactivity, as will be understood by those skilled in the art. Similarly, program instructions, data and other information for implementing the systems and methods described herein may be stored in one or more on-board and/or removable memory devices. Multiple memory devices may be part of the same device or different devices, which are co-located or remotely located with respect to each other.

While certain example systems, methods, devices and apparatuses have been described herein, it is to be understood that the appended claims are not to be limited to the systems, methods, devices and apparatuses disclosed, but on the contrary, are intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims

  1. A game system for controlling a player object provided in a virtual world, comprising: a game device including at least one processor configured to: obtain input data representing an input operation performed by a player using an input device in order to rotate the player object to face a predetermined other object; set, based on the input data, control data representing a direction and amount of a change in an orientation of the player object in the virtual world; adjust the set control data by automatically changing the set control data to increase a degree of rotation of the player object when the player object and the predetermined other object have a predetermined position relationship so that when the player object and the predetermined other object have the predetermined position relationship, use of the input device to rotate the player object to face the predetermined other object is easier for the player compared to when the player object and the predetermined other object do not have the predetermined position relationship; and change the orientation of the player object based on the adjusted control data.
  2. The game system of claim 1, wherein the at least one processor is configured to automatically adjust the set control data so that the player object is rotated to face the other object.
  3. The game system of claim 2, wherein the at least one processor is configured to adjust the set control data so that when the other object is located behind the player object, the player object is rotated to face the other object.
  4. The game system of claim 1, wherein the at least one processor is further configured to: calculate a degree of an angle between a specific direction as viewed from the player object and a direction from the player object toward the other object; and adjust the control data based on the degree.
  5. The game system of claim 4, wherein the degree increases as the other object is closer to a position directly behind the player object, and wherein the at least one processor is further configured to automatically adjust the set control data so that the player object is rotated more to face the other object as the degree increases.
  6. The game system of claim 1, wherein the input data includes attitude data based on the attitude of the input device, the at least one processor being further configured to obtain the attitude of the input device based on the attitude data, and to set the control data based on the attitude of the input device.
  7. The game system of claim 6, wherein the at least one processor is further configured to set the control data so that the orientation of the player object is changed by a first change amount corresponding to a change amount of the attitude of the input device, and to automatically adjust the set control data so that when the player object and the other object have the predetermined position relationship, the orientation of the player object is changed by a second change amount larger than the first change amount.
  8. The game system of claim 7, wherein the at least one processor is further configured to set the control data so that the orientation of the player object is changed toward a direction in the virtual world corresponding to a direction in which the attitude of the input device has been changed, by the first change amount corresponding to the change amount of the attitude of the input device, and to automatically adjust the control data so that when the direction in the virtual world corresponding to the direction in which the attitude of the input device has been changed is a direction in which the player object faces the other object, the orientation of the player object is changed by the second change amount larger than the first change amount.
  9. The game system of claim 1, wherein the input data includes direction data corresponding to an input operation performed on a direction input section included in the input device, and wherein the at least one processor is further configured to set the control data based on the direction data.
  10. The game system of claim 9, wherein the at least one processor is further configured to cause the player object to move in the virtual world based on the direction data.
  11. The game system of claim 10, wherein when the player object and the other object have the predetermined position relationship, then if a movement direction of the player object represented by the direction data is a direction in which the player object faces the other object, the at least one processor is configured to limit a movement of the player object, and to automatically adjust the set control data so that when the player object and the other object have the predetermined position relationship, then if the movement direction of the player object represented by the direction data is a direction in which the player object faces the other object, the player object is rotated to face the other object.
  12. The game system of claim 1, wherein the input device is a portable display device including a display, and wherein the at least one processor is further configured to set a virtual camera in the virtual world based on the orientation of the player object, and cause the display to display an image of the virtual world captured by the virtual camera.
  13. The game system of claim 1, wherein the other object is an enemy object which is allowed to move in the virtual world, and wherein the at least one processor is further configured to cause the player object to attack the enemy object based on the input data.
  14. A game processing method performed by at least one computer processor included in a game system for controlling a player object provided in a virtual world, comprising: obtaining input data representing an input operation performed by a player using an input device in order to rotate the player object to face a predetermined other object; setting, based on the input data and using the at least one computer processor, control data representing a direction and amount of a change in an orientation of the player object in the virtual world; adjusting the set control data by automatically changing the set control data to increase a degree of rotation of the player object when the player object and the predetermined other object have a predetermined position relationship so that when the player object and the predetermined other object have the predetermined position relationship, use of the input device to rotate the player object to face the predetermined other object is easier for the player compared to when the player object and the predetermined other object do not have the predetermined position relationship; and changing the orientation of the player object based on the adjusted control data.
  15. The game processing method of claim 14, wherein the set control data is automatically adjusted so that the player object is rotated to face the other object.
  16. The game processing method of claim 15, wherein the set control data is automatically adjusted so that when the other object is located behind the player object, the player object is rotated to face the other object.
  17. The game processing method of claim 14, further comprising: calculating a degree of an angle between a specific direction as viewed from the player object and a direction from the player object toward the other object, wherein the control data is adjusted based on the degree.
  18. The game processing method of claim 17, wherein the degree increases as the other object is closer to a position directly behind the player object, and the set control data is automatically adjusted so that the player object is rotated more to face the other object as the degree increases.
  19. The game processing method of claim 14, wherein the input data includes attitude data based on the attitude of the input device, the game processing method further includes obtaining the attitude of the input device based on the attitude data, and the control data is set based on the attitude of the input device.
  20. The game processing method of claim 19, wherein the control data is set so that the orientation of the player object is changed by a first change amount corresponding to a change amount of the attitude of the input device, and the control data is adjusted so that when the player object and the other object have the predetermined position relationship, the orientation of the player object is changed by a second change amount larger than the first change amount.
  21. The game processing method of claim 20, wherein the control data is set so that the orientation of the player object is changed toward a direction in the virtual world corresponding to a direction in which the attitude of the input device has been changed, by the first change amount corresponding to the change amount of the attitude of the input device, and the control data is set so that when the direction in the virtual world corresponding to the direction in which the attitude of the input device has been changed is a direction in which the player object faces the other object, the orientation of the player object is changed by the second change amount larger than the first change amount.
  22. The game processing method of claim 14, wherein the input data includes direction data corresponding to an input operation performed on a direction input section included in the input device, and the control data is set based on the direction data.
  23. The game processing method of claim 22, further comprising: causing the player object to move in the virtual world based on the direction data.
  24. The game processing method of claim 23, wherein when the player object and the other object have the predetermined position relationship, then if a movement direction of the player object represented by the direction data is a direction in which the player object faces the other object, a movement of the player object is limited, and the control data is adjusted so that when the player object and the other object have the predetermined position relationship, then if the movement direction of the player object represented by the direction data is a direction in which the player object faces the other object, the player object is caused to face the other object.
  25. The game processing method of claim 14, wherein the input device is a portable display device including a display, the game processing method further includes setting a virtual camera in the virtual world based on the orientation of the player object, and causing the display to display an image of the virtual world captured by the virtual camera.
  26. The game processing method of claim 14, wherein the other object is an enemy object which is allowed to move in the virtual world, and the game processing method further includes causing the player object to attack the enemy object based on the input data.
  27. A non-transitory computer readable storage medium storing a game program which, when executed by a computer included in a game device for controlling a player object provided in a virtual world, causes the computer to perform operations comprising: obtaining input data representing an input operation performed by a player using an input device in order to rotate the player object to face a predetermined other object; setting, based on the input data, control data representing a direction and amount of a change in an orientation of the player object in the virtual world; adjusting the set control data by automatically changing the set control data to increase a degree of rotation of the player object when the player object and the predetermined other object have a predetermined position relationship so that when the player object and the predetermined other object have the predetermined position relationship, use of the input device to rotate the player object to face the predetermined other object is easier for the player compared to when the player object and the predetermined other object do not have the predetermined position relationship; and changing the orientation of the player object based on the adjusted control data.
  28. A game device for controlling a player object provided in a virtual world, comprising: a processing system, including at least one computer processor, the processing system being configured to: obtain input data representing an input operation performed by a player using an input device in order to rotate the player object to face a predetermined other object; set, based on the input data, control data representing a direction and amount of a change in an orientation of the player object in the virtual world; adjust the set control data by automatically changing the set control data to increase a degree of rotation of the player object when the player object and the predetermined other object have a predetermined position relationship so that when the player object and the predetermined other object have the predetermined position relationship, use of the input device to rotate the player object to face the predetermined other object is easier for the player compared to when the player object and the predetermined other object do not have the predetermined position relationship; and change the orientation of the player object based on the adjusted control data.
