U.S. Pat. No. 9,101,839

COMPUTER-READABLE STORAGE MEDIUM HAVING STORED THEREIN GAME PROGRAM, GAME APPARATUS, GAME SYSTEM, AND GAME PROCESSING METHOD

Assignee: NINTENDO CO., LTD.

Issue Date: February 2, 2012

Illustrative Figure

Abstract

The game apparatus displays on a television a scene in which a pirate shoots an arrow at a first timing. Next, the game apparatus determines, at a second timing which is a timing after a predetermined time period has elapsed from the first timing, whether the attitude of a terminal device is an attitude in accordance with an instruction issued by the pirate when the pirate shot the arrow at the first timing. When the attitude of the terminal device is the attitude in accordance with the instruction issued by the pirate, the game apparatus displays a scene in which the arrow has reached the screen of the terminal device.
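
(By way of illustration only, and not as part of the patented embodiment, the following Python sketch shows the two-timing check the abstract describes: an event at a first timing, then an attitude comparison at a second timing a fixed delay later. All names, the 60-frame delay, and the tolerance are hypothetical.)

    ARROW_FLIGHT_FRAMES = 60   # assumed delay between the first and second timings
    ATTITUDE_TOLERANCE = 0.2   # assumed tolerance, in radians, per attitude angle

    def attitude_matches(current, instructed, tol=ATTITUDE_TOLERANCE):
        """True when every attitude angle is within the tolerance."""
        return all(abs(c - i) <= tol for c, i in zip(current, instructed))

    def update(frame, shot_frame, instructed_attitude, terminal_attitude):
        """Per-frame check; returns what the terminal device should display."""
        if frame < shot_frame + ARROW_FLIGHT_FRAMES:
            return "arrow in flight"
        if frame == shot_frame + ARROW_FLIGHT_FRAMES:  # the second timing
            if attitude_matches(terminal_attitude, instructed_attitude):
                return "arrow reaches the terminal screen"
            return "arrow misses"
        return "idle"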

Description

DETAILED DESCRIPTION OF NON-LIMITING EXAMPLE EMBODIMENTS

[1. Overall Configuration of Game System]

Hereinafter, a game system 1 according to an exemplary embodiment will be described with reference to the drawings. FIG. 1 is an external view showing a non-limiting example of the game system 1. As shown in FIG. 1, the game system 1 includes a stationary display device (hereinafter referred to as a "television") 2 typified by, for example, a television receiver, a stationary game apparatus 3, an optical disc 4, a controller 5, a marker device 6, and a terminal device 7. In the game system 1, the game apparatus 3 executes a game process based on a game operation using the controller 5, and the television 2 and/or the terminal device 7 display a game image obtained in the game process.

Into the game apparatus 3, the optical disc 4, which is an exemplary information storage medium exchangeably used for the game apparatus 3, is detachably inserted. An information processing program (typically, a game program) to be executed by the game apparatus 3 is stored in the optical disc 4. An insertion opening for the optical disc 4 is formed on the front surface of the game apparatus 3. The game apparatus 3 loads and executes the information processing program stored in the optical disc 4 having been inserted through the insertion opening, thereby executing the game process.

The television 2 is connected to the game apparatus 3 through a connection cord. The television 2 displays a game image obtained in the game process executed by the game apparatus 3. The television 2 includes a speaker 2a (FIG. 2), and the speaker 2a outputs game sound obtained as a result of the game process. In another exemplary embodiment, the game apparatus 3 may be integrated with a stationary display device. Further, the game apparatus 3 and the television 2 may wirelessly communicate with each other.

The marker device 6 is provided in the vicinity of the screen of the television 2 (above the screen in FIG. 1). As will be described below in detail, a user (a player) is allowed to perform a game operation of moving the controller 5, and the marker device 6 is used for causing the game apparatus 3 to calculate, for example, a movement, a position, and an attitude of the controller 5. The marker device 6 includes two markers, that is, a marker 6R and a marker 6L, on both ends thereof. Specifically, the marker 6R (and the marker 6L) is implemented as at least one infrared light emitting diode (LED), and emits infrared light forward from the television 2. The marker device 6 is wire-connected (or may be wirelessly connected) to the game apparatus 3, and the game apparatus 3 is able to control whether each infrared LED of the marker device 6 is to be lit up. The marker device 6 is portable, and a user is allowed to set the marker device 6 at a desired position. In FIG. 1, an exemplary manner is shown in which the marker device 6 is set on the television 2. However, the marker device 6 may be set at any position and may face in any direction.

The controller 5 provides the game apparatus 3 with operation data based on an operation performed on the controller 5. In the exemplary embodiment described herein, the controller 5 includes a main controller 8 and a sub-controller 9, and the sub-controller 9 is detachably mounted to the main controller 8. The controller 5 and the game apparatus 3 are able to wirelessly communicate with each other. In the exemplary embodiment described herein, for example, the Bluetooth (registered trademark) technology is used for the wireless communication between the controller 5 and the game apparatus 3. In another exemplary embodiment, the controller 5 and the game apparatus 3 may be wire-connected to each other. Further, although, in FIG. 1, the number of the controllers 5 included in the game system 1 is one, the game system 1 may include a plurality of the controllers 5. Namely, the game apparatus 3 can communicate with a plurality of controllers, and multiple persons are allowed to play a game by simultaneously using a predetermined number of controllers. A specific structure of the controller 5 will be described below in detail.

The terminal device 7 approximately has such a size as to be held by a user, and the user is allowed to use the terminal device 7 by holding and moving the terminal device 7 with his/her hand, or positioning the terminal device 7 at any desired position. The terminal device 7 includes a liquid crystal display (LCD) 51 operating as display means, and input means (such as a touch panel 52 and a gyro sensor 64 as described below). The structure of the terminal device 7 will be described below in detail. The terminal device 7 and the game apparatus 3 can wirelessly communicate with each other (or wired communication may be used therebetween). The terminal device 7 receives, from the game apparatus 3, data of an image (for example, a game image) generated by the game apparatus 3, and displays the image on the LCD 51. Although, in the exemplary embodiment described herein, an LCD is used as a display device, the terminal device 7 may have any other display device such as a display device using, for example, electro luminescence (EL). Further, the terminal device 7 transmits, to the game apparatus 3, operation data based on an operation performed on the terminal device 7.

[2. Internal Structure of Game Apparatus 3]

Next, with reference to FIG. 2, a non-limiting exemplary internal structure of the game apparatus 3 will be described. FIG. 2 is a block diagram showing a non-limiting exemplary internal structure of the game apparatus 3. The game apparatus 3 includes a central processing unit (CPU) 10, a system LSI 11, an external main memory 12, a ROM/RTC 13, a disk drive 14, an AV-IC 15, and the like.

The CPU 10, serving as a game processor, executes a game program stored in the optical disc 4 to perform a game process. The CPU 10 is connected to the system LSI 11. In addition to the CPU 10, the external main memory 12, the ROM/RTC 13, the disk drive 14, and the AV-IC 15 are also connected to the system LSI 11. The system LSI 11 performs processing such as control of data transmission among respective components connected thereto, generation of an image to be displayed, and acquisition of data from an external apparatus. An internal configuration of the system LSI 11 will be described below. The external main memory 12, which is of a volatile type, stores programs, such as a game program loaded from the optical disc 4 or a flash memory 17, and various data. The external main memory 12 is used as a work area and a buffer area for the CPU 10. The ROM/RTC 13 includes a ROM (a so-called boot ROM) storing a program for starting up the game apparatus 3, and a clock circuit (real time clock: RTC) for counting time. The disk drive 14 reads, from the optical disc 4, program data, texture data, and the like, and writes the read data into an internal main memory 11e described below, or the external main memory 12.

An input/output processor (I/O processor) 11a, a graphics processor unit (GPU) 11b, a digital signal processor (DSP) 11c, a VRAM (video RAM) 11d, and the internal main memory 11e are included in the system LSI 11. These components 11a to 11e are connected to each other via an internal bus, which is not shown.

The GPU 11b, which is a part of rendering means, generates an image according to a graphics command (rendering command) from the CPU 10. The VRAM 11d stores data (such as polygon data and texture data) to be used by the GPU 11b for executing the graphics command. When an image is generated, the GPU 11b generates image data by using the data stored in the VRAM 11d. In the exemplary embodiment described herein, the game apparatus 3 generates both a game image to be displayed by the television 2 and a game image to be displayed by the terminal device 7. Hereinafter, the game image to be displayed by the television 2 may be referred to as a "television game image", and the game image to be displayed by the terminal device 7 may be referred to as a "terminal game image".

The DSP 11c functions as an audio processor, and generates sound data by using sound data and sound waveform (tone quality) data stored in the internal main memory 11e and/or the external main memory 12. In the exemplary embodiment described herein, as game sounds, both a game sound outputted from the speaker of the television 2 and a game sound outputted by a speaker of the terminal device 7 are generated, similarly to the game images. Hereinafter, the game sound outputted by the television 2 may be referred to as a "television game sound", and the game sound outputted by the terminal device 7 may be referred to as a "terminal game sound".

Data of the image and the sound to be outputted by the television 2, among the images and the sounds generated by the game apparatus 3 as described above, is read by the AV-IC 15. The AV-IC 15 outputs the read image data to the television 2 via an AV connector 16, and also outputs the read sound data to the speaker 2a included in the television 2. Thus, the image is displayed by the television 2, and the sound is outputted from the speaker 2a.

On the other hand, data of the image and the sound to be outputted by the terminal device 7, among the images and the sounds generated by the game apparatus 3, is transmitted to the terminal device 7 by the input/output processor 11a and/or the like. The transmission of the data to the terminal device 7 by the input/output processor 11a and/or the like will be described below.

The input/output processor 11a executes data reception and transmission among the components connected thereto, and data downloading from an external apparatus. The input/output processor 11a is connected to the flash memory 17, a network communication module 18, a controller communication module 19, an extension connector 20, a memory card connector 21, and a codec LSI 27. To the network communication module 18, an antenna 22 is connected. To the controller communication module 19, an antenna 23 is connected. The codec LSI 27 is connected to a terminal communication module 28, and an antenna 29 is connected to the terminal communication module 28.

The game apparatus 3 is connected to a network such as the Internet, so that the game apparatus 3 can communicate with an external information processing apparatus (for example, other game apparatuses, various servers, or various information processing apparatuses). Namely, the input/output processor 11a is connected to a network such as the Internet via the network communication module 18 and the antenna 22, to be able to communicate with the external information processing apparatus connected to the network. The input/output processor 11a accesses the flash memory 17 at regular intervals to detect the presence of data to be transmitted to the network. When the data to be transmitted is detected, the data is transmitted to the network via the network communication module 18 and the antenna 22. Further, the input/output processor 11a receives, via the network, the antenna 22, and the network communication module 18, data transmitted from the external information processing apparatus or data downloaded from a download server, and stores the received data in the flash memory 17. The CPU 10 executes the game program to read the data stored in the flash memory 17, thereby using the read data in the game program. The flash memory 17 may store not only the data transmitted and received between the game apparatus 3 and the external information processing apparatus, but also saved data (result data or intermediate step data of the game) of a game played with the game apparatus 3. Further, a game program may be stored in the flash memory 17.

Further, the game apparatus 3 is able to receive the operation data transmitted from the controller 5. Namely, the input/output processor 11a receives, via the antenna 23 and the controller communication module 19, the operation data transmitted from the controller 5, and (temporarily) stores the operation data in a buffer area of the internal main memory 11e or the external main memory 12.

Further, the game apparatus 3 is able to transmit to the terminal device 7, and receive from the terminal device 7, data of the image, the sound, and the like. When the game image (terminal game image) is transmitted to the terminal device 7, the input/output processor 11a outputs, to the codec LSI 27, data of the game image generated by the GPU 11b. The codec LSI 27 subjects, to a predetermined compression process, the image data outputted by the input/output processor 11a. The terminal communication module 28 wirelessly communicates with the terminal device 7. Therefore, the image data compressed by the codec LSI 27 is transmitted to the terminal device 7 via the antenna 29 by the terminal communication module 28. In the exemplary embodiment described herein, the image data transmitted from the game apparatus 3 to the terminal device 7 is used for a game. If transmission of an image to be displayed in the game is delayed, operability in the game is adversely affected, and therefore delay of the transmission of the image data from the game apparatus 3 to the terminal device 7 preferably occurs as little as possible. Accordingly, in the exemplary embodiment described herein, the codec LSI 27 compresses the image data by using a highly efficient compression technique in compliance with, for example, the H.264 standard. It is to be noted that other compression techniques may be used, or uncompressed image data may be transmitted when the communication speed is sufficient. Further, the terminal communication module 28 is a communication module certified by, for example, Wi-Fi, and may perform wireless communication with the terminal device 7 at a high speed by using the MIMO (multiple input multiple output) technique adopted in, for example, the IEEE 802.11n standard. Further, another communication mode may be used.
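
(As an illustration of the low-latency transmit path just described, a minimal Python sketch follows. It is not the codec LSI 27's actual firmware: zlib stands in for an H.264-class compressor, and the send parameter stands in for the terminal communication module 28; both substitutions are assumptions.)

    import zlib

    def send_terminal_image(frame_bytes, send, link_is_fast):
        """Compress a frame unless the link is fast enough for raw data."""
        if link_is_fast:
            send(frame_bytes)                    # uncompressed transmission
        else:
            send(zlib.compress(frame_bytes, 1))  # level 1 favors low latency

    # Example: send_terminal_image(b"\x00" * 1024, send=lambda d: None, link_is_fast=False)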

Further, the game apparatus 3 transmits, to the terminal device 7, the sound data as well as the image data. Namely, the input/output processor 11a outputs the sound data generated by the DSP 11c, through the codec LSI 27, to the terminal communication module 28. The codec LSI 27 subjects the sound data to a compression process, similarly to the image data. Although the compression mode for the sound data may be any mode, a mode in which the compression rate is high and deterioration of sound is reduced is preferably used. Further, in another exemplary embodiment, sound data which is not subjected to the compression process may be transmitted. The terminal communication module 28 transmits the compressed image data and the compressed sound data, via the antenna 29, to the terminal device 7.

Furthermore, the game apparatus 3 transmits, according to need, various control data as well as the image data and the sound data described above, to the terminal device 7. The control data represents control instructions for components included in the terminal device 7, and represents, for example, an instruction for controlling lighting of a marker section (a marker section 55 shown in FIG. 10), and an instruction for controlling imaging of a camera (a camera 56 shown in FIG. 10). The input/output processor 11a transmits the control data to the terminal device 7 according to an instruction from the CPU 10. Although the codec LSI 27 does not subject the control data to a compression process in the exemplary embodiment described herein, the compression process may be performed in another exemplary embodiment. The data transmitted from the game apparatus 3 to the terminal device 7 as described above may be encrypted according to need, or may not be encrypted.

Further, the game apparatus 3 is able to receive various data from the terminal device 7. In the exemplary embodiment described herein, the terminal device 7 transmits the operation data, the image data, and the sound data, which will be described below in detail. The data transmitted from the terminal device 7 is received by the terminal communication module 28 via the antenna 29. In the exemplary embodiment described herein, the image data and sound data transmitted from the terminal device 7 are subjected to a compression process similar to that for the image data and sound data transmitted from the game apparatus 3 to the terminal device 7. Therefore, the received image data and sound data are transferred from the terminal communication module 28 to the codec LSI 27, and the codec LSI 27 subjects the image data and sound data to a decompression process, and outputs the decompressed image data and sound data to the input/output processor 11a. On the other hand, since the operation data transmitted from the terminal device 7 is smaller in amount than the image data and the sound data, the operation data may not be subjected to the compression process. Further, encryption may be performed according to need, or may not be performed. The operation data is received by the terminal communication module 28, and is thereafter outputted via the codec LSI 27 to the input/output processor 11a. The input/output processor 11a (temporarily) stores the data received from the terminal device 7 in a buffer area of the internal main memory 11e or the external main memory 12.

Further, the game apparatus 3 is able to connect with another device and/or an external storage medium. Namely, the extension connector 20 and the memory card connector 21 are connected to the input/output processor 11a. The extension connector 20 is an interface connector, such as a USB or SCSI connector. The extension connector 20 can be connected to a medium such as an external storage medium, or to a peripheral device such as another controller, or allows communication with a network by connecting with a connector for wired communication instead of using the network communication module 18. The memory card connector 21 is a connector for connecting with an external storage medium such as a memory card. For example, the input/output processor 11a accesses the external storage medium via the extension connector 20 or the memory card connector 21, to store data in the external storage medium or read data from the external storage medium.

The game apparatus 3 has a power button 24, a reset button 25, and an ejection button 26. The power button 24 and the reset button 25 are connected to the system LSI 11. When the power button 24 is pressed so as to be ON, power is supplied to the respective components of the game apparatus 3 from an external power supply via an AC adapter, which is not shown. When the reset button 25 is pressed, the system LSI 11 restarts a boot program for the game apparatus 3. The ejection button 26 is connected to the disk drive 14. When the ejection button 26 is pressed, the optical disc 4 is ejected from the disk drive 14.

In another exemplary embodiment, some of the components included in the game apparatus 3 may be implemented as an extension device which is separate from the game apparatus 3. In this case, for example, the extension device may be connected to the game apparatus 3 via the extension connector 20. Specifically, the extension device may include components such as the codec LSI 27, the terminal communication module 28, and the antenna 29, and may be detachably connected to the extension connector 20. Thus, when the extension device is connected to a game apparatus which does not include the components described above, the game apparatus can be structured so as to be communicable with the terminal device 7.

[3. Structure of Controller 5]

Next, with reference to FIG. 3 to FIG. 7, the controller 5 will be described. As described above, the controller 5 includes the main controller 8 and the sub-controller 9. FIG. 3 and FIG. 4 are perspective views each showing a non-limiting exemplary external structure of the main controller 8. FIG. 3 is a perspective view showing a non-limiting example of the main controller 8 as viewed from the top rear side thereof. FIG. 4 is a perspective view showing a non-limiting example of the main controller 8 as viewed from the bottom front side thereof.

As shown in FIGS. 3 and 4, the main controller 8 includes a housing 31 formed by, for example, plastic molding. The housing 31 has a substantially parallelepiped shape extending in a longitudinal direction from front to rear (the Z1 axis direction shown in FIG. 3). The overall size of the housing 31 is small enough to be held by one hand of an adult or even a child. A user is allowed to perform a game operation by pressing buttons provided on the main controller 8 and moving the main controller 8 to change a position and an attitude (tilt) thereof.

The housing 31 includes a plurality of operation buttons. As shown in FIG. 3, on the top surface of the housing 31, a cross button 32a, a first button 32b, a second button 32c, an A button 32d, a minus button 32e, a home button 32f, a plus button 32g, and a power button 32h are provided. In the description herein, the top surface of the housing 31 on which the buttons 32a to 32h are provided may be referred to as a "button surface". On the other hand, on the bottom surface of the housing 31, a recessed portion is formed, as shown in FIG. 4. On a slope surface on the rear side of the recessed portion, a B button 32i is provided. The operation buttons 32a to 32i are assigned functions in accordance with an information processing program executed by the game apparatus 3 according to need. Further, the power button 32h is used for remotely powering the game apparatus 3 body on or off. The home button 32f and the power button 32h each have a top surface thereof buried in the top surface of the housing 31. Thus, a user is prevented from inadvertently pressing the home button 32f or the power button 32h.

On the rear surface of the housing 31, a connector 33 is provided. The connector 33 is used for connecting another device (such as the sub-controller 9 or another sensor unit) to the main controller 8. Further, to the right and the left of the connector 33 on the rear surface of the housing 31, engagement holes 33a are provided for preventing the other device from being easily removed.

On the rear side of the top surface of the housing 31, a plurality (four in FIG. 3) of LEDs 34a to 34d are provided. The controller 5 (the main controller 8) is assigned a controller type (number) so as to be distinguishable from the other controllers. For example, the LEDs 34a to 34d are used for informing a user of the controller type which is currently set for the controller 5 that he or she is using, or of a remaining battery power of the controller 5. Specifically, when a game operation is performed by using the controller 5, one of the plurality of LEDs 34a to 34d is lit up according to the controller type.

Further, the main controller 8 includes an imaging information calculation section 35 (FIG. 6), and has, on the front surface of the housing 31, a light incident surface 35a of the imaging information calculation section 35, as shown in FIG. 4. The light incident surface 35a is formed of a material which allows at least infrared light from the markers 6R and 6L to pass therethrough.

A sound hole 31a for outputting sound to the outside from a speaker 47 (FIG. 5) included in the main controller 8 is formed between the first button 32b and the home button 32f on the top surface of the housing 31.

Next, with reference to FIGS. 5 and 6, an internal structure of the main controller 8 will be described. FIG. 5 and FIG. 6 show a non-limiting exemplary internal structure of the main controller 8. FIG. 5 is a perspective view showing a non-limiting exemplary state where an upper casing (a part of the housing 31) of the main controller 8 is removed. FIG. 6 is a perspective view showing a non-limiting exemplary state where a lower casing (a part of the housing 31) of the main controller 8 is removed, that is, the reverse side of the substrate 30 shown in FIG. 5.

As shown in FIG. 5, the substrate 30 is fixed inside the housing 31. On a top main surface of the substrate 30, the operation buttons 32a to 32h, the LEDs 34a to 34d, an acceleration sensor 37, an antenna 45, the speaker 47, and the like are provided. These components are connected to a microcomputer 42 (see FIG. 6) via lines (not shown) formed on the substrate 30 and the like. In the exemplary embodiment described herein, the acceleration sensor 37 is positioned so as to be deviated from the center of the main controller 8 in the X1 axis direction. Thus, a movement of the main controller 8 is easily calculated when the main controller 8 is rotated about the Z1 axis. Further, the acceleration sensor 37 is positioned in front of the longitudinal (the Z1 axis direction) center of the main controller 8. Further, a wireless module 44 (FIG. 6) and the antenna 45 allow the controller 5 (the main controller 8) to act as a wireless controller.

On the other hand, as shown in FIG. 6, at the front edge of the bottom main surface of the substrate 30, the imaging information calculation section 35 is provided. The imaging information calculation section 35 includes an infrared filter 38, a lens 39, an image pickup element 40, and an image processing circuit 41, located in this order from the front of the main controller 8 on the bottom main surface of the substrate 30.

Further, on the bottom main surface of the substrate 30, the microcomputer 42 and a vibrator 46 are provided. The vibrator 46 is, for example, a vibration motor or a solenoid, and is connected to the microcomputer 42 via lines formed on the substrate 30 and the like. The main controller 8 is vibrated by actuation of the vibrator 46 according to an instruction from the microcomputer 42, and the vibration is conveyed to a user's hand holding the main controller 8. Thus, a so-called vibration-feedback game is realized. In the exemplary embodiment described herein, the vibrator 46 is positioned slightly in front of the longitudinal center of the housing 31. Namely, the vibrator 46 is positioned near an end portion of the main controller 8 so as to be deviated from the longitudinal center thereof, and therefore a vibration of the entirety of the main controller 8 is enhanced by the vibration of the vibrator 46. Further, the connector 33 is mounted to the rear edge of the bottom main surface of the substrate 30. In addition to the components shown in FIG. 5 and FIG. 6, the main controller 8 includes a quartz oscillator for generating a reference clock for the microcomputer 42, an amplifier for outputting a sound signal to the speaker 47, and the like.

FIG. 7 is a perspective view showing a non-limiting exemplary external structure of the sub-controller 9. The sub-controller 9 includes a housing 80 formed by, for example, plastic molding. The overall size of the housing 80 is small enough to be held by one hand of an adult or even a child, similarly to the main controller 8. A player is allowed to perform a game operation also with the sub-controller 9 by operating buttons and a stick, and by changing a position and an attitude of the controller itself.

As shown in FIG. 7, an analog joystick 81 is provided on the front edge side (on the Z2-axis positive direction side) of the top surface (on the Y2-axis negative direction side) of the housing 80. Further, a front edge surface, which is not shown, is formed on the front edge of the housing 80 so as to be slightly sloped backward. On the front edge surface, a C button and a Z button are provided so as to be aligned in the upward/downward direction (in the Y2-axis direction shown in FIG. 7). The analog joystick 81 and the respective buttons (the C button and the Z button) are assigned functions in accordance with a game program executed by the game apparatus 3 according to need. The analog joystick 81 and the respective buttons may be collectively referred to as an "operation section 82" (see FIG. 8).

The sub-controller 9 includes an acceleration sensor (an acceleration sensor 83 shown in FIG. 8) inside the housing 80, although it is not shown in FIG. 7. In the exemplary embodiment described herein, the acceleration sensor 83 is implemented as the same acceleration sensor as the acceleration sensor 37 of the main controller 8. However, the acceleration sensor 83 may not be implemented as the same acceleration sensor as the acceleration sensor 37. For example, the acceleration sensor 83 may be an acceleration sensor operable to detect an acceleration along one predetermined axis or two predetermined axes.

Further, as shown in FIG. 7, one end of a cable is connected to the rear end of the housing 80. The other end of the cable is connected to a connector (a connector 84 shown in FIG. 8), although it is not shown in FIG. 7. The connector is able to connect with the connector 33 of the main controller 8. Namely, the main controller 8 and the sub-controller 9 are connected to each other by connecting the connector 33 and the connector 84.

It is to be noted that the shape of each of the main controller 8 and the sub-controller 9, the shapes of the operation buttons, the numbers and setting positions of the acceleration sensors and the vibrators, and the like, which are described above with reference to FIG. 3 to FIG. 7, are merely examples.

Other shapes, numbers, and setting positions may be used. Further, in the exemplary embodiment described herein, the imaging direction of the imaging means of the main controller 8 is the Z1 axis positive direction. However, the imaging direction may be any direction. Namely, the position of the imaging information calculation section 35 of the controller 5 (the light incident surface 35a of the imaging information calculation section 35) may not be on the front surface of the housing 31. The imaging information calculation section 35 may be provided on any other surface on which light from the outside of the housing 31 can be incident.

FIG. 8 is a block diagram showing a non-limiting exemplary structure of the controller 5. As shown in FIG. 8, the main controller 8 includes the operation section 32 (the operation buttons 32a to 32i), the imaging information calculation section 35, a communication section 36, the acceleration sensor 37, and a gyro sensor 48. Further, the sub-controller 9 includes the operation section 82 and the acceleration sensor 83. The controller 5 transmits data representing contents of an operation performed on the controller 5, as operation data, to the game apparatus 3. In the following description, the operation data transmitted by the controller 5 may be referred to as "controller operation data", and the operation data transmitted by the terminal device 7 may be referred to as "terminal operation data".

The operation section 32 includes the operation buttons 32a to 32i described above, and outputs, to the microcomputer 42 of the communication section 36, operation button data representing an input state of each of the operation buttons 32a to 32i (whether or not each of the operation buttons 32a to 32i has been pressed).

The imaging information calculation section 35 is a system for analyzing data of an image taken by the imaging means, identifying an area thereof having a high brightness, and calculating the position of the center of gravity, the size, and the like of the area. The imaging information calculation section 35 has, for example, a maximum sampling period of about 200 frames/sec., and therefore can trace and analyze even a relatively fast motion of the controller 5.

The imaging information calculation section 35 includes the infrared filter 38, the lens 39, the image pickup element 40, and the image processing circuit 41. The infrared filter 38 allows only infrared light, among light incident on the front surface of the controller 5, to pass therethrough. The lens 39 collects the infrared light which has passed through the infrared filter 38, and outputs the infrared light to the image pickup element 40. The image pickup element 40 is a solid-state image pickup device such as, for example, a CMOS sensor or a CCD sensor. The image pickup element 40 receives the infrared light collected by the lens 39, and outputs an image signal. The marker section 55 of the terminal device 7 and the marker device 6, which are the imaging subjects whose images are taken, are formed of markers for outputting infrared light. Accordingly, when the infrared filter 38 is provided, the image pickup element 40 receives only the infrared light which has passed through the infrared filter 38, and generates image data, so that images of the imaging subjects (the marker section 55 and/or the marker device 6) can be accurately taken. Hereinafter, the image taken by the image pickup element 40 is referred to as a taken image. The image data generated by the image pickup element 40 is processed by the image processing circuit 41. The image processing circuit 41 calculates a position of the imaging subject in the taken image. The image processing circuit 41 outputs data of a coordinate representing the calculated position to the microcomputer 42 of the communication section 36. The data representing the coordinate is transmitted as the operation data to the game apparatus 3 by the microcomputer 42. Hereinafter, the coordinate is referred to as a "marker coordinate". The marker coordinate takes various values so as to correspond to an attitude (tilt angle) and/or a position of the controller 5. Therefore, the game apparatus 3 is able to calculate the attitude and/or the position of the controller 5 by using the marker coordinate.
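
(The following Python sketch illustrates, under assumed inputs, the kind of computation the image processing circuit 41 is described as performing: finding the center of gravity of the high-brightness infrared pixels in the taken image. The threshold value and the image format are assumptions, not taken from the patent.)

    def marker_coordinate(image, threshold=200):
        """Centroid (x, y) of pixels at or above the threshold, or None.
        The image is a list of rows of grayscale brightness values."""
        x_sum = y_sum = count = 0
        for y, row in enumerate(image):
            for x, brightness in enumerate(row):
                if brightness >= threshold:
                    x_sum += x
                    y_sum += y
                    count += 1
        if count == 0:
            return None  # no marker visible in the taken image
        return (x_sum / count, y_sum / count)

    # Example: one bright pixel at column 2, row 1 of a 3x3 image.
    print(marker_coordinate([[0, 0, 0], [0, 0, 255], [0, 0, 0]]))  # (2.0, 1.0)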

In another exemplary embodiment, the controller 5 may not include the image processing circuit 41, and the taken image itself may be transmitted from the controller 5 to the game apparatus 3. In this case, the game apparatus 3 includes a circuit or a program having a function equivalent to the function of the image processing circuit 41, thereby calculating the marker coordinate.

The acceleration sensor 37 detects an acceleration (including the gravitational acceleration) of the controller 5, that is, a force (including the gravitational force) applied to the controller 5. The acceleration sensor 37 detects a value of an acceleration (linear acceleration) in the straight line direction along the sensing axis direction, among accelerations applied to a detection section of the acceleration sensor 37. For example, a multi-axis acceleration sensor having two or more axes detects accelerations of components along the respective axes as the acceleration applied to the detection section of the acceleration sensor. It is to be noted that the acceleration sensor 37 is an electrostatic capacitance type MEMS (micro electro mechanical system) acceleration sensor. However, another type of acceleration sensor may be used.

In the exemplary embodiment described herein, the acceleration sensor 37 detects linear accelerations in three axial directions, that is, the up/down direction (the Y1 axis direction shown in FIG. 3), the left/right direction (the X1 axis direction shown in FIG. 3), and the forward/backward direction (the Z1 axis direction shown in FIG. 3) of the controller 5. The acceleration sensor 37 detects an acceleration in the straight line direction along each axis. Therefore, an output of the acceleration sensor 37 represents a value of the linear acceleration for each of the three axes. Namely, the detected acceleration is represented as a three-dimensional vector in an X1Y1Z1 coordinate system (a controller coordinate system) defined relative to the controller 5.

Data (acceleration data) representing the acceleration detected by the acceleration sensor 37 is outputted to the communication section 36. The acceleration detected by the acceleration sensor 37 changes so as to correspond to an attitude (tilt angle) and a movement of the controller 5. Therefore, the game apparatus 3 can calculate an attitude and a movement of the controller 5 by using the obtained acceleration data. In the exemplary embodiment described herein, the game apparatus 3 calculates an attitude, a tilt angle, and the like of the controller 5 based on the obtained acceleration data.

When a computer such as a processor (for example, the CPU 10) of the game apparatus 3 or a processor (for example, the microcomputer 42) of the controller 5 performs processing based on an acceleration signal outputted from the acceleration sensor 37 (and an acceleration sensor 63 described below), additional information relating to the controller 5 can be inferred or calculated (determined), as one skilled in the art will readily understand from the description herein. For example, consider a case where the computer performs processing on the assumption that the controller 5 having the acceleration sensor 37 mounted thereto is in a static state (that is, on the assumption that an acceleration detected by the acceleration sensor includes only the gravitational acceleration). When the controller 5 is actually in the static state, it is possible to determine whether or not the controller 5 tilts relative to the gravity direction, and also to determine a degree of the tilt, based on the detected acceleration. Specifically, when a state where 1 G (the gravitational acceleration) is applied to a detection axis of the acceleration sensor 37 in the vertically downward direction is used as a reference, it is possible to determine whether or not the controller 5 tilts relative to the reference, based on whether 1 G is applied along that axis, and to determine a degree of the tilt of the controller 5 relative to the reference, based on the magnitude of the detected acceleration. Further, in the case of the multi-axis acceleration sensor 37, when the acceleration signal of each axis is further processed, a degree of the tilt of the controller 5 relative to the gravity direction can be determined with enhanced accuracy. In this case, the processor may calculate a tilt angle of the controller 5 based on an output from the acceleration sensor 37, or may calculate a direction in which the controller 5 tilts without calculating the tilt angle. Thus, when the acceleration sensor 37 is used in combination with the processor, a tilt angle or an attitude of the controller 5 can be determined.
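
(A minimal Python sketch of the static-state tilt determination described above follows. It assumes readings in G units and assumes gravity lies along the Y1 axis when the controller 5 is level; the axis mapping and the tolerance are assumptions, not taken from the patent.)

    import math

    def is_roughly_static(ax, ay, az, tol=0.1):
        """A magnitude near 1 G suggests only gravity is being sensed."""
        return abs(math.sqrt(ax * ax + ay * ay + az * az) - 1.0) < tol

    def static_tilt(ax, ay, az):
        """Tilt angles, in radians, relative to the gravity direction."""
        pitch = math.atan2(az, ay)  # rotation about the X1 axis
        roll = math.atan2(ax, ay)   # rotation about the Z1 axis
        return pitch, roll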

On the other hand, in a case where it is anticipated that the controller 5 will be in a dynamic state (a state in which the controller 5 is being moved), the acceleration sensor 37 detects an acceleration based on a movement of the controller 5, in addition to the gravitational acceleration. Therefore, when the gravitational acceleration component is eliminated from the detected acceleration through a predetermined process, it is possible to determine a direction in which the controller 5 moves. Conversely, when the acceleration component based on the movement of the acceleration sensor is eliminated from the detected acceleration through a predetermined process, it is possible to determine the tilt of the controller 5 relative to the gravity direction. In another exemplary embodiment, the acceleration sensor 37 may include an embedded processor or another type of dedicated processor for performing a predetermined process on the acceleration signal detected by embedded acceleration detection means before the acceleration signal is outputted to the microcomputer 42. When, for example, the acceleration sensor 37 is used for detecting a static acceleration (for example, the gravitational acceleration), the embedded or dedicated processor could convert the acceleration signal to a tilt angle (or another preferable parameter).
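
(One common form of the "predetermined process" mentioned above is sketched below in Python; the patent does not specify the filter, so this is an assumption. A low-pass filter tracks the slowly changing gravitational component, which is then subtracted to obtain the motion component.)

    ALPHA = 0.9  # filter coefficient; larger values track gravity more slowly

    def split_gravity(sample, gravity):
        """Update the gravity estimate and return (gravity, motion),
        each a 3-tuple of per-axis accelerations."""
        gravity = tuple(ALPHA * g + (1 - ALPHA) * s
                        for g, s in zip(gravity, sample))
        motion = tuple(s - g for s, g in zip(sample, gravity))
        return gravity, motion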

The gyro sensor 48 detects angular velocities around three axes (in the exemplary embodiment described herein, the X1, Y1, and Z1 axes). In the description herein, a direction of rotation around the X1 axis is referred to as a pitch direction, a direction of rotation around the Y1 axis is referred to as a yaw direction, and a direction of rotation around the Z1 axis is referred to as a roll direction. The gyro sensor 48 may be any sensor which detects angular velocities around the three axes, and the number of gyro sensors to be used, and the manner in which they are combined, may be determined as desired. For example, the gyro sensor 48 may be a three-axis gyro sensor, or may be a gyro sensor obtained by combining a two-axis gyro sensor and a one-axis gyro sensor so as to detect angular velocities around the three axes. Data representing the angular velocities detected by the gyro sensor 48 is outputted to the communication section 36. Further, the gyro sensor 48 may detect an angular velocity around one axis or two axes.
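
(As a sketch of how such angular velocity data can be used, an illustration rather than the patent's stated method: integrating the pitch, yaw, and roll rates over each frame yields an attitude estimate. Real systems typically combine this with accelerometer data to limit drift.)

    FRAME_DT = 1 / 60  # one game-frame interval, per the cycle noted below

    def integrate_gyro(attitude, rates, dt=FRAME_DT):
        """attitude and rates are (pitch, yaw, roll) tuples; returns the
        attitude advanced by one time step."""
        return tuple(a + r * dt for a, r in zip(attitude, rates))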

Further, the operation section 82 of the sub-controller 9 includes the analog joystick 81, the C button, and the Z button as described above. The operation section 82 outputs stick data (referred to as sub-stick data) representing a direction in which the analog joystick 81 is tilted and an amount of the tilt, and operation button data (referred to as sub-operation button data) representing an input state (whether or not each button is pressed) of each button, via the connector 84, to the main controller 8.

Further, the acceleration sensor 83 of the sub-controller 9, which is similar to the acceleration sensor 37 of the main controller 8, detects an acceleration (including the gravitational acceleration) of the sub-controller 9, that is, a force (including the gravitational force) applied to the sub-controller 9. The acceleration sensor 83 detects values of accelerations (linear accelerations) in the straight line directions along predetermined three axial directions, among accelerations applied to a detection section of the acceleration sensor 83. Data (referred to as sub-acceleration data) representing the detected accelerations is outputted via the connector 84 to the main controller 8.

As described above, the sub-controller 9 outputs, to the main controller 8, sub-controller data including the sub-stick data, the sub-operation button data, and the sub-acceleration data described above.

The communication section 36 of the main controller 8 includes the microcomputer 42, a memory 43, the wireless module 44, and the antenna 45. The microcomputer 42 controls the wireless module 44 for wirelessly transmitting data obtained by the microcomputer 42 to the game apparatus 3, while using the memory 43 as a storage area in order to perform processing.

The sub-controller data transmitted from the sub-controller 9 is inputted to the microcomputer 42, and temporarily stored in the memory 43. Further, data (referred to as main controller data) outputted to the microcomputer 42 from the operation section 32, the imaging information calculation section 35, the acceleration sensor 37, and the gyro sensor 48 is temporarily stored in the memory 43. The main controller data and the sub-controller data are transmitted as the operation data (controller operation data) to the game apparatus 3. Specifically, the microcomputer 42 outputs, to the wireless module 44, the operation data stored in the memory 43 at a time at which the data is to be transmitted to the controller communication module 19 of the game apparatus 3. The wireless module 44 uses, for example, the Bluetooth (registered trademark) technology to modulate the operation data onto a carrier wave of a predetermined frequency, and emits the resultant low power radio wave signal from the antenna 45. Namely, the operation data is modulated into the low power radio wave signal by the wireless module 44, and transmitted from the controller 5. The low power radio wave signal is received by the controller communication module 19 on the game apparatus 3 side. The game apparatus 3 demodulates or decodes the received low power radio wave signal to obtain the operation data. The CPU 10 of the game apparatus 3 uses the operation data received from the controller 5 to perform a game process. The wireless transmission from the communication section 36 to the controller communication module 19 is sequentially performed at predetermined time intervals. Since the game process is generally performed at a cycle of 1/60 sec. (one frame time), data is preferably transmitted at a cycle of 1/60 sec. or a shorter cycle. For example, the communication section 36 of the controller 5 outputs the operation data to the controller communication module 19 of the game apparatus 3 every 1/200 seconds.
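
(A sketch of that reporting cadence follows; the 1/60 s and 1/200 s figures come from the description above, while the loop structure and names are illustrative assumptions.)

    import time

    REPORT_INTERVAL = 1 / 200  # operation data is sent every 1/200 s
    FRAME_INTERVAL = 1 / 60    # the game process runs once per 1/60 s

    def report_loop(read_operation_data, transmit, keep_running):
        """Send fresh operation data faster than the game-frame cycle."""
        while keep_running():
            transmit(read_operation_data())
            time.sleep(REPORT_INTERVAL)  # real hardware would use a timer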

As described above, the main controller 8 is able to transmit the marker coordinate data, the acceleration data, the angular velocity data, and the operation button data as the operation data representing an operation performed on the main controller 8. The sub-controller 9 is able to transmit the acceleration data, the sub-stick data, and the operation button data as the operation data representing an operation performed on the sub-controller 9. Further, the game apparatus 3 executes the game process by using the operation data as a game input. Therefore, by using the controller 5, a user is allowed to perform a game operation of moving the controller 5 itself, in addition to a conventional game operation of pressing each operation button. For example, a user is allowed to perform operations of tilting the main controller 8 and/or the sub-controller 9 at desired attitudes, an operation of indicating a desired position on the screen by using the main controller 8, and operations of moving the main controller 8 and/or the sub-controller 9.

Further, although, in the exemplary embodiment described herein, the controller 5 does not have display means for displaying a game image, the controller 5 may have display means for displaying, for example, an image indicative of a remaining battery power.

[4. Structure of Terminal Device 7]

Next, a structure of the terminal device 7 will be described with reference to FIGS. 9 to 11. FIG. 9 shows a non-limiting exemplary external structure of the terminal device 7. Part (a) of FIG. 9 is a front view showing a non-limiting example of the terminal device 7, (b) of FIG. 9 is a top view thereof, (c) of FIG. 9 is a right side view thereof, and (d) of FIG. 9 is a bottom view thereof. FIG. 10 shows a non-limiting exemplary state in which a user holds the terminal device 7.

As shown in FIG. 9, the terminal device 7 includes a housing 50 which approximately has a horizontally long plate-like rectangular shape. The housing 50 is small enough to be held by a user. Therefore, the user is allowed to hold and move the terminal device 7, and to change the location of the terminal device 7.

The terminal device 7 includes an LCD 51 on a front surface of the housing 50. The LCD 51 is provided near the center of the front surface of the housing 50. Therefore, as shown in FIG. 10, by holding the housing 50 at portions to the right and the left of the LCD 51, a user is allowed to hold and move the terminal device 7 while viewing the screen of the LCD 51. FIG. 10 shows an exemplary case in which a user holds the terminal device 7 horizontally (with the longer sides of the terminal device 7 being oriented horizontally) by holding the housing 50 at portions to the right and the left of the LCD 51. However, the user may hold the terminal device 7 vertically (with the longer sides of the terminal device 7 being oriented vertically).

As shown in (a) of FIG. 9, the terminal device 7 includes, as operation means, a touch panel 52 on the screen of the LCD 51. In the exemplary embodiment described herein, the touch panel 52 is, but is not limited to, a resistive film type touch panel. A touch panel of any type, such as an electrostatic capacitance type touch panel, may be used. The touch panel 52 may be of a single touch type or a multiple touch type. In the exemplary embodiment described herein, the touch panel 52 has the same resolution (detection accuracy) as that of the LCD 51. However, the resolution of the touch panel 52 and the resolution of the LCD 51 need not be the same. Although an input onto the touch panel 52 is usually performed by using a touch pen, a finger of a user, in addition to the touch pen, may be used for performing an input onto the touch panel 52. The housing 50 may have an opening for accommodating the touch pen used for performing an operation on the touch panel 52. Thus, since the terminal device 7 has the touch panel 52, a user is allowed to operate the touch panel 52 while moving the terminal device 7. That is, the user is allowed to directly (by using the touch panel 52) perform an input onto the screen of the LCD 51 while moving the screen of the LCD 51.

As shown in FIG. 9, the terminal device 7 has, as operation means, two analog sticks 53A and 53B, and a plurality of buttons 54A to 54L. The analog sticks 53A and 53B are each a device for designating a direction. The analog sticks 53A and 53B are each configured such that a stick part operated by a finger of the user is slidable (or tiltable) in any direction (at any angle in the upward, downward, rightward, leftward, or diagonal direction) relative to the front surface of the housing 50. The left analog stick 53A is provided to the left of the screen of the LCD 51, and the right analog stick 53B is provided to the right of the screen of the LCD 51. Therefore, the user is allowed to perform an input for designating a direction by using an analog stick with either the left hand or the right hand. Further, as shown in FIG. 10, the analog sticks 53A and 53B are positioned so as to be operated by the user holding the right and left portions of the terminal device 7. Therefore, the user is allowed to easily operate the analog sticks 53A and 53B also when the user holds and moves the terminal device 7.

The buttons 54A to 54L are each operation means for performing a predetermined input. As described below, the buttons 54A to 54L are positioned so as to be operated by the user holding the right and left portions of the terminal device 7 (see FIG. 10). Accordingly, the user is allowed to easily operate the operation means also when the user holds and moves the terminal device 7.

As shown in (a) of FIG. 9, among the operation buttons 54A to 54L, the cross button (direction input button) 54A and the buttons 54B to 54H are provided on the front surface of the housing 50. Namely, the buttons 54A to 54H are positioned so as to be operated by a thumb of the user (see FIG. 10).

The cross button 54A is provided to the left of the LCD 51, below the left analog stick 53A. That is, the cross button 54A is positioned so as to be operated by the left hand of the user. The cross button 54A is cross-shaped, and is capable of designating an upward, a downward, a leftward, or a rightward direction. The buttons 54B to 54D are provided below the LCD 51. The three buttons 54B to 54D are positioned so as to be operated by the right and left hands of the user. The four buttons 54E to 54H are provided to the right of the LCD 51, below the right analog stick 53B. Namely, the four buttons 54E to 54H are positioned so as to be operated by the right hand of the user. Further, the four buttons 54E, 54H, 54F, and 54G are positioned upward, downward, leftward, and rightward, respectively (with respect to a center position of the four buttons). Accordingly, the terminal device 7 may cause the four buttons 54E to 54H to function as buttons which allow the user to designate an upward, a downward, a leftward, or a rightward direction.

As shown in (a), (b), and (c) of FIG. 9, a first L button 54I and a first R button 54J are provided on diagonally upper portions (an upper left portion and an upper right portion) of the housing 50. Specifically, the first L button 54I is provided on the left end of the upper side surface of the plate-shaped housing 50 so as to protrude from the upper and left side surfaces. The first R button 54J is provided on the right end of the upper side surface of the housing 50 so as to protrude from the upper and right side surfaces. In this way, the first L button 54I is positioned so as to be operated by the index finger of the left hand of the user, and the first R button 54J is positioned so as to be operated by the index finger of the right hand of the user (see FIG. 10).

As shown in (b) and (c) of FIG. 9, leg parts 59A and 59B are provided so as to protrude from a rear surface (i.e., a surface reverse of the front surface on which the LCD 51 is provided) of the plate-shaped housing 50, and a second L button 54K and a second R button 54L are provided on the leg parts 59A and 59B, respectively. Specifically, the second L button 54K is provided at a slightly upper position on the left side (the left side as viewed from the front surface side) of the rear surface of the housing 50, and the second R button 54L is provided at a slightly upper position on the right side (the right side as viewed from the front surface side) of the rear surface of the housing 50. In other words, the second L button 54K is provided at a position substantially opposite to the position of the left analog stick 53A provided on the front surface, and the second R button 54L is provided at a position substantially opposite to the position of the right analog stick 53B provided on the front surface. Thus, the second L button 54K is positioned so as to be operated by the middle finger of the left hand of the user, and the second R button 54L is positioned so as to be operated by the middle finger of the right hand of the user (see FIG. 10). Further, as shown in (c) of FIG. 9, the leg parts 59A and 59B each have a surface facing diagonally upward, and the second L button 54K and the second R button 54L are provided on the diagonally upward facing surfaces of the leg parts 59A and 59B, respectively. Thus, the second L button 54K and the second R button 54L each have a button surface facing diagonally upward. Since it is supposed that the middle fingers of the user move vertically when the user holds the terminal device 7, the upward facing button surfaces allow the user to easily press the second L button 54K and the second R button 54L. Further, the leg parts provided on the rear surface of the housing 50 allow the user to easily hold the housing 50. Moreover, the buttons provided on the leg parts allow the user to easily perform operation while holding the housing 50.

In the terminal device 7 shown in FIG. 9, the second L button 54K and the second R button 54L are provided on the rear surface of the housing 50. Therefore, if the terminal device 7 is placed with the screen of the LCD 51 (the front surface of the housing 50) facing upward, the screen of the LCD 51 may not be perfectly horizontal. Accordingly, in another exemplary embodiment, three or more leg parts may be provided on the rear surface of the housing 50. In this case, if the terminal device 7 is placed on a floor with the screen of the LCD 51 facing upward, the three or more leg parts contact the floor (or another horizontal surface). Thus, the terminal device 7 can be placed with the screen of the LCD 51 being horizontal. Such a horizontal placement of the terminal device 7 may also be achieved by providing detachable leg parts.

The respective buttons54A to54L are assigned functions, according to need, in accordance with a game program. For example, the cross button54A and the buttons54E to54H may be used for direction designation operation, selection operation, and the like, and the buttons54B to54D may be used for determination operation, cancellation operation, and the like.

The terminal device7includes a power button (not shown) for turning on/off the power of the terminal device7. The terminal device7may include a button for turning on/off screen display of the LCD51, a button for performing connection setting (pairing) for connecting with the game apparatus3, and a button for adjusting a sound volume of loudspeakers (loudspeakers67shown inFIG. 11).

As shown in (a) ofFIG. 9, the terminal device7includes a marker section (a marker section55shown inFIG. 11) having a marker55A and a marker55B, on the front surface of the housing50. The marker section55may be provided at any position. In the exemplary embodiment described herein, the marker section55is provided above the LCD51. The markers55A and55B are each implemented as one or more infrared LEDs, like the markers6L and6R of the marker device6. The marker section55is used, like the marker device6, for causing the game apparatus3to calculate, for example, a movement of the controller5(the main controller8). The game apparatus3is capable of controlling the infrared LEDs of the marker section55to be on or off.

The terminal device7includes a camera56as imaging means. The camera56includes an image pickup element (e.g., a CCD image sensor or a CMOS image sensor) having a predetermined resolution, and a lens. As shown inFIG. 9, in the exemplary embodiment described herein, the camera56is provided on the front surface of the housing50. Accordingly, the camera56is capable of taking an image of the face of the user holding the terminal device7. For example, the camera56is capable of taking an image of the user playing a game while viewing the LCD51. In another exemplary embodiment, one or more cameras may be included in the terminal device7.

The terminal device7has a microphone (a microphone69shown inFIG. 11) as sound input means. A microphone hole60is provided in the front surface of the housing50. The microphone69is embedded in the housing50at a position inside the microphone hole60. The microphone69detects sound, such as the user's voice, around the terminal device7. In another exemplary embodiment, one or more microphones may be included in the terminal device7.

The terminal device7has loudspeakers (loudspeakers67shown inFIG. 11) as sound output means. As shown in (d) ofFIG. 9, loudspeaker holes57are provided in the lower side surface of the housing50. Sound is outputted through the loudspeaker holes57from the loudspeakers67. In the exemplary embodiment described herein, the terminal device7has two loudspeakers, and the loudspeaker holes57are provided at positions corresponding to a left loudspeaker and a right loudspeaker, respectively. The number of loudspeakers included in the terminal device7may be any number; loudspeakers in addition to the two loudspeakers may be provided in the terminal device7.

The terminal device7includes an extension connector58for connecting another device to the terminal device7. In the exemplary embodiment described herein, as shown in (d) ofFIG. 9, the extension connector58is provided in the lower side surface of the housing50. Any device may be connected to the extension connector58. For example, a controller (a gun-shaped controller or the like) used for a specific game or an input device such as a keyboard may be connected to the extension connector58. If another device need not be connected, the extension connector58need not be provided.

In the terminal device7shown inFIG. 9, the shapes of the operation buttons and the housing50, the number of the respective components, and the positions in which the components are provided, are merely examples. The shapes, numbers, and positions may be different from those described above.

Next, an internal structure of the terminal device7will be described with reference toFIG. 11.FIG. 11is a block diagram showing a non-limiting exemplary internal structure of the terminal device7. As shown inFIG. 11, the terminal device7includes, in addition to the components shown inFIG. 9, a touch panel controller61, a magnetic sensor62, the acceleration sensor63, the gyro sensor64, a user interface controller (UI controller)65, a codec LSI66, the loudspeakers67, a sound IC68, the microphone69, a wireless module70, an antenna71, an infrared communication module72, a flash memory73, a power supply IC74, a battery75, and a vibrator79. These electronic components are mounted on an electronic circuit board and accommodated in the housing50.

The UI controller65is a circuit for controlling data input to various input sections and data output from various output sections. The UI controller65is connected to the touch panel controller61, the analog stick53(the analog sticks53A and53B), the operation button54(the operation buttons54A to54L), the marker section55, the magnetic sensor62, the acceleration sensor63, the gyro sensor64, and the vibrator79. Further, the UI controller65is connected to the codec LSI66and the extension connector58. The power supply IC74is connected to the UI controller65, so that power is supplied to the respective components through the UI controller65. The internal battery75is connected to the power supply IC74, so that power is supplied from the internal battery75. Further, a battery charger76or a cable, which is supplied with power from an external power supply, may be connected to the power supply IC74via a connector or the like. In this case, the terminal device7can be supplied with power and charged from the external power supply by using the battery charger76or the cable. Charging of the terminal device7may be performed by setting the terminal device7on a cradle (not shown) having a charging function.

The touch panel controller61is a circuit which is connected to the touch panel52, and controls the touch panel52. The touch panel controller61generates a predetermined form of touch position data, based on a signal from the touch panel52, and outputs the touch position data to the UI controller65. The touch position data represents a coordinate of a position (the touch position data may represent a plurality of positions when the touch panel52is a multi-touch type) at which an input is performed on an input surface of the touch panel52. The touch panel controller61reads a signal from the touch panel52and generates the touch position data every predetermined period of time. Further, various control instructions for the touch panel52are output from the UI controller65to the touch panel controller61.
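
Purely as an illustration of the touch position data described above, a shape such as the following is conceivable. This is a hedged sketch in C++; the type names and fields are assumptions made for this example and are not disclosed in the patent.

    #include <cstdint>
    #include <vector>

    // Hypothetical form of the touch position data that the touch panel
    // controller 61 could generate every predetermined period of time.
    struct TouchPosition {
        std::uint16_t x; // coordinate on the input surface of the touch panel 52
        std::uint16_t y;
    };

    struct TouchPositionData {
        // Empty when no input is performed; may hold a plurality of
        // positions when the touch panel 52 is a multi-touch type.
        std::vector<TouchPosition> positions;
    };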

The analog stick53outputs, to the UI controller65, stick data representing a direction in which the stick part operated by a finger of the user slides (or tilts), and an amount of the sliding (tilting). The operation button54outputs, to the UI controller65, operation button data representing an input state of each of the operation buttons54A to54L (whether or not each of the operation buttons is pressed).

The magnetic sensor62detects the magnitude and direction of a magnetic field to detect an orientation. Orientation data representing the detected orientation is outputted to the UI controller65. The UI controller65outputs, to the magnetic sensor62, a control instruction for the magnetic sensor62. Examples of the magnetic sensor62include sensors using an MI (Magnetic Impedance) device, a fluxgate sensor, a Hall device, a GMR (Giant Magneto Resistance) device, a TMR (Tunneling Magneto Resistance) device, or an AMR (Anisotropic Magneto Resistance) device. However, any sensor may be adopted as long as the sensor can detect an orientation. Strictly speaking, in a place where a magnetic field other than the geomagnetism is generated, the obtained orientation data does not represent a true orientation. Even in such a case, it is possible to calculate a change in the attitude of the terminal device7because the orientation data changes when the terminal device7moves.

The acceleration sensor63is provided inside the housing50. The acceleration sensor63detects the magnitudes of linear accelerations along three axial directions (XYZ axial directions shown in (a) ofFIG. 9), respectively. Specifically, the long side direction of the housing50is defined as the X-axial direction, the short side direction of the housing50is defined as the Y-axial direction, and the direction orthogonal to the front surface of the housing50is defined as the Z-axial direction, and the acceleration sensor63detects the magnitudes of the linear accelerations in the respective axial directions. Acceleration data representing the detected accelerations is outputted to the UI controller65. The UI controller65outputs, to the acceleration sensor63, a control instruction for the acceleration sensor63. In the exemplary embodiment described herein, the acceleration sensor63is, for example, an electrostatic capacitance type MEMS acceleration sensor. However, in another exemplary embodiment, another type of acceleration sensor may be used. Further, the acceleration sensor63may be an acceleration sensor for detecting the magnitude of acceleration in one axial direction or two axial directions.

The gyro sensor64is provided inside the housing50. The gyro sensor64detects angular velocities around the three axes of the above-described X-axis, Y-axis, and Z-axis, respectively. Angular velocity data representing the detected angular velocities is outputted to the UI controller65. The UI controller65outputs, to the gyro sensor64, a control instruction for the gyro sensor64. Any number and any combination of gyro sensors may be used as long as the angular velocities around the three axes are detected. The gyro sensor64may include a two-axis gyro sensor and a one-axis gyro sensor, like the gyro sensor48. Alternatively, the gyro sensor64may be a gyro sensor for detecting an angular velocity around one axis or two axes.

The vibrator79is, for example, a vibration motor or a solenoid. The vibrator79is connected to the UI controller65. The terminal device7is vibrated by actuating the vibrator79according to an instruction from the UI controller65. The vibration is conveyed to the user's hand holding the terminal device7. Thus, a so-called vibration-feedback game is realized.

The UI controller65outputs, to the codec LSI66, the operation data (the terminal operation data) including the touch position data, the stick data, the operation button data, the orientation data, the acceleration data, and the angular velocity data, which have been received from the respective components. If another device is connected to the terminal device7through the extension connector58, data representing operation on the other device may be also included in the operation data.
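
For illustration only, the aggregated operation data could be pictured as the following C++ sketch; every field name and type here is an assumption, since the patent does not specify a data layout.

    #include <cstdint>

    struct Vec3 {
        float x, y, z;
    };

    // Hypothetical layout of the operation data (terminal operation data)
    // that the UI controller 65 outputs to the codec LSI 66.
    struct TerminalOperationData {
        bool          touched;         // touch position data (a single touch, for brevity)
        std::uint16_t touchX, touchY;
        float         stickX[2];       // stick data: slide/tilt direction and amount of
        float         stickY[2];       //   the analog sticks 53A and 53B
        std::uint16_t buttonBits;      // operation button data: one bit per button 54A to 54L
        Vec3          orientation;     // orientation data from the magnetic sensor 62
        Vec3          acceleration;    // acceleration data from the acceleration sensor 63
        Vec3          angularVelocity; // angular velocity data from the gyro sensor 64
    };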

The codec LSI66is a circuit for subjecting data to be transmitted to the game apparatus3to a compression process, and subjecting data transmitted from the game apparatus3to a decompression process. The LCD51, the camera56, the sound IC68, the wireless module70, the flash memory73, and the infrared communication module72are connected to the codec LSI66. The codec LSI66includes a CPU77and an internal memory78. Although the terminal device7is configured not to perform a game process, the terminal device7needs to execute at least a program for managing the terminal device7and a program for communication. A program stored in the flash memory73is loaded into the internal memory78and executed by the CPU77when the terminal device7is powered on, thereby starting up the terminal device7. A part of the area of the internal memory78is used as a VRAM for the LCD51.

The camera56takes an image in accordance with an instruction from the game apparatus3, and outputs data of the taken image to the codec LSI66. The codec LSI66outputs, to the camera56, a control instruction for the camera56, such as an instruction to take an image. The camera56is also capable of taking a moving picture. That is, the camera56is capable of repeatedly performing image taking, and repeatedly outputting image data to the codec LSI66.

The sound IC68is connected to the loudspeakers67and the microphone69. The sound IC68is a circuit for controlling input of sound data from the microphone69to the codec LSI66and output of sound data to the loudspeakers67from the codec LSI66. Specifically, when the sound IC68receives sound data from the codec LSI66, the sound IC68performs D/A conversion on the sound data, and outputs a resultant sound signal to the loudspeakers67to cause the loudspeakers67to output sound. The microphone69detects sound (such as user's voice) propagated to the terminal device7, and outputs a sound signal representing the sound to the sound IC68. The sound IC68performs A/D conversion on the sound signal from the microphone69, and outputs a predetermined form of sound data to the codec LSI66.

The codec LSI66transmits the image data from the camera56, the sound data from the microphone69, and the operation data (the terminal operation data) from the UI controller65to the game apparatus3through the wireless module70. In the exemplary embodiment described herein, the codec LSI66subjects the image data and the sound data to a compression process similar to that performed by the codec LSI27. The compressed image data and sound data, and the terminal operation data are outputted to the wireless module70as transmission data. The antenna71is connected to the wireless module70, and the wireless module70transmits the transmission data to the game apparatus3through the antenna71. The wireless module70has the same function as the terminal communication module28of the game apparatus3. That is, the wireless module70has a function of connecting to a wireless LAN by a method based on, for example, the IEEE802.11n standard. The transmitted data may be encrypted according to need, or may not be encrypted.

As described above, the transmission data transmitted from the terminal device7to the game apparatus3includes the operation data (the terminal operation data), the image data, and the sound data. If another device is connected to the terminal device7through the extension connector58, data received from the other device may be also included in the transmission data. The infrared communication module72performs infrared communication with another device based on, for example, the IrDA standard. The codec LSI66may include, in the transmission data, data received by the infrared communication, and transmit the transmission data to the game apparatus3, according to need.

As described above, the compressed image data and sound data are transmitted from the game apparatus3to the terminal device7. These data are received by the codec LSI66through the antenna71and the wireless module70. The codec LSI66decompresses the received image data and sound data. The decompressed image data is outputted to the LCD51, and an image is displayed on the LCD51. On the other hand, the decompressed sound data is outputted to the sound IC68, and the sound IC68outputs sound through the loudspeakers67.

When control data is included in the data received from the game apparatus3, the codec LSI66and the UI controller65issue control instructions for the respective components, according to the control data. As described above, the control data represents control instructions for the respective components (in the exemplary embodiment described herein, the camera56, the touch panel controller61, the marker section55, the sensors62to64, the infrared communication module72, and the vibrator79) included in the terminal device7. In the exemplary embodiment described herein, the control instructions represented by the control data are considered to be instructions to start and halt (stop) the operations of the above-mentioned components. That is, some components which are not used for a game may be halted to reduce power consumption. In this case, data from the halted components are not included in the transmission data transmitted from the terminal device7to the game apparatus3. Since the marker section55is implemented as infrared LEDs, the marker section55is controlled by simply turning on/off the supply of power thereto.

As described above, the terminal device7includes the operation means such as the touch panel52, the analog stick53, and the operation button54. In another exemplary embodiment, however, the terminal device7may include other operation means instead of or in addition to these operation means.

The terminal device7includes the magnetic sensor62, the acceleration sensor63, and the gyro sensor64as sensors for calculating the movement (including the position and the attitude, or a change in the position and the attitude) of the terminal device7. In another exemplary embodiment, however, the terminal device7may include one or two of these sensors. In still another exemplary embodiment, the terminal device7may include other sensors instead of or in addition to these sensors.

The terminal device7includes the camera56and the microphone69. In another exemplary embodiment, however, the terminal device7may not include the camera56and the microphone69, or may include either the camera56or the microphone69.

The terminal device7includes the marker section55as a component for calculating the positional relation between the terminal device7and the main controller8(such as the position and/or the attitude of the terminal device7as viewed from the main controller8). In another exemplary embodiment, however, the terminal device7may not include the marker section55. In still another exemplary embodiment, the terminal device7may include other means as a component for calculating the above-mentioned positional relation. For example, in another exemplary embodiment, the main controller8may include a marker section, and the terminal device7may include an image pickup element. In this case, the marker device6may include an image pickup element instead of an infrared LED.

[5. Outline of Game Processing]

Next, description will be given of the outline of game processing performed in the game system1of the exemplary embodiment. The game described in the exemplary embodiment is a rhythm game in which the player operates the terminal device7to the rhythm of music. The player holds the terminal device7and plays the game by changing the attitude of the terminal device7while looking at a game image displayed on the television2(television game image) and a game image displayed on the terminal device7(terminal game image).FIG. 12Ashows a basic posture of a player when playing the game, in which posture the player holds the terminal device7in a first attitude.FIG. 12Bshows a basic posture of a player when playing the game, in which posture the player holds the terminal device7in a second attitude.FIG. 13Ashows an example of a television game image displayed on the television2when the terminal device7is held in the first attitude.FIG. 13Bshows an example of a television game image displayed on the television2when the terminal device7is held in the second attitude.FIG. 14shows the positional relationship between objects arranged in a virtual space of the game.

As shown inFIG. 12AandFIG. 12B, the player basically faces the screen of the television2(hereinafter, “to face the screen of the television2” may be referred to simply as “to face the television2”), and plays the game while looking at the television2and the LCD51of the terminal device7. Specifically, when the player holds the terminal device7in the first attitude, the player plays the game, looking at an image displayed on the screen of the television2(television game image). When the player holds the terminal device7in the second attitude, the player plays the game, looking at an image displayed on the LCD51of the terminal device7(terminal game image). As shown inFIG. 12A, the first attitude is an attitude in which the surface on which the LCD51of the terminal device7is provided is directed vertically upward (an attitude in which the Z-axis negative direction coincides with the direction of gravity) and in which the screen of the LCD51of the terminal device7is perpendicular to the direction of gravity. As shown inFIG. 12B, the second attitude is an attitude of the terminal device7in which the player holds the terminal device7with both hands, with his or her arms extended straight forward and in which the screen of the LCD51of the terminal device7is parallel to the direction of gravity.

As shown inFIG. 13AandFIG. 13B, a pirate ship90A, a pirate92, a bow93, and an arrow94are displayed on the television2. When the player holds the terminal device7in the first attitude, the pirate92is displayed in a zoomed-in manner. On the other hand, when the player holds the terminal device7in the second attitude, the pirate92is displayed in a zoomed-out manner. In this game, as shown inFIG. 14, the viewpoint of the player (the position of a first virtual camera) is set at a ship91in the virtual space, the pirate ship90A is arranged to the front of the ship91, a pirate ship90B is arranged to the right of the ship91, and a pirate ship90C is arranged to the left of the ship91. That is, in the exemplary embodiment, a game is performed in which the ship91where the player is on board is surrounded by the pirate ship90A, the pirate ship90B, and the pirate ship90C. The image taking direction of the first virtual camera is fixed, and the gazing point of the first virtual camera is fixed to the pirate92. It should be noted that the image taking direction of the first virtual camera is not necessarily fixed, and for example, the image taking direction may be changed in accordance with an operation performed onto the operation button of the terminal device7by the user or in accordance with the state of the game. When the player holds the terminal device7in the first attitude, the position of the first virtual camera is moved in the image taking direction of the first virtual camera and the zoom setting of the first virtual camera is changed (the angle of view is reduced), whereby the pirate92is displayed in a zoomed-in manner. On the other hand, when the player changes the attitude of the terminal device7to the second attitude, the position of the first virtual camera is moved in a direction opposite to the image taking direction of the first virtual camera and the first virtual camera zooms out, whereby the pirate92is displayed in a zoomed-out manner. It should be noted that in the exemplary embodiment, the pirate ship90B and the pirate ship90C are arranged to the right and the left of the ship91, respectively. However, the pirate ship90B and the pirate ship90C may be arranged at any positions. That is, the pirate ship90B and the pirate ship90C are arranged at positions at 90 degrees relative to the front of the ship91in the exemplary embodiment. However, these may be arranged at positions at, for example, 60 degrees relative to the front of the ship91.

FIG. 15Ashows an example of a terminal game image displayed on the LCD51of the terminal device7at the time when the player is facing the front of the television2while holding the terminal device7in front of his or her face.FIG. 15Bshows an example of a terminal game image displayed on the LCD51of the terminal device7at the time when the player is facing to the right relative to the television2while holding the terminal device7in front of his or her face.

As shown inFIG. 15A, when the player faces the front of the television2, the pirate ship90A and a part of the ship91are displayed on the LCD51, but the pirate92, the bow93, and the arrow94are not displayed on the LCD51. A terminal game image is an image that is obtained by a second virtual camera set on the ship91taking an image of the virtual space. The position and the angle of view of the second virtual camera are set to be substantially the same as those of the first virtual camera at the time when the pirate92is displayed in a zoomed-out manner. Therefore, the imaging range of the second virtual camera is substantially the same as that of the first virtual camera at that time. When the player faces the front of the television2while holding the terminal device7in front of his or her face as shown inFIG. 12B, the surface opposite to the surface on which the screen of the LCD51of the terminal device7is provided faces the television2. In such an attitude of the terminal device7(herein, this attitude may be referred to as “reference attitude”), the pirate ship90A which is located to the front of the ship91is displayed on the LCD51of the terminal device7. The “reference attitude” is an attitude in which the surface opposite to the surface on which the LCD51of the terminal device7is provided faces the television2and in which a straight line extended from the terminal device7in the Z-axis negative direction is substantially perpendicular to the screen of the television2. When the terminal device7is in the reference attitude, the image taking direction of the second virtual camera coincides with the image taking direction of the first virtual camera, and the positions of the first virtual camera and the second virtual camera substantially coincide with each other.

The first attitude is an attitude in which the screen of the LCD51of the terminal device7is perpendicular to the direction of gravity. Therefore, no matter where the top surface of the terminal device7is directed as long as the screen of the LCD51of the terminal device7is perpendicular to the direction of gravity, the terminal device7is assumed to be in the first attitude. That is, independent of the degree of rotation of the terminal device7about the Z-axis, when the Z-axis negative direction of the terminal device7almost coincides with the direction of gravity, the terminal device7is assumed to be in the first attitude. Similarly, the second attitude is an attitude in which the screen of the LCD51of the terminal device7is parallel to the direction of gravity. Therefore, no matter where the surface opposite to the surface on which the screen of the LCD51of the terminal device7is provided is directed as long as the screen of the LCD51of the terminal device7is parallel to the direction of gravity, the terminal device7is assumed to be in the second attitude. That is, independent of the degree of rotation of the terminal device7about the Y-axis, when the Y-axis negative direction of the terminal device7almost coincides with the direction of gravity, the terminal device7is assumed to be in the second attitude. In contrast, the reference attitude is an attitude in which the Y-axis negative direction of the terminal device7coincides with the direction of gravity and in which the surface opposite to the surface on which the screen of the LCD51of the terminal device7is provided is directed to the television2. That is, the reference attitude is an attitude of the terminal device7at the time when the player stands to the front of the television2, holding the terminal device7with both hands, with his or her arms extended straight forward.

As shown inFIG. 15B, when the player faces to the right relative to the television2(seeFIG. 20), the pirate ship90B and a part of the ship91are displayed on the LCD51of the terminal device7. When the player faces to the right relative to the television2while holding the terminal device7in front of his or her face, the left side surface of the terminal device7faces the television2. In such an attitude of the terminal device7, the pirate ship90B which is located to the right of the ship91is displayed on the LCD51of the terminal device7.

As described above, when the player faces to the right relative to the television2, the second virtual camera also faces to the right, and thus, the pirate ship90B is displayed on the LCD51. That is, the attitude of the second virtual camera is changed in accordance with the attitude of the terminal device7, and the attitude of the second virtual camera is determined so as to coincide with the attitude of the terminal device7. Specifically, the attitude of the second virtual camera is set in accordance with rotation angles about the respective axes (X, Y, and Z-axes), which are obtained by integrating, with respect to time, the angular velocities about the respective axes detected by the gyro sensor64. It should be noted that the zoom setting and the position of the second virtual camera are not changed in accordance with the attitude of the terminal device7.

On the other hand, even when the player faces to the right relative to the television2, the image displayed on the television2is not changed. That is, the attitude of the first virtual camera is not changed in accordance with the attitude of the terminal device7. However, as described above, the position and the zoom setting of the first virtual camera are changed in accordance with the attitude of the terminal device7. Specifically, the position of the first virtual camera is moved from the ship91toward the pirate ship90A in accordance with the rotation angle of the terminal device7about the X-axis, and concurrently the angle of view of the first virtual camera is reduced, whereby the pirate92is displayed in a zoomed-in manner as shown inFIG. 13A.

It should be noted that, in the exemplary embodiment, two virtual spaces are defined in the game apparatus3, and the first virtual camera is set in one virtual space and the second virtual camera is set in the other virtual space. The same objects (the pirate ship90A, background objects, etc.) are arranged in each of the two virtual spaces, and the same objects are displayed on each screen. Therefore, the television2and the terminal device7display respective images as if they had been obtained by different virtual cameras imaging the same virtual space. Meanwhile, whenFIG. 13Bis compared withFIG. 15A,FIG. 13Bshows the pirate92butFIG. 15Adoes not show the pirate92, although they are basically similar images. This is because the pirate92and the like are not arranged in the other virtual space for which the second virtual camera is provided. Since the terminal device7is a portable display device, the dimensions of the screen are limited, and thus, even if a small image of the pirate92were displayed on the LCD51of the terminal device7, the player could not easily recognize the image. Therefore, the pirate92is not displayed on the LCD51of the terminal device7. It should be noted that, in another exemplary embodiment, one virtual space may be defined, and two virtual cameras may be arranged in the one virtual space. Alternatively, two virtual spaces may be defined, and identical objects (the pirate ship90A, etc.) need not necessarily be arranged in each of the virtual spaces. The objects may have different appearances as long as they represent the same object. For example, a first object representing the pirate ship90A may be arranged in one virtual space, and a second object representing a pirate ship whose appearance is the same as or similar to that of the pirate ship90A may be arranged in the other virtual space. Then, images of these objects are taken by the virtual cameras arranged in the respective virtual spaces, and the taken images may be displayed on the television2and the terminal device7, respectively.

Next, the rhythm game of the exemplary embodiment will be described in detail.FIG. 16Ashows an example of a television game image displayed on the television2at a first timing after the game of the exemplary embodiment has been executed.FIG. 16Bshows an example of a terminal game image displayed on the LCD51of the terminal device7at a second timing after a predetermined time period has elapsed from the first timing.FIG. 17shows how the player changes the attitude of the terminal device7, from a state where the player is holding the terminal device7in a lower position (first attitude) to a state where the player is directing the terminal device7upwardly.

When the game is started, an image shown inFIG. 13AorFIG. 13Bis displayed on the television2in accordance with the attitude of the terminal device7, and predetermined music is outputted from the speaker2aof the television2. Then, as shown inFIG. 16A, at a predetermined timing (first timing), an instruction image96indicating that the arrow94has been shot is displayed on the television2. The instruction image96is an image for notifying the player that an arrow has been shot. Moreover, at the first timing, a voice indicating that the arrow94has been shot and from which direction the arrow94will come (from which position the arrow94has been shot), together with sound effects indicating that the arrow94has been shot, are outputted from the speaker2aof the television2. For example, as shown inFIG. 16A, an instruction image96showing that the pirate92has shot the arrow94is displayed at the first timing, and concurrently, the pirate92issues a voice instruction “Up” to the player (the voice of the pirate92is outputted from the speaker2aof the television2). In response to the instruction image96and the voice instruction, the player changes the attitude of the terminal device7so as to direct the terminal device7upward. It should be noted that a pirate different from the pirate92may be displayed as the one who shoots the arrow94. Alternatively, when the pirate92shoots the arrow94, letter information indicating from which direction the arrow94will come may be displayed on the television2, in addition to the instruction image96and the voice.

As shown inFIG. 17, in order to receive the arrow with the surface opposite to the surface on which the screen of the LCD51of the terminal device7is provided, the player directs that surface upwardly in the real space. Then, in the case where the attitude of the terminal device7is being maintained at the second timing after the predetermined time period has elapsed from the first timing, the image shown inFIG. 16Bis displayed on the LCD51of the terminal device7. Specifically, an image97indicating that the arrow94has been received with the terminal device7and a circular image98indicating the position at which the arrow has been received are displayed on the LCD51of the terminal device7. In the exemplary embodiment, conceptually, the terminal device7is used as a target for receiving the arrow94shot by the pirate92.

As shown inFIG. 16AandFIG. 16B, a scene in which the arrow94is shot is displayed on the television2at the first timing, and a scene in which the arrow94has reached the player is displayed on the LCD51of the terminal device7at the second timing. The scene in which the arrow94is flying toward the player during the time period from the first timing to the second timing is not displayed on either the television2or the LCD51of the terminal device7. Therefore, the player changes the attitude of the terminal device7in time with the sound outputted from the speaker2aof the television2and the loudspeakers67of the terminal device7. Specifically, the player estimates a timing at which the arrow94reaches the player based on the music outputted from the speaker2aof the television2, changes the attitude of the terminal device7before the arrow94reaches the player, and maintains the attitude of the terminal device7until that timing. Moreover, the player estimates the timing at which the arrow94reaches the player based on sound effects outputted from the speaker2aof the television2and the loudspeakers67of the terminal device7. The sound effects are such sounds as would be generated by the friction between the arrow94and the air when the arrow94flies through the space. The sound effects are started to be outputted from the television2at the first timing at which the arrow94is shot, and the sound effects outputted from the television2become quieter in accordance with the elapsed time from the first timing. Meanwhile, the sound effects from the terminal device7gradually become louder in accordance with the elapsed time from the first timing, which allows the player to imagine that the arrow94is approaching.
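
The loudness behavior described in this paragraph can be pictured with a short C++ sketch. The linear crossfade below is an assumption (the description only states that one output becomes quieter while the other becomes louder), and the function and field names are illustrative.

    #include <algorithm>

    // Volumes of the arrow's flight sound effect on the two outputs, as a
    // function of the time elapsed since the first timing t1. The linear
    // crossfade is an assumed interpolation, not taken from the patent.
    struct EffectVolumes {
        float television; // speaker 2a of the television 2
        float terminal;   // loudspeakers 67 of the terminal device 7
    };

    EffectVolumes flightEffectVolumes(float elapsedSec, float flightTimeSec) {
        float k = std::clamp(elapsedSec / flightTimeSec, 0.0f, 1.0f); // 0 at t1, 1 at t2
        return { 1.0f - k, // fades out on the television as time elapses
                 k };      // fades in on the terminal device, suggesting approach
    }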

FIG. 18shows the elapsed time from a first timing t1at which the pirate92shoots the arrow94to a second timing t2at which the arrow94reaches the player. As described above, when the game is started, predetermined music is started to be outputted from the speaker2aof the television2. At a first timing t1while the music is being outputted, the arrow94is shot. Then, at a second timing t2when a predetermined time period has elapsed from the first timing t1(for example, three beats), the arrow94reaches the player. At the second timing t2, in the case where the attitude of the terminal device7is the attitude instructed at the first timing t1, an image97indicating that the arrow94has been received with the terminal device7and a circular image98are displayed. Further, a sound indicating that the arrow94has been received with the terminal device7(sound effect) is outputted from the loudspeakers67of the terminal device7. It should be noted that at the second timing t2, in the case where the attitude of the terminal device7is not the attitude instructed at the first timing t1, the image97indicating that the arrow94has been received with the terminal device7and the circular image98are not displayed. For example, a sound, a vibration, or the like outputted from the terminal device7(vibration of the vibrator79) notifies the player that the player has failed to receive the arrow with the terminal device7. Further, when another first timing t1has come after some time has elapsed, the pirate92issues the next voice instruction.
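
One conceivable implementation of the judgment at the second timing t2 is sketched below. The description does not give a comparison rule, so expressing the instructed attitude as a target direction for the surface opposite to the LCD 51, and accepting any attitude within a tolerance angle, are both assumptions made for this sketch.

    #include <cmath>

    struct Vec3 {
        float x, y, z;
    };

    // Returns true when the outward normal of the surface opposite to the
    // LCD 51 (derived from the current attitude of the terminal device 7)
    // is within toleranceDeg of the direction instructed at the first
    // timing t1. Both vectors are assumed to be unit length.
    bool attitudeMatchesInstruction(Vec3 backSurfaceNormal, Vec3 instructedDir,
                                    float toleranceDeg) {
        float dot = backSurfaceNormal.x * instructedDir.x
                  + backSurfaceNormal.y * instructedDir.y
                  + backSurfaceNormal.z * instructedDir.z;
        // Accept when the angle between the two directions is small enough.
        return dot >= std::cos(toleranceDeg * 3.14159265f / 180.0f);
    }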

During a time period from the second timing t2to the next first timing t1, the player performs a predetermined operation onto the terminal device7so as to perform an operation of shaking off the arrow94received with the terminal device7. When the operation is performed, the arrow94is shaken off, and the image97and the circular image98are not displayed on the LCD51of the terminal device7any more. The predetermined operation may be, for example, an operation that generates an acceleration of a predetermined magnitude in the Z-axis negative direction of the terminal device7, or an operation of pressing a predetermined operation button of the terminal device7.
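
A minimal sketch of this check, covering the two example operations named above, might look as follows; the acceleration threshold is an assumed value, and the names are illustrative.

    // True when the player has performed the assumed "shake off the arrow"
    // operation: either an acceleration of at least a chosen magnitude in
    // the Z-axis negative direction of the terminal device 7, or a press
    // of a predetermined operation button.
    bool isShakeOffOperation(float accelZ /* in G, from the acceleration sensor 63 */,
                             bool predeterminedButtonPressed) {
        const float kShakeThresholdG = 1.5f; // illustrative threshold only
        return accelZ <= -kShakeThresholdG || predeterminedButtonPressed;
    }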

The first timings t1and the second timings t2are stored in advance in association with the predetermined music. The first timings and the second timings are set in synchronization with the rhythm of the predetermined music. For example, the first timings t1are stored in the game apparatus3in advance, in association with the predetermined music, such that they correspond to the timings of the 8th beat, the 16th beat, the 24th beat, and the like after the predetermined music is started to be reproduced. The second timings t2are stored in the game apparatus3in advance such that they correspond to the timing of the 3rd beat from each first timing t1. Then, at the time when the predetermined music is started to be reproduced, the first timings and the second timings are set. The reproduction speed of the predetermined music may be adjusted by the player. When the reproduction speed of the predetermined music is adjusted, the time period from the start of the reproduction of the music to the first timing and the time period from the start of the reproduction of the music to the second timing are also adjusted, in accordance with the adjusted reproduction speed. That is, for example, when the predetermined music is reproduced at a faster tempo than usual, the time period from the start of the reproduction of the music to the first timing that comes first is shortened, and the time period from the first timing to the second timing is also shortened.
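
The beat-to-time bookkeeping described in this paragraph could be implemented along the following lines. The sketch assumes a constant seconds-per-beat for the predetermined music; the beat numbers follow the example above (first timings at the 8th, 16th, and 24th beats, second timings three beats later), and all names are illustrative.

    #include <vector>

    // A first timing t1 and its second timing t2, in seconds from the
    // start of reproduction of the predetermined music.
    struct TimingPair {
        double t1;
        double t2;
    };

    // Builds the timing table from beat positions stored with the music.
    // tempoScale > 1 means the music is reproduced at a faster tempo than
    // usual, which shortens both time periods, as described above.
    std::vector<TimingPair> buildTimings(const std::vector<int>& firstTimingBeats,
                                         double secondsPerBeat, double tempoScale) {
        std::vector<TimingPair> timings;
        double spb = secondsPerBeat / tempoScale;
        for (int beat : firstTimingBeats) {
            double t1 = beat * spb;
            timings.push_back({t1, t1 + 3 * spb}); // t2: the 3rd beat from t1
        }
        return timings;
    }
    // e.g. buildTimings({8, 16, 24}, 0.5, 1.0) for music at 120 beats per minute.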

FIG. 19Ashows another example of a television game image displayed on the television2at a first timing t1after the game of the exemplary embodiment has been executed.FIG. 19Bshows another example of a terminal game image displayed on the LCD51of the terminal device7at a second timing t2after a predetermined time period has elapsed from the first timing t1.

As shown inFIG. 19A, at a first timing t1, a scene in which the pirate92is indicating right is displayed on the television2, and concurrently, the pirate92issues a voice instruction “Right” to the player (the voice of the pirate92is outputted from the speaker2aof the television2). In this case, the arrow94is not shot from the pirate ship90A, which is located to the front of the ship91, but is shot from the pirate ship90B, which is located to the right of the ship91. Therefore,FIG. 19Adoes not show the instruction image96indicating that the pirate92has shot the arrow94.

Here, in response to the instruction issued by the pirate92, the player changes his or her posture (and the attitude of the terminal device7) so as to turn to the right relative to the television2.FIG. 20is a view, from above in the real space, of the player turning to the right relative to the television2in response to an instruction from the game apparatus3. Usually, the player waits for an instruction from the game apparatus3, facing to the front of the television2. At a first timing t1, the player receives an instruction from the game apparatus3, in the form of an action of the pirate92indicating right and the voice from the speaker2aof the television2. Then, as shown inFIG. 20, the player changes his or her posture so as to turn to the right relative to the television2in accordance with the instruction (the whole body is turned to the right by 90 degrees). Then, as shown inFIG. 19B, at a second timing t2, the image97and the circular image98indicating that the arrow94has reached the terminal device7are displayed.

Instructions by the pirate92as described above are repeatedly issued, to the rhythm of the music. In the game of the exemplary embodiment, based on an instruction by an image displayed on the television2and an instruction by a voice from the television2, the player operates the terminal device7to the rhythm at the right timing. When the player has performed the operation in accordance with the instruction, the player can obtain a high score.

[6. Details of Game Processing]

Next, the game processing performed in the game system will be described in detail. First, various data used in the game processing will be described.FIG. 21shows various data used in the game processing. Specifically,FIG. 21shows main data stored in the main memory (the external main memory12or the internal main memory11e) of the game apparatus3. As shown inFIG. 21, a game program100, terminal operation data110, and processing data120are stored in the main memory of the game apparatus3. It should be noted that data used to play the game, such as image data of various objects that appear in the game and sound data used in the game, are stored in the main memory, in addition to the data shown inFIG. 21.

A part or the whole of the game program100is read from the optical disc4, at an appropriate timing after the game apparatus3is powered on, and is stored in the main memory. It should be noted that the game program100may be obtained from the flash memory17or an external device of the game apparatus3(for example, via the Internet), instead of the optical disc4. A part of the game program100(for example, a program for calculating an attitude of the terminal device7) may be stored in the game apparatus3in advance.

The terminal operation data110is data indicating an operation performed by the player onto the terminal device7, and is outputted (transmitted) from the terminal device7based on the operation performed onto the terminal device7. The terminal operation data110is transmitted from the terminal device7, obtained by the game apparatus3, and stored in the main memory. The terminal operation data110includes angular velocity data111, acceleration data112, orientation data113, and operation button data114. In addition to these types of data, the terminal operation data110further includes data indicating the position at which the touch panel52of the terminal device7is touched (touch position), and the like. When the game apparatus3obtains terminal operation data from a plurality of terminal devices7, the game apparatus3may cause the terminal operation data110transmitted from each terminal device7to be stored separately in the main memory.

The angular velocity data111is data indicating an angular velocity detected by the gyro sensor64in the terminal device. Here, the angular velocity data111indicates angular velocities about the respective axes of the fixed XYZ coordinate system in the terminal device7(seeFIG. 9). However, in another exemplary embodiment, the angular velocity data111may indicate one or more angular velocities about any one or more axes, respectively.

The acceleration data112is data indicating an acceleration detected by the acceleration sensor63of the terminal device7. Here, the acceleration data112indicates accelerations along the respective axes of the fixed XYZ coordinate system in the terminal device7(seeFIG. 9).

The orientation data113is data indicating an orientation detected by the magnetic sensor62of the terminal device7.

The operation button data114is data indicating whether the operation buttons54A to54L provided in the terminal device7are pressed.

The processing data120is data used in the game processing (FIG. 22) described below. The processing data120includes attitude data121, first virtual camera data122, second virtual camera data123, and status data124. In addition to the data shown inFIG. 21, the processing data120further includes various data used in the game processing, such as data regarding scores.

The attitude data121is data indicating an attitude of the terminal device7. The attitude of the terminal device7is expressed, for example, in a rotation matrix representing a rotation from the reference attitude to the current attitude. The attitude of the terminal device7may be expressed by three angles (rotation angles about the respective XYZ axes). The attitude data121is calculated based on the angular velocity data111included in the terminal operation data110from the terminal device7. Specifically, the attitude data121is calculated by integrating, with respect to time, the angular velocities about the X-axis, the Y-axis, and the Z-axis detected by the gyro sensor64. It should be noted that the attitude of the terminal device7need not necessarily be calculated based on the angular velocity data111indicating the angular velocity detected by the gyro sensor64, and may be calculated based on the acceleration data112indicating the acceleration detected by the acceleration sensor63and the orientation data113indicating the orientation detected by the magnetic sensor62. Further, the attitude may be calculated by correcting, based on the acceleration data and the orientation data, an attitude which has been calculated based on the angular velocity. Moreover, pieces of attitude data121indicating attitudes of the terminal device7in a predetermined number of past frames are chronologically stored in the main memory.

The first virtual camera data122includes data indicating a position and an attitude of the first virtual camera in the virtual space where the first virtual camera is set, and data indicating a zoom setting of the first virtual camera (setting of the angle of view). As described above, the first virtual camera is a virtual camera for generating a television game image. Although the image taking direction (attitude) of the first virtual camera is fixed, the position and the zoom setting of the first virtual camera are changed in accordance with the attitude of the terminal device7.

The second virtual camera data123is data indicating a position and an attitude of the second virtual camera in the virtual space where the second virtual camera is set. As described above, the second virtual camera is a virtual camera for generating a terminal game image. Although the position of the second virtual camera is fixed, the attitude of the second virtual camera is changed in accordance with the attitude of the terminal device7.

The status data124is data indicating whether the attitude of the terminal device7is the first attitude or the second attitude.

Next, the game processing performed in the game apparatus3will be described in detail with reference toFIG. 22andFIG. 23.FIG. 22is a main flowchart showing the flow of the game processing performed in the game apparatus3. When the game apparatus3is powered on, the CPU10of the game apparatus3executes the boot program stored in the boot ROM (not shown), to initialize units such as the main memory. Then, the game program stored in the optical disc4is loaded onto the main memory and the game program is started to be performed by the CPU10. The flowchart shown inFIG. 22shows processing performed after the above process is completed. It should be noted that the game apparatus3may be configured such that the game program is executed immediately after the power is turned on. Alternatively, the following configuration may be employed: a built-in program that causes a predetermined menu screen to be displayed is first executed immediately after the power is turned on, and then, for example, the user performs a selection operation on the menu screen to issue an instruction of starting the game, whereby the game program is executed.

It should be noted that the processes of the steps in the flowcharts ofFIG. 22andFIG. 23are merely an example, and the sequence of the processes of the steps may be changed as long as the same result is obtained. Also, the values and the like of variables and constants are merely an example, and other values may be employed as appropriate. In the exemplary embodiment, description will be given under an assumption that all the processes of the steps in the flowcharts are performed by the CPU10. However, processes of some of the steps in the flowcharts may be performed by a processor or a dedicated circuit other than the CPU10.

First, in step S1, the CPU10performs an initial process. The initial process is a process of constructing virtual spaces, arranging objects (the ship91, the pirate ship90A, the pirate92, the arrow94, the first virtual camera, the second virtual camera, etc.) that will appear in the virtual spaces at initial positions, and setting initial values of various parameters used in the game processing.

Moreover, in step S1, an initial process for the terminal device7is performed. For example, an image for causing the player to hold the terminal device7in the reference attitude and to press a predetermined operation button of the terminal device7while maintaining the attitude is displayed on the television2. As a result of the initial process of the terminal device7, rotation angles about the respective XYZ axes in the terminal device7are set to 0. It should be noted that, in the initial process, only the rotation angle about the Z-axis (rotation angle about the axis perpendicular to the LCD51of the terminal device7) may be set to 0 and rotation angles about the X-axis and the Y-axis may be set based on the acceleration detected by the acceleration sensor63. The game apparatus3can calculate how much the terminal device7is tilted relative to the direction of gravity, based on the direction of gravity detected by the acceleration sensor63of the terminal device7. However, the game apparatus3cannot know, based only on the direction of gravity detected by the acceleration sensor63, in which direction the terminal device7is directed (how much the terminal device7is rotated about the vertically downward axis). Therefore, in step S1, the player is caused to hold the terminal device7such that a predetermined surface of the terminal device7(for example, the surface opposite to the surface on which the LCD51is provided) faces the television2, and the rotation angle of the terminal device7about the Z-axis is set to 0, thereby initializing the attitude of the terminal device7. Accordingly, using the attitude of the terminal device7at the time of the initialization as a reference attitude, the game apparatus3can calculate a change of the attitude of the terminal device7from the reference attitude, based on the angular velocity detected by the gyro sensor64.
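
The variant initialization mentioned above, which zeroes the unknown heading while taking the tilt from gravity, can be sketched with a standard shortest-arc construction. Nothing below is taken from the patent: the quaternion representation, the reference "down" vector (the Y-axis negative direction, per the axis conventions of FIG. 9), and the assumption that the device is held still are all choices made for this illustration.

    #include <cmath>

    struct Vec3 { float x, y, z; };
    struct Quat { float w, x, y, z; };

    // Shortest-arc rotation taking unit vector a to unit vector b.
    // (Degenerate when a and b are opposite; a real implementation would
    // handle that case separately.)
    Quat shortestArc(Vec3 a, Vec3 b) {
        Vec3 c { a.y * b.z - a.z * b.y,
                 a.z * b.x - a.x * b.z,
                 a.x * b.y - a.y * b.x };                   // cross product
        float w = 1.0f + a.x * b.x + a.y * b.y + a.z * b.z; // 1 + dot product
        float n = std::sqrt(w * w + c.x * c.x + c.y * c.y + c.z * c.z);
        return { w / n, c.x / n, c.y / n, c.z / n };
    }

    // Initial attitude of the terminal device 7: the tilt is taken from
    // the gravity direction measured by the acceleration sensor 63 while
    // the device is held still; the rotation about the vertical axis,
    // which gravity alone cannot reveal, is left at zero by construction.
    Quat initialAttitudeFromGravity(Vec3 measuredAccel) {
        float m = std::sqrt(measuredAccel.x * measuredAccel.x
                          + measuredAccel.y * measuredAccel.y
                          + measuredAccel.z * measuredAccel.z);
        Vec3 g { measuredAccel.x / m, measuredAccel.y / m, measuredAccel.z / m };
        Vec3 down { 0.0f, -1.0f, 0.0f }; // assumed: Y-axis negative direction
        return shortestArc(g, down);
    }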

When a predetermined operation button of the terminal device7is pressed and the initial process of the terminal device7is completed, the CPU10starts reproduction of predetermined music, and then performs the process of step S2. Thereafter, a processing loop composed of the processes of steps S2to S8is repeatedly performed once in a predetermined time period (one frame time; for example, 1/60 second).

In step S2, the CPU10obtains the terminal operation data110which has been transmitted from the terminal device7and stored in the main memory. The terminal device7repeatedly transmits the terminal operation data110to the game apparatus3. In the game apparatus3, the terminal communication module28sequentially receives terminal operation data, and the I/O processor11asequentially stores the received terminal operation data in the main memory. It is preferable that the interval of transmission and reception between the terminal device7and the game apparatus3be shorter than the game processing period; for example, the interval is 1/200 sec. In step S2, the CPU10reads the latest terminal operation data110from the main memory. After step S2, the process of step S3is performed.

In step S3, the CPU10performs a game control process. The game control process is a process of advancing the game in accordance with a game operation performed by the player. Hereinafter, with reference toFIG. 23, the game control process will be described in detail.

FIG. 23is a flowchart showing in detail the flow of the game control process (step S3) shown inFIG. 22.

In step S11, the CPU10calculates an attitude of the terminal device7based on the angular velocity data111. Specifically, the CPU10calculates the attitude of the terminal device7, based on the angular velocity data111obtained in step S2and the attitude data121stored in the main memory. More specifically, the CPU10calculates rotation angles about the respective axes (X-axis, Y-axis, and Z-axis) obtained by multiplying, by one frame time, angular velocities about the respective axes indicated by the angular velocity data111obtained in step S2. The rotation angles about the respective axes calculated in this manner are rotation angles of the terminal device7about the respective axes (rotation angles in one frame time) during a time period from the time when an immediately preceding processing loop was performed to the time when the current processing loop is performed. Next, the CPU10adds the calculated rotation angles about the respective axes (rotation angles in one frame time) to the rotation angles of the terminal device7about the respective axes indicated by the attitude data121, and thereby calculates the latest rotation angles of the terminal device7about the respective axes (the latest attitude of the terminal device7). Further, the calculated attitude may be corrected based on the acceleration. Specifically, when the amount of motion of the terminal device7is little, the direction of the acceleration detected by the acceleration sensor63can be considered as the direction of gravity. Therefore, when the amount of motion of the terminal device7is little, the attitude may be corrected such that the direction of gravity calculated based on the attitude calculated based on the angular velocity approximates to the direction of the acceleration detected by the acceleration sensor63. Moreover, the calculated attitude may further be corrected based on the orientation data113. Specifically, how much the terminal device7is rotated about the axis in the vertically downward direction can be determined based on the orientation detected by the magnetic sensor62at the time when the initial process was performed in step S1and based on the orientation currently detected by the magnetic sensor62. Thus, the attitude of the terminal device7may be corrected based on the orientation data113. Then, the CPU10stores the calculated, latest attitude of the terminal device7as the attitude data121, in the main memory. The latest attitude of the terminal device7calculated in this manner indicates, using the attitude at the time of the initialization process (the time when the initialization was performed in step S1) as a reference attitude, rotation angles of the terminal device7about the respective axes from the reference attitude. Specifically, the attitude data121indicating the attitude of the terminal device7is data representing a rotation matrix. After the process in step S11, the CPU10performs the process of step S12.
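
A compact sketch of step S11 under simplifying assumptions: rotation angles are accumulated per axis rather than in a rotation matrix, and the acceleration-based correction is written as a complementary-filter blend. The blend factor, the stillness test, and the axis convention in the tilt estimate are all assumptions made for this illustration.

    #include <cmath>

    // Attitude as rotation angles (radians) about the X, Y, and Z axes
    // from the reference attitude; the patent stores a rotation matrix,
    // and plain per-axis accumulation is used here only for brevity.
    struct Attitude {
        float rotX, rotY, rotZ;
    };

    void updateAttitude(Attitude& a,
                        float wx, float wy, float wz, // rad/s from the gyro sensor 64
                        float ax, float ay, float az, // m/s^2 from the acceleration sensor 63
                        float frameTime /* e.g. 1.0f / 60.0f */) {
        // Rotation in one frame time = angular velocity * frame time,
        // added to the rotation angles of the immediately preceding frame.
        a.rotX += wx * frameTime;
        a.rotY += wy * frameTime;
        a.rotZ += wz * frameTime;

        // When the amount of motion is little, the detected acceleration
        // can be considered as the direction of gravity, so nudge the
        // tilt about the X-axis toward the gravity-based estimate
        // (assumed convention: Y-axis negative direction is down in the
        // reference attitude).
        float mag = std::sqrt(ax * ax + ay * ay + az * az);
        if (std::fabs(mag - 9.8f) < 0.5f) {          // nearly still
            float tiltX = std::atan2(az, -ay);       // tilt implied by gravity
            a.rotX = 0.98f * a.rotX + 0.02f * tiltX; // assumed blend factor
        }
    }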

In step S12, the CPU 10 sets a position and an angle of view of the first virtual camera in accordance with the attitude of the terminal device 7. Specifically, the CPU 10 determines whether the attitude of the terminal device 7 calculated in step S11 is the first attitude or the second attitude, and sets the first virtual camera based on the determination result. More specifically, the CPU 10 first determines whether the rotation angle of the terminal device 7 about the X-axis is greater than or equal to a predetermined threshold value (for example, 45 degrees).

FIG. 24 is a side view of the terminal device 7 rotated a predetermined angle about the X-axis. As shown in FIG. 24, when the rotation angle of the terminal device 7 about the X-axis is greater than or equal to the predetermined threshold value, the CPU 10 determines that the attitude of the terminal device 7 is the second attitude. On the other hand, when the rotation angle of the terminal device 7 about the X-axis is less than the predetermined threshold value, the CPU 10 determines that the attitude of the terminal device 7 is the first attitude. The CPU 10 stores the determination result in the main memory as the status data 124. It should be noted that the CPU 10 changes the predetermined threshold value depending on whether the current attitude of the terminal device 7 is the first attitude or the second attitude. For example, when the current attitude is the first attitude, the CPU 10 may use 45 degrees as the predetermined threshold value, and when the current attitude is the second attitude, the CPU 10 may use 30 degrees as the predetermined threshold value.
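The state-dependent threshold described above is a hysteresis: it takes 45 degrees to enter the second attitude but a drop below 30 degrees to leave it, which prevents the classification from flickering when the player holds the device near the boundary. A minimal sketch (the function name and degree units are assumptions):

```python
def classify_attitude(rotation_x_deg, currently_second):
    """Return True when the terminal device is in the second attitude.
    The entry threshold (45 deg) is higher than the stay threshold (30 deg),
    so small wobbles around either value do not toggle the state."""
    threshold = 30.0 if currently_second else 45.0
    return rotation_x_deg >= threshold
```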

Next, the CPU 10 sets the position and the angle of view of the first virtual camera. Specifically, when the attitude of the terminal device 7 is the first attitude, the CPU 10 sets the position of the first virtual camera to a predetermined position (a position nearer to the pirate ship 90A) between the position of the ship 91 and the position of the pirate ship 90A, and sets the angle of view of the first virtual camera to a minimum value. Accordingly, when an image of the virtual space is taken by the first virtual camera, the pirate ship 90A (the pirate 92) is displayed in a zoomed-in manner. On the other hand, when the attitude of the terminal device 7 is the second attitude, the CPU 10 sets the position of the first virtual camera to the position of the ship 91 and sets the angle of view of the first virtual camera to a maximum value. Accordingly, when an image of the virtual space is taken by the first virtual camera, the pirate ship 90A is displayed in a zoomed-out manner. It should be noted that the CPU 10 moves the first virtual camera and changes the angle of view of the first virtual camera over a predetermined time period. Therefore, a scene in which the pirate 92 is gradually zoomed in or zoomed out is displayed on the television 2. After the process of step S12 is completed, the CPU 10 performs the process of step S13 next.
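One simple way to realize the gradual zoom is to step the camera a fixed fraction toward its target each frame. This is only an illustrative sketch; the dictionary layout, the rate, and the starting values are invented for the example, not taken from the patent:

```python
def step_camera(camera, target_pos, target_fov, rate=0.1):
    """Move the first virtual camera's position and angle of view a fixed
    fraction toward the target each frame, so the zoom appears gradual."""
    camera['pos'] = tuple(p + (q - p) * rate
                          for p, q in zip(camera['pos'], target_pos))
    camera['fov'] += (target_fov - camera['fov']) * rate
    return camera

# First attitude: target a point near the pirate ship with the minimum angle
# of view; second attitude: target the ship 91 with the maximum angle of view.
camera = {'pos': (0.0, 2.0, 0.0), 'fov': 60.0}  # hypothetical starting values
camera = step_camera(camera, target_pos=(0.0, 2.0, -8.0), target_fov=30.0)
```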

In step S13, the CPU 10 sets the volume of the sound outputted from the television 2 in accordance with the attitude of the terminal device 7. Specifically, when the attitude of the terminal device 7 calculated in step S11 is the second attitude, the CPU 10 lowers the volume of the sound (the voice of the pirate 92 and the music) outputted from the television 2, compared with that in the case of the first attitude. Then, the CPU 10 performs the process of step S14.

In step S14, the CPU 10 sets the attitude of the second virtual camera in accordance with the attitude of the terminal device 7. Specifically, the CPU 10 sets the attitude of the second virtual camera so as to coincide with the attitude of the terminal device 7 calculated in step S11, and stores it in the main memory as the second virtual camera data 123. Accordingly, for example, when the surface opposite to the surface on which the LCD 51 of the terminal device 7 is provided faces the television 2 (when the player faces the television 2), the attitude of the second virtual camera is set such that the second virtual camera faces the pirate ship 90A. On the other hand, for example, when the surface on which the LCD 51 of the terminal device 7 is provided faces the television 2 (when the player faces away from the television 2), the attitude of the second virtual camera is set such that the second virtual camera faces away from the pirate ship 90A. Then, the CPU 10 performs the process of step S15.

In step S15, the CPU 10 determines whether the current time is the first timing t1. Specifically, the CPU 10 makes this determination based on the elapsed time from the start of the reproduction of the predetermined music in step S1. When the determination result is affirmative, the CPU 10 performs the process of step S16 next. On the other hand, when the determination result is negative, the CPU 10 performs the process of step S18 next.
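Because the first timings are fixed relative to the music, the check reduces to comparing a frame counter, measured from the start of the music, against a timing chart. A minimal sketch (cue_frames and its values are hypothetical):

```python
def is_first_timing(frames_since_music_start, cue_frames):
    """Step S15: the first timings are cues fixed relative to the music,
    so testing membership of the current frame count is sufficient."""
    return frames_since_music_start in cue_frames

cue_frames = {120, 360, 600}             # hypothetical cues at 2 s, 6 s, 10 s (60 fps)
print(is_first_timing(360, cue_frames))  # True
```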

In step S16, the CPU 10 starts a process of shooting the arrow 94. Specifically, the CPU 10 determines, based on a predetermined algorithm, from which direction to shoot the arrow 94 (front, up, right, left, etc.), and starts the process for shooting the arrow 94 from the determined direction. Accordingly, when an image of the pirate 92 is taken by the first virtual camera, a scene in which the pirate 92 shoots the arrow 94 (FIG. 16A) is displayed on the television 2, and a scene in which the pirate 92 indicates the direction from which the arrow 94 will come (FIG. 19A) is displayed on the television 2. Moreover, as part of the process of shooting the arrow, the CPU 10 reproduces a voice indicating from which direction the arrow 94 will come and sound effects indicating that the arrow 94 has been shot. A plurality of pieces of sound data are stored in the main memory, and the CPU 10 selects the piece of sound data corresponding to the determined direction. Thus, a voice indicating from which direction the arrow 94 will come (for example, “Right”) is outputted from the speaker 2a of the television 2. Then, the CPU 10 performs the process of step S17.

In step S17, the CPU 10 starts measuring the elapsed time from the first timing t1. After performing the process of step S17, the CPU 10 ends the game control process shown in FIG. 23.

Meanwhile, in step S18, the CPU 10 determines whether the arrow 94 is currently in flight. As shown in FIG. 18, this is a process of determining whether the current time is in the time period from the first timing t1 to the second timing t2. Specifically, the CPU 10 makes this determination based on the elapsed time measured since step S17. When the determination result is affirmative, the CPU 10 performs the process of step S19 next. On the other hand, when the determination result is negative, the CPU 10 performs the process of step S22 next.

In step S19, the CPU 10 determines whether the current time is the second timing t2. Specifically, the CPU 10 makes this determination based on the elapsed time measured since step S17. When the determination result is affirmative, the CPU 10 performs the process of step S20 next. On the other hand, when the determination result is negative, the CPU 10 performs the process of step S21 next.
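Steps S18 and S19 together form a small window test on the elapsed time: the arrow is in flight between t1 and t2, and the second timing is the exact end of the window. A sketch under the assumption that the window is a fixed number of frames:

```python
ARROW_FLIGHT_FRAMES = 90  # assumed length of the t1 to t2 window (1.5 s at 60 fps)

def arrow_in_flight(frames_since_t1):
    """Step S18: True from just after t1 up to and including t2.
    frames_since_t1 is None when no arrow has been shot yet."""
    return frames_since_t1 is not None and 0 < frames_since_t1 <= ARROW_FLIGHT_FRAMES

def is_second_timing(frames_since_t1):
    """Step S19: the second timing t2 is the exact end of the flight window;
    any earlier frame inside the window falls through to step S21 instead."""
    return frames_since_t1 == ARROW_FLIGHT_FRAMES
```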

In step S20, the CPU 10 performs a determination process based on the attitude of the terminal device 7, that is, a process in accordance with the attitude of the terminal device 7 at the second timing t2. Specifically, the CPU 10 determines whether the attitude of the terminal device 7 calculated in step S11 is an attitude that corresponds to the direction determined in step S16 (the direction instructed by the pirate 92). For example, when it is determined in step S16 that the arrow 94 is to be shot from the right (that is, the arrow 94 is to be shot at the player from the pirate ship 90B), the CPU 10 determines whether the attitude of the terminal device 7 is an attitude as shown in FIG. 20. For example, the CPU 10 determines, based on the attitude data 121, whether each coordinate value of a unit vector along the Z-axis negative direction of the terminal device 7 is within a predetermined range in accordance with the determination in step S16, and thereby determines whether the attitude of the terminal device 7 is an attitude in accordance with the instruction by the pirate 92.
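Since the attitude data 121 is a rotation matrix, this check amounts to rotating the device's negative-Z unit vector into world coordinates and testing each component against a per-direction range. An illustrative sketch (the range values are invented for the example):

```python
def device_back_vector(rotation_matrix):
    """Rotate the unit vector along the terminal device's Z-axis negative
    direction into world coordinates; rotation_matrix is the 3x3 attitude."""
    v = (0.0, 0.0, -1.0)
    return tuple(sum(rotation_matrix[i][j] * v[j] for j in range(3))
                 for i in range(3))

def matches_instruction(rotation_matrix, expected_ranges):
    """expected_ranges maps axis index -> (lo, hi), chosen according to the
    direction determined in step S16; all listed components must fall inside."""
    vec = device_back_vector(rotation_matrix)
    return all(lo <= vec[i] <= hi for i, (lo, hi) in expected_ranges.items())

# Hypothetical check for "right": the back of the device should face world +X.
right_ranges = {0: (0.7, 1.0), 1: (-0.3, 0.3)}
```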

Further, the CPU 10 determines, in accordance with the determination result, an image to be displayed on the LCD 51 of the terminal device 7 and sound to be outputted from the loudspeakers 67 of the terminal device 7. Accordingly, for example, when the determination result is affirmative (when the attitude of the terminal device 7 is the attitude corresponding to the direction determined in step S16), the processes of step S5 and step S7 described below are performed, whereby the image 97 and the circular image 98 shown in FIG. 16B are displayed on the LCD 51. Moreover, when the determination result is affirmative, the CPU 10 adds points.

It should be noted that, in step S20, the CPU 10 also determines whether the terminal device 7 is moving, based on the attitudes of the terminal device 7 calculated in a predetermined number of past frames. The attitudes of the terminal device 7 in the predetermined number of past frames are stored in the main memory, so the CPU 10 can calculate how much the attitude of the terminal device 7 has changed over those frames. When the CPU 10 has determined that the terminal device 7 is moving, even if the attitude of the terminal device 7 is the attitude corresponding to the direction determined in step S16, the CPU 10 generates an image different from that in the case where the terminal device 7 is stationary, and displays it on the terminal device 7. FIG. 25 shows an example of an image displayed on the LCD 51 of the terminal device 7 at the second timing t2 when the terminal device 7 is moving. As shown in FIG. 25, when the terminal device 7 is moving, even if the player is holding the terminal device 7 in a proper attitude (the attitude in accordance with the instruction by the pirate 92), the circular image 98 indicating the position at which the arrow has been received is displayed at a position shifted from the center of the screen. In this case, the player obtains fewer points than when the terminal device 7 is determined not to be moving. In this manner, the game processing is performed in accordance with the attitude of the terminal device 7 at the second timing t2, and the result of the game processing differs depending on whether the terminal device 7 is stationary at that timing. It should be noted that whether the terminal device 7 is moving may instead be determined based on the acceleration detected by the acceleration sensor 63.
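The movement test can be realized by keeping a short ring buffer of recent attitudes and comparing the oldest and newest entries; the history length and the threshold below are assumptions for illustration:

```python
from collections import deque

attitude_history = deque(maxlen=10)  # rotation angles (deg) of recent frames

def device_is_moving(history, angle_threshold_deg=5.0):
    """Treat the device as moving when any axis changed by more than the
    threshold between the oldest and newest stored attitudes."""
    if len(history) < 2:
        return False
    oldest, newest = history[0], history[-1]
    return any(abs(n - o) > angle_threshold_deg for o, n in zip(oldest, newest))

attitude_history.append((0.0, 10.0, 0.0))
attitude_history.append((0.0, 18.0, 0.0))
print(device_is_moving(attitude_history))  # True: the Y angle changed by 8 deg
```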

Further, depending on whether a previously received arrow 94 remains without having been shaken off, the CPU 10 determines, in step S20, the positions at which the image 97 and the circular image 98 are to be displayed. In step S22 described below, when the process of shaking off the arrow 94 that has been received with the terminal device 7 is not performed, the arrow 94 is not shaken off and remains displayed on the LCD 51 of the terminal device 7. Information indicating whether an arrow 94 received with the terminal device 7 remains is stored in the main memory. When an arrow 94 remains, the image 97 and the circular image 98 are not displayed near the center of the screen of the terminal device 7 but are displayed at positions shifted from the center of the screen, as shown in FIG. 25. Also in this case, the player obtains fewer points than when no arrow 94 remains. Then, the CPU 10 resets the elapsed time which has been measured since step S17, and ends the game control process shown in FIG. 23.

In step S21, the CPU 10 sets whether to display a lock-on frame. FIG. 26 shows an example of the lock-on frame 99 displayed on the LCD 51 of the terminal device 7. The process of step S21 is performed when the arrow 94 has been shot and the current time is not the second timing t2 (that is, the current time is in the time period from the first timing t1 to the second timing t2). The process of step S21 determines whether the attitude of the terminal device 7 during that time period is a proper attitude (an attitude in accordance with the instruction by the pirate 92) and displays the lock-on frame 99 when the attitude is the proper attitude. The lock-on frame 99 notifies the player that the current attitude of the terminal device 7 is a proper attitude. That is, the CPU 10 determines whether the current attitude of the terminal device 7 is the attitude corresponding to the direction instructed by the pirate 92 when the arrow 94 was shot. When the determination result is affirmative, the CPU 10 turns on the setting of displaying the lock-on frame 99. When the determination result is negative, the CPU 10 turns off the setting of displaying the lock-on frame 99. After performing the process of step S21, the CPU 10 ends the game control process shown in FIG. 23.

On the other hand, in step S22, the CPU 10 performs a process based on the acceleration. In step S22, a predetermined game process is performed based on the acceleration detected by the acceleration sensor 63 of the terminal device 7. Specifically, the CPU 10 determines whether the value of the acceleration of the terminal device 7 in the Z-axis negative direction is greater than or equal to a predetermined threshold value. When the determination result is affirmative, the CPU 10 performs the process of shaking off the arrow 94 received with the terminal device 7, and ends the game control process shown in FIG. 23. Accordingly, the image 97 and the circular image 98 having been displayed on the LCD 51 of the terminal device 7 are no longer displayed. When the determination result is negative, or when the arrow 94 has already been shaken off, the CPU 10 directly ends the game control process shown in FIG. 23.
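The shake-off test is a single threshold comparison on one acceleration component; the threshold value and sign convention below are assumptions:

```python
SHAKE_THRESHOLD = 15.0  # assumed threshold, in m/s^2, along the -Z axis

def arrow_shaken_off(accel_z, arrow_remaining):
    """Step S22: a sufficiently strong acceleration in the Z-axis negative
    direction (a shaking motion) removes the arrow stuck in the screen.
    accel_z is the signed Z component reported by the acceleration sensor."""
    return arrow_remaining and -accel_z >= SHAKE_THRESHOLD
```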

Referring back to FIG. 22, after performing the process of step S3, the CPU 10 performs the process of step S4 next.

In step S4, the CPU 10 performs a process of generating a television game image. Specifically, the CPU 10 obtains a television game image by causing the first virtual camera to take an image of the virtual space. When an image of the virtual space is taken by the first virtual camera at the position and with the angle of view set in the process of step S12, the pirate 92 is displayed in a zoomed-in or zoomed-out manner in accordance with the attitude of the terminal device 7. Through these processes, television game images corresponding to the states of the game, such as a zoomed-in image of the pirate 92 and an image of a scene in which the pirate 92 is shooting the arrow 94, are obtained. Then, the CPU 10 performs the process of step S5.

In step S5, the CPU 10 performs a process of generating a terminal game image. Specifically, the CPU 10 generates a terminal game image by causing the second virtual camera, whose attitude in the virtual space was set in step S14, to take an image of the virtual space. Further, in accordance with the results of the processes of step S20 and step S22, the CPU 10 superimposes the image 97 and the circular image 98 on the generated terminal game image. Moreover, when the setting of displaying the lock-on frame 99 is turned on in step S21, the CPU 10 superimposes the image of the lock-on frame 99 on the terminal game image obtained by causing the second virtual camera to take an image of the virtual space. Thereby, terminal game images corresponding to the states of the game, such as an image of the virtual space seen from the second virtual camera having the attitude in accordance with the attitude of the terminal device 7, and an image showing that the arrow 94 has hit the screen of the terminal device 7 at the second timing t2 (the circular image 98, etc.), are obtained. Then, the CPU 10 performs the process of step S6.

In step S6, the CPU 10 outputs, to the television 2, the television game image generated in step S4. Accordingly, for example, an image such as that shown in FIG. 13A or FIG. 13B is displayed on the television 2. Moreover, in step S6, sound data is outputted to the television 2 along with the television game image, and game sound is outputted from the speaker 2a of the television 2 at the volume set in the process of step S13. Specifically, when the attitude of the terminal device 7 is the first attitude, the volume of the voice of the pirate 92 is made higher than in the case of the second attitude. That is, when the pirate 92 is displayed in a zoomed-in manner, the volume of the voice of the pirate 92 is increased, and when the pirate 92 is displayed in a zoomed-out manner, the volume is decreased. It should be noted that the volume of the predetermined music may vary in accordance with the attitude of the terminal device 7 (in the case of the first attitude, the volume of the predetermined music may be increased as in the case of the voice of the pirate 92), or alternatively, may be constant irrespective of the attitude of the terminal device 7. Then, the CPU 10 performs the process of step S7.

In step S7, the CPU 10 transmits the terminal game image to the terminal device 7. Specifically, the CPU 10 sends the terminal game image generated in step S5 to the codec LSI 27, and the codec LSI 27 performs a predetermined compression process on the terminal game image. The compressed image data is transmitted to the terminal device 7 by the terminal communication module 28 via the antenna 29. The terminal device 7 receives the image data transmitted from the game apparatus 3 via the wireless module 70. The codec LSI 66 performs a predetermined decompression process on the received image data. The decompressed image data is outputted to the LCD 51. Accordingly, the terminal game image is displayed on the LCD 51. Moreover, in step S7, sound data is transmitted to the terminal device 7 along with the terminal game image, and the game sound is outputted from the loudspeakers 67 of the terminal device 7. Then, the CPU 10 performs the process of step S8.

In step S8, the CPU 10 determines whether to end the game. The determination in step S8 is made, for example, depending on whether a predetermined time period has elapsed from the start of the game, or on whether the user has issued an instruction to end the game. When the determination result in step S8 is negative, the process of step S2 is performed again. On the other hand, when the determination result in step S8 is affirmative, the CPU 10 ends the game processing shown in FIG. 22.

As described above, the player can cause the pirate ship 90A and the like (including the pirate 92 and the arrow 94) to be displayed in a zoomed-in or zoomed-out manner by changing the attitude of the terminal device 7. Specifically, when the player holds the terminal device 7 in an attitude in which the player does not view the LCD 51 of the terminal device 7 (the first attitude), the pirate ship 90A and the like are displayed in a zoomed-in manner, and concurrently, the volume of the sound outputted from the speaker 2a of the television 2 is increased. On the other hand, when the attitude of the terminal device 7 is changed into the second attitude, the pirate ship 90A and the like are displayed in a zoomed-out manner, and concurrently, the volume of the sound outputted from the speaker 2a of the television 2 is reduced. Further, the pirate 92 and the arrow 94 are displayed on the television 2, but the pirate 92 and the arrow 94 are not displayed on the terminal device 7. Further, since the terminal device 7 is a portable display device, the screen of the LCD 51 is relatively small, and thus, the pirate ship 90A displayed on the terminal device 7 is difficult to view, as is the pirate ship 90A displayed in a zoomed-out manner on the television 2 when the attitude of the terminal device 7 is the second attitude. Therefore, the player cannot see the instruction from the pirate 92 merely by looking at the LCD 51 of the terminal device 7. In other words, the pirate ship 90A displayed on the terminal device 7 is always so small that it is difficult for the player to view, whereas the pirate ship 90A displayed on the television 2 becomes easier or harder to view depending on the attitude of the terminal device 7. Therefore, the player needs to look at the television 2 in order to view and listen to the instruction from the pirate 92 (the instruction given in the form of an image and a voice). Further, in order to view and listen to the instruction from the pirate 92 easily, the player watches and listens to the television 2 while changing the attitude of the terminal device 7 into the first attitude. After the instruction has been issued by the pirate 92, the player changes the attitude of the terminal device 7 so as to receive the arrow 94 with the terminal device 7 at the second timing t2. A result indicating whether the arrow 94 has been received is displayed on the LCD 51 of the terminal device 7.

As described above, in the game of the exemplary embodiment, it is possible to cause the player to look at the screen of the television 2 and the screen of the terminal device 7 alternately, and to cause the player to enjoy the game using the terminal device 7, which is a portable display device.

The television 2 is a stationary display device and the terminal device 7 is a portable display device. The portable display device is held by the player, and the two display devices are separated from each other to some extent. When the rhythm game described above is performed in such a game environment, the player can enjoy a rhythm game which utilizes the distance between the two display devices and thus allows the user to feel the broadness of the space.

[7. Modifications]

It should be noted that the above exemplary embodiment is merely an example; in another exemplary embodiment, the following configurations may be employed.

In the exemplary embodiment, for example, the pirate ship 90A and the pirate 92 displayed on the television 2 are zoomed in or zoomed out depending on the attitude of the terminal device 7. Accordingly, the player is caused to view and listen to the television 2 and the terminal device 7 alternately, playing the game using the two display devices. In another exemplary embodiment, for example, fog may be caused to appear or disappear in the virtual space so as to make the pirate ship 90A and the pirate 92 harder or easier to view, accordingly. Further, for example, by blacking out the screen of the television 2, by blurring the entire screen or a predetermined region including the pirate ship 90A and the like, or by making the pirate ship 90A and the like transparent or translucent, the pirate ship 90A and the like may be made difficult (or impossible) to view. Further, for example, by displaying the pirate ship 90A and the like in a pixelized manner, the pirate ship 90A and the pirate 92 may be made difficult (or impossible) to view. Still further, for example, by displaying another object in front of the pirate ship 90A, the pirate ship 90A may be made difficult (or impossible) to view. That is, in the exemplary embodiment, when the attitude of the terminal device 7 is the first attitude, the pirate ship 90A and the like are zoomed in so as to be made easy for the player to view, and when the attitude of the terminal device 7 is the second attitude, the pirate ship 90A and the like are zoomed out so as to be made difficult for the player to view. Thus, by making it difficult for the player to view the pirate ship 90A and the like, the exemplary embodiment causes the player to look at the screen of the terminal device 7. However, in another exemplary embodiment, instead of changing the settings of the first virtual camera in order to display the pirate ship 90A and the like in a zoomed-in/zoomed-out manner, the pirate ship 90A and the like may be made easy or difficult to view by employing the above-described methods. In order to make the pirate ship 90A and the like difficult to view, a process of displaying the pirate ship 90A and the like in a zoomed-out manner, or a process of displaying a predetermined image (a white image representing fog, a black image for blacking out the screen, an image of another object located in front of the pirate ship 90A, and the like) over a part or the whole of a range including the pirate ship 90A and the like, may be performed. Further, in order to make the pirate ship 90A and the like difficult to view, a process of blurring the pirate ship 90A and the like, or a process of displaying the pirate ship 90A and the like in a pixelized manner, may be performed. Accordingly, it is possible to cause the player to look at the screen of the television 2 and the screen of the terminal device 7 alternately to play the game.

Moreover, in the exemplary embodiment, it is determined whether the attitude of the terminal device 7 is the first attitude or the second attitude. Specifically, it is assumed that the first attitude is an attitude in which the screen of the terminal device 7 is parallel to the ground surface (an attitude in which the screen of the terminal device 7 is substantially perpendicular to the direction of gravity), and that, in this attitude, the player is looking at the television 2 without looking at the screen of the terminal device 7. Further, it is assumed that the second attitude is an attitude in which the screen of the terminal device 7 is perpendicular to the ground surface (an attitude in which the screen of the terminal device 7 is substantially parallel to the direction of gravity), and that, in this attitude, the player is looking at the screen of the terminal device 7. That is, in the exemplary embodiment, whether the player is viewing the screen of the terminal device 7 (in other words, whether the player is viewing the television 2) is determined based on the attitude of the terminal device 7. In another exemplary embodiment, whether the player is viewing the television 2 (or the terminal device 7) may be determined by another method. For example, an image of the face of the player may be taken by the camera 56 included in the terminal device 7, and the taken image may be subjected to a face recognition process. Thus, by determining whether the line of sight of the player is directed to the LCD 51 of the terminal device 7, it may be determined whether the player is viewing the terminal device 7 (whether the player is viewing the television 2). Still further, for example, a camera different from the camera 56 may be provided in the real space (for example, around the television 2), and the game apparatus 3 may obtain an image taken by this camera and determine whether the player is viewing the television 2. For example, whether the player is viewing the television 2 can be determined by using face recognition technology that determines whether the face of the player is included in the image taken by the camera. In still another exemplary embodiment, whether the player is viewing the television 2 may be determined based on whether the player has pressed a predetermined operation button of the terminal device 7.

For example, the attitude of the terminal device 7 may be calculated by the terminal device 7 taking an image of the markers of the marker device 6, and based on the calculated attitude of the terminal device 7, whether the player is viewing the terminal device 7 (or the television 2) may be determined. In this case, a camera for taking an image of the markers may be provided on the surface opposite to the surface on which the LCD 51 of the terminal device is provided. Alternatively, the attitude of the terminal device 7 may be calculated by a camera provided in the real space taking an image of the marker section 55 of the terminal device 7. For example, in the case where a camera provided in the terminal device 7 takes an image of the two markers of the marker device 6, the game apparatus 3 can calculate, based on the positions and the attitudes of the two markers included in the taken image, in which direction the terminal device 7 is directed (whether the terminal device 7 is facing the television 2), and how much the terminal device 7 is inclined in the lateral direction. Alternatively, in the case where a camera provided in the real space takes an image of the terminal device 7, if the terminal device 7 included in the taken image is detected by means of image recognition technology such as pattern matching, the attitude of the terminal device 7 can be calculated.

In the exemplary embodiment, the attitude of the terminal device 7 is calculated based on the angular velocity detected by the gyro sensor, and the attitude of the terminal device 7 is corrected based on the acceleration detected by the acceleration sensor. That is, the attitude of the terminal device 7 is calculated by using physical quantities detected by the two types of inertial sensors (the acceleration sensor and the gyro sensor). In another exemplary embodiment, the attitude of the terminal device 7 may be calculated based on the orientation detected by the magnetic sensor (the direction indicated by the geomagnetism detected by the magnetic sensor). By use of the magnetic sensor, the direction in which the terminal device 7 is facing (a direction parallel to the ground surface) can be detected. In this case, further by use of the acceleration sensor, the inclination relative to the direction of gravity can be detected, and the attitude of the terminal device 7 in three-dimensional space can be calculated.

Further, in another exemplary embodiment, the attitude of the terminal device 7 may be calculated in the terminal device 7 based on the physical quantities detected by the gyro sensor 64 and the like, and data regarding the attitude may be transmitted to the game apparatus 3. Then, the game apparatus 3 may receive the data from the terminal device 7, obtain the attitude of the terminal device 7, and perform the game processing as described above based on the attitude of the terminal device 7. That is, the game apparatus 3 may obtain the attitude of the terminal device 7 by calculating it from data corresponding to the physical quantities detected by the gyro sensor 64 and the like of the terminal device 7, or alternatively, the game apparatus 3 may obtain the attitude of the terminal device 7 based on the data regarding the attitude calculated in the terminal device 7.

In the exemplary embodiment, when the attitude of the terminal device 7 is the first attitude, the pirate ship 90A and the like are displayed in a zoomed-in manner by moving the position of the first virtual camera in its image-taking direction and concurrently reducing the angle of view of the first virtual camera. In another exemplary embodiment, the pirate ship 90A and the like may be displayed in a zoomed-in/zoomed-out manner by changing at least one of the position and the angle of view of the first virtual camera. In the exemplary embodiment, when the attitude of the terminal device 7 is the second attitude, the position and the angle of view of the first virtual camera are set to substantially the same as those of the second virtual camera, whereby the imaging ranges of the two virtual cameras are made substantially the same. In another exemplary embodiment, the position and the angle of view of the first virtual camera need not be substantially the same as those of the second virtual camera; as long as the imaging ranges of the two virtual cameras are substantially the same, the positions and the angles of view of the two virtual cameras may be adjusted as appropriate.

In the exemplary embodiment, during the time period from the first timing to the second timing, when the attitude of the terminal device 7 is a predetermined attitude (the attitude in accordance with the instruction), the lock-on frame 99 is displayed on the terminal device 7. In another exemplary embodiment, in addition to the lock-on frame 99 (or instead of the lock-on frame 99), a still image of the virtual space taken by the second virtual camera may be displayed on the terminal device 7. For example, when the attitude of the terminal device 7 is a predetermined attitude in the above time period, the attitude of the second virtual camera may not be changed in accordance with the attitude of the terminal device 7, or the amount of change of the attitude of the second virtual camera may be reduced relative to the amount of change of the attitude of the terminal device 7, and then an image of the virtual space taken by the second virtual camera may be displayed on the terminal device 7. In another exemplary embodiment, in the above time period, the vibrator 79 may be operated in a predetermined pattern in accordance with a determination result of the attitude. That is, during the above time period, whether the attitude of the terminal device 7 is a predetermined attitude may be determined, and a notification in accordance with the determination result (displaying a frame or a still image, notifying the user of the determination result by sound, vibration, and the like) may be issued on the terminal device 7.

In the exemplary embodiment, the pirate ship 90A and the like are displayed in a zoomed-in or zoomed-out manner in accordance with the attitude of the terminal device 7, and concurrently the volume of the sound outputted from the television 2 is adjusted. Specifically, when the attitude of the terminal device 7 is the first attitude (the attitude in which the player is not viewing the terminal device 7, in other words, is viewing the television 2), the pirate ship 90A and the like are displayed in a zoomed-in manner and the volume of the sound outputted from the television 2 is increased. In another exemplary embodiment, when the attitude of the terminal device 7 is the first attitude, the volume of the sound outputted from the terminal device 7 may be increased, or may not be adjusted.

In the exemplary embodiment, the pirate 92 is displayed on the television 2 and the pirate 92 is caused to perform a predetermined action at the first timing (an action of shooting the arrow 94, or an action of pointing in a predetermined direction), thereby giving an instruction to the player. In another exemplary embodiment, the instruction to be given to the player may take any form, and the object to be displayed on the television 2 may be any object. In the exemplary embodiment, it is assumed that the arrow 94 is shot and the shot arrow 94 is received with the terminal device 7. However, in another exemplary embodiment, another object may be moved from the television 2 to the terminal device 7, or from the terminal device 7 to the television 2.

In the exemplary embodiment, an instruction to the user is issued by means of an action of the pirate 92 displayed on the television 2 and of sound outputted from the television 2. However, in another exemplary embodiment, an instruction may be issued from the television 2 by means of either an image or sound alone.

In the exemplary embodiment, a predetermined instruction is issued to the player at the first timing, and game processing is performed based on the attitude of the terminal device 7 at the second timing, which is a timing after a predetermined time period has elapsed from the first timing. In another exemplary embodiment, for example, the controller 5 may be used as an input device, or the terminal device 7 may be used as an input device. That is, in another exemplary embodiment, the game processing may be performed based on whether a predetermined operation button of the controller 5 is being pressed at the second timing, for example, or based on whether a predetermined operation button of the terminal device 7 is being pressed. Moreover, for example, the game processing may be performed in accordance with the attitude of the controller 5 or the attitude of the terminal device 7 at the second timing.

In another exemplary embodiment, an input by the player may be performed in the form of a gesture (action) of the player himself or herself. For example, a camera that takes an image of the player may be connected to the game apparatus 3, and the action of the player may be determined based on the image from the camera, whereby an input from the player is performed. For example, an instruction is given to the player at the first timing, and the player performs an action in accordance with the instruction at the second timing. Then, based on the image from the camera, the game apparatus 3 analyzes the action of the player and determines whether the action of the player is in accordance with the instruction.

That is, the game processing may be performed based on the state of an input to the input device at the second timing. The input state of the input device may be whether a predetermined operation button provided on the controller 5 or the terminal device 7 is being pressed, or may be the attitude of the controller 5 or the terminal device 7 itself. The input state of the input device may also be a state based on the gesture (action) of the player himself or herself, and the action of the player may be determined from an image taken by a camera.

Moreover, for example, the game processing may be performed based on the second timing and a timing at which an input to the input device (for example, the terminal device 7) is performed. For example, the game processing may be performed based on a difference between the second timing and a timing at which a predetermined operation button of the input device is pressed. Specifically, when the difference is less than or equal to a predetermined threshold value, a scene in which the arrow 94 has reached the terminal device 7 may be displayed on the LCD 51 of the terminal device 7, assuming that the player has performed an input in accordance with the instruction. Moreover, for example, the game processing may be performed based on a difference between the second timing and the timing of a predetermined operation performed on the input device (an operation that changes the attitude of the input device itself, or an operation that accelerates the motion of the input device, such as an operation of shaking the input device). That is, the game processing may be performed based on the difference between the second timing and the timing of the input performed on the input device.
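Such a timing-difference judgment is the familiar rhythm-game hit window; a minimal sketch (the tolerance is an invented value):

```python
def input_hits_window(input_frame, second_timing_frame, tolerance_frames=5):
    """Judge a button press (or swing) by how far it lands from the second
    timing; within the tolerance, treat it as a successful catch."""
    return abs(input_frame - second_timing_frame) <= tolerance_frames

print(input_hits_window(93, 90))   # True: 3 frames late, inside the window
print(input_hits_window(100, 90))  # False: 10 frames late
```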

In the exemplary embodiment, a scene in which the arrow 94 is shot from the television 2 toward the player is displayed at the first timing, and a scene in which the arrow 94 has reached the terminal device 7 is displayed at the second timing. In another exemplary embodiment, a scene in which a predetermined object is moved from the terminal device 7 toward the television 2 may be displayed at the first timing, and a scene in which the object reaches the television 2 may be displayed at the second timing. In this case, the game processing is performed based on the input state of the input device (the terminal device 7 or another input device) at the second timing. That is, in another exemplary embodiment, the game as described above may be performed with the roles of the television 2 (a stationary display device) and the terminal device 7 (a portable display device) switched.

Further, in another exemplary embodiment, a part of the game processing performed in the game apparatus 3 may be performed in the terminal device 7. For example, a virtual space may be defined in the terminal device 7 and an image of the virtual space taken by a virtual camera, whereby an image to be displayed on the LCD 51 of the terminal device 7 is generated in the terminal device 7.

Further, in another exemplary embodiment, in a game system including a plurality of information processing apparatuses that can communicate with each other, the plurality of information processing apparatuses may share the game processing performed by the game apparatus 3. For example, a game system as described above may be configured by a plurality of information processing apparatuses connected to a network such as the Internet. For example, a game system as described above may be constructed from an information processing apparatus to which the terminal device 7 and a monitor are connected, and a server connected to the information processing apparatus via the Internet. In this case, the terminal device 7 and the information processing apparatus are arranged on the player side. Then, for example, operation information based on a game operation performed on the terminal device 7 is transmitted to the server via the network, the server performs the game processing based on the received operation information, and the server transmits a result of the game processing to the information processing apparatus.

Further, in another exemplary embodiment, data may be transmitted and received between the game apparatus 3 and the terminal device 7 connected to each other in a wired manner instead of wirelessly. The above-described program may also be executed in an information processing apparatus other than the game apparatus 3 that performs various information processes, such as a personal computer.

Further, the above game program need not be stored in an optical disc; it may be stored in a storage medium such as a magnetic disc, a nonvolatile memory, and the like. The above game program may also be stored in a computer-readable storage medium, such as a RAM or a magnetic disc, on a server connected to a network, and may be provided via the network. The game program may be loaded into an information processing apparatus as source code and compiled at execution time.

In the exemplary embodiment, the CPU 10 of the game apparatus 3 executes the game program, whereby the processes of the flowcharts are performed. In another exemplary embodiment, a part or the whole of the processes may be performed by a dedicated circuit included in the game apparatus 3, or by a general-purpose processor. At least one processor may operate as a “programmed logic circuit” for performing the processes.

The systems, devices and apparatuses described herein may include one or more processors, which may be located in one place or distributed in a variety of places communicating via one or more networks. Such processor(s) can, for example, use conventional 3D graphics transformations, virtual cameras and other techniques to provide appropriate images for display. By way of example and without limitation, the processors can be any of: a processor that is part of or is a separate component co-located with the stationary display and which communicates remotely (e.g., wirelessly) with the movable display; or a processor that is part of or is a separate component co-located with the movable display and communicates remotely (e.g., wirelessly) with the stationary display or associated equipment; or a distributed processing arrangement some of which is contained within the movable display housing and some of which is co-located with the stationary display, the distributed portions communicating together via a connection such as a wireless or wired network; or a processor(s) located remotely (e.g., in the cloud) from both the stationary and movable displays and communicating with each of them via one or more network connections; or any combination or variation of the above. The processors can be implemented using one or more general-purpose processors, one or more specialized graphics processors, or combinations of these. These may be supplemented by specifically-designed ASICs (application specific integrated circuits) and/or logic circuitry. In the case of a distributed processor architecture or arrangement, appropriate data exchange and transmission protocols are used to provide low latency and maintain interactivity, as will be understood by those skilled in the art. Similarly, program instructions, data and other information for implementing the systems and methods described herein may be stored in one or more on-board and/or removable memory devices. Multiple memory devices may be part of the same device or different devices, which are co-located or remotely located with respect to each other.

While certain example systems, methods, devices and apparatuses have been described herein, it is to be understood that the appended claims are not to be limited to the systems, methods, devices and apparatuses disclosed, but on the contrary, are intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims

  1. A non-transitory computer-readable storage medium having stored therein a game program executed by a computer of an information processing apparatus capable of displaying game processes on first and second display devices, the program causing the computer to perform: issuing a predetermined notification or instruction to a player of the executed game program via the first display device at a first timing, wherein the predetermined notification or instruction is a game operation or object movement which is displayed on a display of the first display device; receiving one or more inputs indicative of an attitude state of an input device operated by a player of the executed game program; determining whether a second timing has occurred, the second timing being a timing which occurs after a predetermined time period has elapsed from the first timing, and determining an attitude state of the input device at a time when the second timing has occurred; determining whether a predetermined game process should be performed based on a determined attitude state of the input device which exists when the second timing has occurred; performing the predetermined game process upon determining that the predetermined game process should be performed; and presenting one or more results of the performed predetermined game process on a second display device, wherein at least one result of the performed predetermined game process is said game operation or said object appearing on a display of the second display device.
  2. The non-transitory computer-readable storage medium having stored therein the game program according to claim 1, wherein the second display device is a portable display device integrated with the input device.
  3. The non-transitory computer-readable storage medium having stored therein the game program according to claim 2, wherein the input device includes one or more sensors and produces attitude data based on a value of a sensor that is indicative of an attitude of the input device, wherein an attitude state of said portable display device is determined based on the attitude data, and the predetermined game process is performed based on the attitude of the portable display device at the second timing.
  4. The non-transitory computer-readable storage medium having stored therein the game program according to claim 3, wherein the computer is further caused to perform determining whether the attitude state of the portable display device is in accordance with the predetermined notification, and the game process is performed at the second timing if it is determined that the attitude state is in accordance with the predetermined notification.
  5. The non-transitory computer-readable storage medium having stored therein the game program according to claim 4, wherein the computer is further caused to perform determining whether the attitude state of the portable display device is in accordance with the predetermined notification during a time period extending from occurrence of the first timing to occurrence of the second timing, and providing a notification of a result of the determination on the portable display device.
  6. The non-transitory computer-readable storage medium having stored therein the game program according to claim 4, wherein the computer is further caused to perform an evaluation of the input by the user in accordance with a result of the determination at the second timing.
  7. The non-transitory computer-readable storage medium having stored therein the game program according to claim 4, wherein an attitude of a virtual camera in a virtual space is set in accordance with the attitude of the portable display device, and an image of the virtual space is taken by the virtual camera, whereby a second image in accordance with the attitude of the portable display device is displayed on the portable display device.
  8. The non-transitory computer-readable storage medium having stored therein the game program according to claim 3, wherein the computer is further caused to perform determining whether the portable display device is moving at the second timing, and a result of a game process performed in accordance with a result of the determination of whether the portable display device is moving is presented on the portable display device at the second timing.
  9. The non-transitory computer-readable storage medium having stored therein the game program according to claim 2, wherein image data indicating the result of the game process is outputted to the portable display device, and the portable display device includes: an image data obtaining unit that obtains the image data outputted from the information processing apparatus; and a display unit that displays the result of the game process indicated by the image data obtained by the image data obtaining unit.
  10. The non-transitory computer-readable storage medium having stored therein the game program according to claim 9, wherein the computer is further caused to perform generating compressed image data by compressing the image data, the compressed image data is outputted to the portable display device, the portable display device further includes an image decompression unit that decompresses the compressed image data and obtains the image data, and the display unit displays the result of the game process indicated by the image data decompressed by the image decompression unit.
  11. The non-transitory computer-readable storage medium having stored therein the game program according to claim 1, wherein the computer is further caused to perform reproducing predetermined music, and the first timing and the second timing are set in synchronization with the rhythm of the predetermined music.
  12. The non-transitory computer-readable storage medium having stored therein the game program according to claim 11, wherein the first timing and the second timing are set at timings when the predetermined music is reproduced.
  13. The non-transitory computer-readable storage medium having stored therein the game program according to claim 1, wherein a predetermined sound is outputted at the first timing from a sound outputting unit provided in the first display device, and a predetermined sound is outputted at the second timing from a sound outputting unit provided in the second display device.
  14. The non-transitory computer-readable storage medium having stored therein the game program according to claim 1, wherein a scene in which a predetermined object starts to be moved is presented on the first display device as the predetermined notification, and a scene in which the predetermined object has reached the second display device is displayed on the second display device as the result of the game process.
  15. The non-transitory computer-readable storage medium having stored therein the game program according to claim 1, wherein performing a predetermined game process may also be based upon a difference between a timing of an input to the input device and the second timing.
  16. The non-transitory computer-readable storage medium having stored therein the game program according to claim 1, wherein the predetermined notification is an instruction image for notifying the user of an event occurrence or an operation to be performed by the user.
  17. The non-transitory computer-readable storage medium having stored therein the game program according to claim 1, wherein the first display device includes an audio output device and the predetermined notification is an audio instruction for notifying the user of an event occurrence or an operation to be performed by the user.
  18. The non-transitory computer-readable storage medium having stored therein the game program according to claim 1, wherein the predetermined game processing is not performed at a timing different from the second timing.
  19. An information processing apparatus associated with a first display device and having one or more computer processors configured to: issue a predetermined notification or instruction to a user via the first display device at a first timing, wherein the predetermined notification or instruction is a game operation or object movement which is displayed on a display of the first display device; receive one or more inputs indicative of an attitude state of an input device operated by the user; determine an occurrence of a second timing, the second timing occurring at a predetermined elapse of time after the first timing, and determine an attitude state of the input device at a time when the second timing has occurred; determine whether a predetermined game process should be performed based on a determined attitude state of the input device which exists at occurrence of the second timing; perform the predetermined game process upon determining that the predetermined game process should be performed; and present one or more results of the performed predetermined game process on a second display device, wherein at least one result of the performed predetermined game process is said game operation or said object appearing on a display of the second display device.
  20. The information processing apparatus of claim 19, wherein performing a predetermined game process may also be based upon a difference between a timing of an input to the input device and the second timing.
  21. The information processing apparatus of claim 19, wherein the predetermined notification is an instruction image for notifying the user of an event occurrence or an operation to be performed by the user.
  22. The information processing apparatus of claim 19, wherein the first display device includes an audio output device and the predetermined notification is an audio instruction for notifying the user of an event occurrence or an operation to be performed by the user.
  23. The information processing apparatus of claim 19, wherein the predetermined game processing is not performed at a timing different from the second timing.
  24. A game system having one or more processing units and including a game operation input device and first and second display devices to display game images, the game system comprising: a first presentation processing unit that issues a predetermined notification or instruction to a user via the first display device at a first timing, wherein the predetermined notification or instruction is a game operation or object movement which is displayed on a display of the first display device; an input processing unit that receives an input indicative of an attitude state of the operation input device; a game process performing processing unit that determines an occurrence of a second timing, the second timing occurring at a predetermined elapse of time after the first timing, determines an attitude state of the operation input device that exists at a time when the second timing has occurred, and further determines whether a predetermined game process should be performed based on a determined attitude state of the input device which exists when the second timing has occurred; and a second presentation processing unit that performs the predetermined game process upon a determination that the predetermined game process should be performed and presents one or more results of the performed predetermined game process on the second display device, wherein at least one result of the performed predetermined game process is said game operation or said object appearing on a display of the second display device.
  25. The game system of claim 24, wherein performing a predetermined game process may also be based upon a difference between a timing of an input to the input device and the second timing.
  26. The game system of claim 24, wherein the predetermined notification is an instruction image for notifying the user of an event occurrence or an operation to be performed by the user.
  27. The game system of claim 24, wherein the first display device includes an audio output device and the predetermined notification is an audio instruction for notifying the user of an event occurrence or an operation to be performed by the user.
  28. The game system of claim 24, wherein the predetermined game processing is not performed at a timing different from the second timing.
  29. A processing method implemented using a game operation input device and one or more computer processors of an information processing apparatus capable of providing images to be displayed on first and second display devices, the method comprising: using said one or more processors for issuing a predetermined notification or instruction to a user via the first display device at a first timing, wherein the predetermined notification or instruction is a game operation or object movement which is displayed on a display of the first display device; determining an occurrence of a second timing using said one or more computer processors, the second timing occurring at a predetermined elapse of time after the first timing; providing one or more inputs indicative of an attitude of the operation input device to said one or more processors; using said one or more processors for determining an attitude state of the operation input device that exists at a time when the second timing has occurred; using said one or more computer processors for determining whether a predetermined game process should be performed based on a determined attitude state of the operation input device which exists when the second timing has occurred; performing the predetermined game process upon a determination that the predetermined game process should be performed; and presenting one or more results of the performed predetermined game process on the second display device, wherein at least one result of the performed predetermined game process is said game operation or said object appearing on a display of the second display device.
  30. The processing method of claim 29, wherein performing a predetermined game process may also be based upon a difference between a timing of an input to the input device and the second timing.
  31. The processing method of claim 29, wherein the predetermined notification is an instruction image for notifying the user of an event occurrence or an operation to be performed by the user.
  32. The processing method of claim 29, wherein the first display device includes an audio output device and the predetermined notification is an audio instruction for notifying the user of an event occurrence or an operation to be performed by the user.
  33. The processing method of claim 29, wherein the predetermined game processing is not performed at a timing different from the second timing.
