U.S. Pat. No. 10,918,944

GAME SYSTEM WITH VIRTUAL CAMERA ADJUSTMENT BASED ON VIDEO OUTPUT CONNECTION STATUS AND SYSTEM ORIENTATION, AND COUNTERPART STORAGE MEDIUM HAVING STORED THEREIN GAME PROGRAM, GAME APPARATUS, AND GAME PROCESSING METHOD

Assignee: Nintendo Co., Ltd.

Issue Date: April 12, 2019


Abstract

In game processing, a first mode is set when a screen is closer to vertical than a first reference, and a second mode is set when the screen is closer to horizontal than the reference. In the first mode, a virtual camera is set to a first line-of-sight direction, and a game image in which an information image is placed in a first direction is generated. In the second mode, the virtual camera is set to a line-of-sight direction further downward than the first line-of-sight direction, and a game image in which a plurality of information images are placed in different directions is generated.

Description


DETAILED DESCRIPTION OF NON-LIMITING EXAMPLE EMBODIMENTS

A description is given of a game system according to an exemplary embodiment. An example of a game system 1 according to the exemplary embodiment includes a main body apparatus (an information processing apparatus, which functions as a game apparatus main body in the exemplary embodiment) 2, a left controller 3, and a right controller 4. Each of the left controller 3 and the right controller 4 is attachable to and detachable from the main body apparatus 2. That is, the game system 1 can be used as a unified apparatus obtained by attaching each of the left controller 3 and the right controller 4 to the main body apparatus 2. Further, in the game system 1, the main body apparatus 2, the left controller 3, and the right controller 4 can also be used as separate bodies (see FIG. 2). Hereinafter, first, the hardware configuration of the game system 1 according to the exemplary embodiment is described, and then, the control of the game system 1 according to the exemplary embodiment is described.

FIG. 1 is a diagram showing an example of the state where the left controller 3 and the right controller 4 are attached to the main body apparatus 2. As shown in FIG. 1, each of the left controller 3 and the right controller 4 is attached to and unified with the main body apparatus 2. The main body apparatus 2 is an apparatus for performing various processes (e.g., game processing) in the game system 1. The main body apparatus 2 includes a display 12. Each of the left controller 3 and the right controller 4 is an apparatus including operation sections with which a user provides inputs.

FIG. 2 is a diagram showing an example of the state where each of the left controller 3 and the right controller 4 is detached from the main body apparatus 2. As shown in FIGS. 1 and 2, the left controller 3 and the right controller 4 are attachable to and detachable from the main body apparatus 2. It should be noted that hereinafter, the left controller 3 and the right controller 4 will occasionally be referred to collectively as a “controller”.

FIG. 3 is six orthogonal views showing an example of the main body apparatus 2. As shown in FIG. 3, the main body apparatus 2 includes an approximately plate-shaped housing 11. In the exemplary embodiment, a main surface (in other words, a surface on a front side, i.e., a surface on which the display 12 is provided) of the housing 11 has a generally rectangular shape.

It should be noted that the shape and the size of the housing 11 are optional. As an example, the housing 11 may be of a portable size. Further, the main body apparatus 2 alone or the unified apparatus obtained by attaching the left controller 3 and the right controller 4 to the main body apparatus 2 may function as a mobile apparatus. The main body apparatus 2 or the unified apparatus may function as a handheld apparatus or a portable apparatus.

As shown in FIG. 3, the main body apparatus 2 includes the display 12, which is provided on the main surface of the housing 11. The display 12 displays an image generated by the main body apparatus 2. In the exemplary embodiment, the display 12 is a liquid crystal display device (LCD). The display 12, however, may be a display device of any type.

Further, the main body apparatus 2 includes a touch panel 13 on a screen of the display 12. In the exemplary embodiment, the touch panel 13 is of a type that allows a multi-touch input (e.g., a capacitive type). The touch panel 13, however, may be of any type. For example, the touch panel 13 may be of a type that allows a single-touch input (e.g., a resistive type).

The main body apparatus 2 includes speakers (i.e., speakers 88 shown in FIG. 6) within the housing 11. As shown in FIG. 3, speaker holes 11a and 11b are formed on the main surface of the housing 11. Then, sounds output from the speakers 88 are output through the speaker holes 11a and 11b.

Further, the main body apparatus 2 includes a left terminal 17, which is a terminal for the main body apparatus 2 to perform wired communication with the left controller 3, and a right terminal 21, which is a terminal for the main body apparatus 2 to perform wired communication with the right controller 4.

As shown in FIG. 3, the main body apparatus 2 includes a slot 23. The slot 23 is provided on an upper side surface of the housing 11. The slot 23 is so shaped as to allow a predetermined type of storage medium to be attached to the slot 23. The predetermined type of storage medium is, for example, a dedicated storage medium (e.g., a dedicated memory card) for the game system 1 and an information processing apparatus of the same type as the game system 1. The predetermined type of storage medium is used to store, for example, data (e.g., saved data of an application or the like) used by the main body apparatus 2 and/or a program (e.g., a program for an application or the like) executed by the main body apparatus 2. Further, the main body apparatus 2 includes a power button 28.

The main body apparatus 2 includes a lower terminal 27. The lower terminal 27 is a terminal for the main body apparatus 2 to communicate with a cradle. In the exemplary embodiment, the lower terminal 27 is a USB connector (more specifically, a female connector). Further, when the unified apparatus or the main body apparatus 2 alone is mounted on the cradle, the game system 1 can display on a stationary monitor an image generated by and output from the main body apparatus 2. Further, in the exemplary embodiment, the cradle has the function of charging the unified apparatus or the main body apparatus 2 alone mounted on the cradle. Further, the cradle has the function of a hub device (specifically, a USB hub).

FIG. 4 is six orthogonal views showing an example of the left controller 3. As shown in FIG. 4, the left controller 3 includes a housing 31. In the exemplary embodiment, the housing 31 has a vertically long shape, i.e., is shaped to be long in an up-down direction (i.e., a y-axis direction shown in FIGS. 1 and 4). In the state where the left controller 3 is detached from the main body apparatus 2, the left controller 3 can also be held in the orientation in which the left controller 3 is vertically long. The housing 31 has such a shape and a size that when held in the orientation in which the housing 31 is vertically long, the housing 31 can be held with one hand, particularly the left hand. Further, the left controller 3 can also be held in the orientation in which the left controller 3 is horizontally long. When held in the orientation in which the left controller 3 is horizontally long, the left controller 3 may be held with both hands.

The left controller 3 includes an analog stick 32. As shown in FIG. 4, the analog stick 32 is provided on a main surface of the housing 31. The analog stick 32 can be used as a direction input section with which a direction can be input. The user tilts the analog stick 32 and thereby can input a direction corresponding to the direction of the tilt (and input a magnitude corresponding to the angle of the tilt). It should be noted that the left controller 3 may include a directional pad, a slide stick that allows a slide input, or the like as the direction input section, instead of the analog stick. Further, in the exemplary embodiment, it is possible to provide an input by pressing the analog stick 32.

The left controller 3 includes various operation buttons. The left controller 3 includes four operation buttons 33 to 36 (specifically, a right direction button 33, a down direction button 34, an up direction button 35, and a left direction button 36) on the main surface of the housing 31. Further, the left controller 3 includes a record button 37 and a “−” (minus) button 47. The left controller 3 includes a first L-button 38 and a ZL-button 39 in an upper left portion of a side surface of the housing 31. Further, the left controller 3 includes a second L-button 43 and a second R-button 44, on the side surface of the housing 31 on which the left controller 3 is attached to the main body apparatus 2. These operation buttons are used to give instructions depending on various programs (e.g., an OS program and an application program) executed by the main body apparatus 2.

Further, the left controller 3 includes a terminal 42 for the left controller 3 to perform wired communication with the main body apparatus 2.

FIG. 5 is six orthogonal views showing an example of the right controller 4. As shown in FIG. 5, the right controller 4 includes a housing 51. In the exemplary embodiment, the housing 51 has a vertically long shape, i.e., is shaped to be long in the up-down direction. In the state where the right controller 4 is detached from the main body apparatus 2, the right controller 4 can also be held in the orientation in which the right controller 4 is vertically long. The housing 51 has such a shape and a size that when held in the orientation in which the housing 51 is vertically long, the housing 51 can be held with one hand, particularly the right hand. Further, the right controller 4 can also be held in the orientation in which the right controller 4 is horizontally long. When held in the orientation in which the right controller 4 is horizontally long, the right controller 4 may be held with both hands.

Similarly to the left controller 3, the right controller 4 includes an analog stick 52 as a direction input section. In the exemplary embodiment, the analog stick 52 has the same configuration as that of the analog stick 32 of the left controller 3. Further, the right controller 4 may include a directional pad, a slide stick that allows a slide input, or the like, instead of the analog stick. Further, similarly to the left controller 3, the right controller 4 includes four operation buttons 53 to 56 (specifically, an A-button 53, a B-button 54, an X-button 55, and a Y-button 56) on a main surface of the housing 51. Further, the right controller 4 includes a “+” (plus) button 57 and a home button 58. Further, the right controller 4 includes a first R-button 60 and a ZR-button 61 in an upper right portion of a side surface of the housing 51. Further, similarly to the left controller 3, the right controller 4 includes a second L-button 65 and a second R-button 66.

Further, the right controller 4 includes a terminal 64 for the right controller 4 to perform wired communication with the main body apparatus 2.

FIG. 6 is a block diagram showing an example of the internal configuration of the main body apparatus 2. The main body apparatus 2 includes components 81 to 91, 97, and 98 shown in FIG. 6 in addition to the components shown in FIG. 3. Some of the components 81 to 91, 97, and 98 may be mounted as electronic components on an electronic circuit board and accommodated in the housing 11.

The main body apparatus 2 includes a processor 81. The processor 81 is an information processing section for executing various types of information processing to be executed by the main body apparatus 2. For example, the processor 81 may be composed only of a CPU (Central Processing Unit), or may be composed of a SoC (System-on-a-chip) having a plurality of functions such as a CPU function and a GPU (Graphics Processing Unit) function. The processor 81 executes an information processing program (e.g., a game program) stored in a storage section (specifically, an internal storage medium such as a flash memory 84, an external storage medium attached to the slot 23, or the like), thereby performing the various types of information processing.

The main body apparatus 2 includes a flash memory 84 and a DRAM (Dynamic Random Access Memory) 85 as examples of internal storage media built into the main body apparatus 2. The flash memory 84 and the DRAM 85 are connected to the processor 81. The flash memory 84 is a memory mainly used to store various data (or programs) to be saved in the main body apparatus 2. The DRAM 85 is a memory used to temporarily store various data used for information processing.

The main body apparatus 2 includes a slot interface (hereinafter abbreviated as “I/F”) 91. The slot I/F 91 is connected to the processor 81. The slot I/F 91 is connected to the slot 23, and in accordance with an instruction from the processor 81, reads and writes data from and to the predetermined type of storage medium (e.g., a dedicated memory card) attached to the slot 23.

The processor 81 appropriately reads and writes data from and to the flash memory 84, the DRAM 85, and each of the above storage media, thereby performing the above information processing.

The main body apparatus 2 includes a network communication section 82. The network communication section 82 is connected to the processor 81. The network communication section 82 communicates (specifically, through wireless communication) with an external apparatus via a network. In the exemplary embodiment, as a first communication form, the network communication section 82 connects to a wireless LAN and communicates with an external apparatus, using a method compliant with the Wi-Fi standard. Further, as a second communication form, the network communication section 82 wirelessly communicates with another main body apparatus 2 of the same type, using a predetermined communication method (e.g., communication based on a unique protocol or infrared light communication). It should be noted that the wireless communication in the above second communication form achieves the function of enabling so-called “local communication”, in which the main body apparatus 2 can wirelessly communicate with another main body apparatus 2 placed in a closed local network area, and the plurality of main body apparatuses 2 directly communicate with each other to transmit and receive data.

The main body apparatus 2 includes a controller communication section 83. The controller communication section 83 is connected to the processor 81. The controller communication section 83 wirelessly communicates with the left controller 3 and/or the right controller 4. The communication method between the main body apparatus 2 and the left controller 3 and the right controller 4 is optional. In the exemplary embodiment, the controller communication section 83 performs communication compliant with the Bluetooth (registered trademark) standard with the left controller 3 and with the right controller 4.

The processor 81 is connected to the left terminal 17, the right terminal 21, and the lower terminal 27. When performing wired communication with the left controller 3, the processor 81 transmits data to the left controller 3 via the left terminal 17 and also receives operation data from the left controller 3 via the left terminal 17. Further, when performing wired communication with the right controller 4, the processor 81 transmits data to the right controller 4 via the right terminal 21 and also receives operation data from the right controller 4 via the right terminal 21. Further, when communicating with the cradle, the processor 81 transmits data to the cradle via the lower terminal 27. As described above, in the exemplary embodiment, the main body apparatus 2 can perform both wired communication and wireless communication with each of the left controller 3 and the right controller 4. Further, when the unified apparatus obtained by attaching the left controller 3 and the right controller 4 to the main body apparatus 2 or the main body apparatus 2 alone is attached to the cradle, the main body apparatus 2 can output data (e.g., image data or sound data) to the stationary monitor or the like via the cradle.

Here, the main body apparatus 2 can communicate with a plurality of left controllers 3 simultaneously (in other words, in parallel). Further, the main body apparatus 2 can communicate with a plurality of right controllers 4 simultaneously (in other words, in parallel). Thus, a plurality of users can simultaneously provide inputs to the main body apparatus 2, each using a set of the left controller 3 and the right controller 4. As an example, a first user can provide an input to the main body apparatus 2 using a first set of the left controller 3 and the right controller 4, and simultaneously, a second user can provide an input to the main body apparatus 2 using a second set of the left controller 3 and the right controller 4.

The main body apparatus 2 includes a touch panel controller 86, which is a circuit for controlling the touch panel 13. The touch panel controller 86 is connected between the touch panel 13 and the processor 81. Based on a signal from the touch panel 13, the touch panel controller 86 generates, for example, data indicating the position where a touch input is provided. Then, the touch panel controller 86 outputs the data to the processor 81.

Further, the display 12 is connected to the processor 81. The processor 81 displays a generated image (e.g., an image generated by executing the above information processing) and/or an externally acquired image on the display 12.

The main body apparatus 2 includes a codec circuit 87 and speakers (specifically, a left speaker and a right speaker) 88. The codec circuit 87 is connected to the speakers 88 and a sound input/output terminal 25 and also connected to the processor 81. The codec circuit 87 is a circuit for controlling the input and output of sound data to and from the speakers 88 and the sound input/output terminal 25.

Further, the main body apparatus 2 includes an acceleration sensor 89. In the exemplary embodiment, the acceleration sensor 89 detects the magnitudes of accelerations along predetermined three axial (e.g., xyz axes shown in FIG. 1) directions. It should be noted that the acceleration sensor 89 may detect an acceleration along one axial direction or accelerations along two axial directions. It should be noted that the acceleration sensor 89 corresponds to an example of an inertial sensor included in the information processing apparatus.

Further, the main body apparatus 2 includes an angular velocity sensor 90. In the exemplary embodiment, the angular velocity sensor 90 detects angular velocities about predetermined three axes (e.g., the xyz axes shown in FIG. 1). It should be noted that the angular velocity sensor 90 may detect an angular velocity about one axis or angular velocities about two axes. It should be noted that the angular velocity sensor 90 corresponds to another example of the inertial sensor included in the information processing apparatus.

The acceleration sensor 89 and the angular velocity sensor 90 are connected to the processor 81, and the detection results of the acceleration sensor 89 and the angular velocity sensor 90 are output to the processor 81. Based on the detection results of the acceleration sensor 89 and the angular velocity sensor 90, the processor 81 can calculate information regarding the motion and/or the orientation of the main body apparatus 2.
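
The paragraph above only states that motion and orientation information can be calculated from the two inertial sensors; it does not give a method. As a rough illustration only, a minimal complementary-filter sketch (function names, parameters, and the axis convention are assumptions, not taken from the patent) could look like the following:

```python
import math

def estimate_pitch(prev_pitch, accel, gyro_pitch_rate, dt, alpha=0.98):
    """Blend gyroscope integration with the accelerometer's gravity direction.

    prev_pitch      -- previous pitch estimate in radians
    accel           -- (ax, ay, az) accelerometer sample; z is assumed to point out of the screen
    gyro_pitch_rate -- angular velocity about the pitch axis in rad/s
    dt              -- time step in seconds
    alpha           -- weighting of the gyro term (hypothetical value)
    """
    ax, ay, az = accel
    # Pitch implied by gravity alone (meaningful only when the device is nearly static).
    accel_pitch = math.atan2(-ay, az)
    # Short-term: integrate the gyro; long-term: pull the estimate toward the accelerometer.
    gyro_pitch = prev_pitch + gyro_pitch_rate * dt
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch
```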

The main body apparatus 2 includes a power control section 97 and a battery 98. The power control section 97 is connected to the battery 98 and the processor 81. Further, although not shown in FIG. 6, the power control section 97 is connected to components of the main body apparatus 2 (specifically, components that receive power supplied from the battery 98, the left terminal 17, and the right terminal 21). Based on a command from the processor 81, the power control section 97 controls the supply of power from the battery 98 to the above components.

Further, the battery 98 is connected to the lower terminal 27. When an external charging device (e.g., the cradle) is connected to the lower terminal 27, and power is supplied to the main body apparatus 2 via the lower terminal 27, the battery 98 is charged with the supplied power.

FIG. 7 is a block diagram showing examples of the internal configurations of the main body apparatus 2, the left controller 3, and the right controller 4. It should be noted that the details of the internal configuration of the main body apparatus 2 are shown in FIG. 6 and therefore are omitted in FIG. 7.

The left controller 3 includes a communication control section 101, which communicates with the main body apparatus 2. As shown in FIG. 7, the communication control section 101 is connected to components including the terminal 42. In the exemplary embodiment, the communication control section 101 can communicate with the main body apparatus 2 through both wired communication via the terminal 42 and wireless communication not via the terminal 42. The communication control section 101 controls the method for communication performed by the left controller 3 with the main body apparatus 2. That is, when the left controller 3 is attached to the main body apparatus 2, the communication control section 101 communicates with the main body apparatus 2 via the terminal 42. Further, when the left controller 3 is detached from the main body apparatus 2, the communication control section 101 wirelessly communicates with the main body apparatus 2 (specifically, the controller communication section 83). The wireless communication between the communication control section 101 and the controller communication section 83 is performed in accordance with the Bluetooth (registered trademark) standard, for example.

Further, the left controller 3 includes a memory 102 such as a flash memory. The communication control section 101 includes, for example, a microcomputer (or a microprocessor) and executes firmware stored in the memory 102, thereby performing various processes.

The left controller 3 includes buttons 103 (specifically, the buttons 33 to 39, 43, 44, and 47). Further, the left controller 3 includes the analog stick (“stick” in FIG. 7) 32. Each of the buttons 103 and the analog stick 32 outputs information regarding an operation performed on itself to the communication control section 101 repeatedly at appropriate timing.

The left controller 3 includes inertial sensors. Specifically, the left controller 3 includes an acceleration sensor 104. Further, the left controller 3 includes an angular velocity sensor 105. In the exemplary embodiment, the acceleration sensor 104 detects the magnitudes of accelerations along predetermined three axial (e.g., xyz axes shown in FIG. 4) directions. It should be noted that the acceleration sensor 104 may detect an acceleration along one axial direction or accelerations along two axial directions. In the exemplary embodiment, the angular velocity sensor 105 detects angular velocities about predetermined three axes (e.g., the xyz axes shown in FIG. 4). It should be noted that the angular velocity sensor 105 may detect an angular velocity about one axis or angular velocities about two axes. Each of the acceleration sensor 104 and the angular velocity sensor 105 is connected to the communication control section 101. Then, the detection results of the acceleration sensor 104 and the angular velocity sensor 105 are output to the communication control section 101 repeatedly at appropriate timing.

The communication control section 101 acquires information regarding an input (specifically, information regarding an operation or the detection result of the sensor) from each of the input sections (specifically, the buttons 103, the analog stick 32, and the sensors 104 and 105). The communication control section 101 transmits operation data including the acquired information (or information obtained by performing predetermined processing on the acquired information) to the main body apparatus 2. It should be noted that the operation data is transmitted repeatedly, once every predetermined time. It should be noted that the interval at which the information regarding an input is transmitted from each of the input sections to the main body apparatus 2 may or may not be the same.
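
The patent does not define a packet format for the operation data; it only says the data bundles button, stick, and sensor information and is sent at a predetermined interval. A minimal sketch of such a report, with hypothetical field names and an assumed transmission period, might look like this:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class OperationData:
    """One controller report sent to the main body apparatus every fixed interval (names are illustrative)."""
    buttons: int                                   # bitmask of currently pressed buttons
    stick: Tuple[float, float]                     # analog stick tilt, each axis in [-1.0, 1.0]
    acceleration: Tuple[float, float, float]       # latest accelerometer sample (x, y, z)
    angular_velocity: Tuple[float, float, float]   # latest gyroscope sample (x, y, z)

SEND_INTERVAL_S = 1 / 200  # hypothetical period; the patent only says "once every predetermined time"
```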

The above operation data is transmitted to the main body apparatus 2, whereby the main body apparatus 2 can obtain inputs provided to the left controller 3. That is, the main body apparatus 2 can determine operations on the buttons 103 and the analog stick 32 based on the operation data. Further, the main body apparatus 2 can calculate information regarding the motion and/or the orientation of the left controller 3 based on the operation data (specifically, the detection results of the acceleration sensor 104 and the angular velocity sensor 105).

The left controller 3 includes a power supply section 108. In the exemplary embodiment, the power supply section 108 includes a battery and a power control circuit. Although not shown in FIG. 7, the power control circuit is connected to the battery and also connected to components of the left controller 3 (specifically, components that receive power supplied from the battery).

As shown in FIG. 7, the right controller 4 includes a communication control section 111, which communicates with the main body apparatus 2. Further, the right controller 4 includes a memory 112, which is connected to the communication control section 111. The communication control section 111 is connected to components including the terminal 64. The communication control section 111 and the memory 112 have functions similar to those of the communication control section 101 and the memory 102, respectively, of the left controller 3. Thus, the communication control section 111 can communicate with the main body apparatus 2 through both wired communication via the terminal 64 and wireless communication not via the terminal 64 (specifically, communication compliant with the Bluetooth (registered trademark) standard). The communication control section 111 controls the method for communication performed by the right controller 4 with the main body apparatus 2.

The right controller 4 includes input sections similar to the input sections of the left controller 3. Specifically, the right controller 4 includes buttons 113, the analog stick 52, and inertial sensors (an acceleration sensor 114 and an angular velocity sensor 115). These input sections have functions similar to those of the input sections of the left controller 3 and operate similarly to the input sections of the left controller 3.

The right controller 4 includes a power supply section 118. The power supply section 118 has a function similar to that of the power supply section 108 of the left controller 3 and operates similarly to the power supply section 108.

As described above, in the game system 1 according to the exemplary embodiment, the left controller 3 and the right controller 4 are attachable to and detachable from the main body apparatus 2. Further, the unified apparatus obtained by attaching the left controller 3 and the right controller 4 to the main body apparatus 2, or the main body apparatus 2 alone, is attached to the cradle and thereby can output an image (and a sound) to the stationary monitor 6. A description is given below of a use form of the game system in which an image (and a sound) is output from the main body apparatus 2 in the state where the left controller 3 and the right controller 4 are detached from the main body apparatus 2.

As described above, in the exemplary embodiment, the game system 1 can also be used in the state where the left controller 3 and the right controller 4 are detached from the main body apparatus 2 (referred to as a “separate state”). As a form in a case where an operation is performed on an application (e.g., a game application) using the game system 1 in the separate state, a form is possible in which two users use the left controller 3 and the right controller 4, respectively. Further, when three or more users perform operations using the same application, a form is possible in which a plurality of sets of the left controller 3 and the right controller 4 are prepared, and each user uses either one of the left controller 3 and the right controller 4. Further, as another form in which a plurality of users perform operations, a form is possible in which each user uses both the left controller 3 and the right controller 4. In this case, for example, a form is possible in which a plurality of sets of the left controller 3 and the right controller 4 are prepared, and each user uses one of the plurality of sets.

FIGS. 8 to 13 are diagrams showing examples of the state where, in the separate state, two users use the game system 1 with each user operating one of the left controller 3 and the right controller 4. As shown in FIGS. 8 to 13, in the separate state, a first user and a second user can view an image displayed on the main body apparatus 2 while performing operations, with the first user holding the left controller 3 with both hands and the second user holding the right controller 4 with both hands.

For example, in the exemplary embodiment, the first user holds the left controller 3 with both hands such that the longitudinal direction of the left controller 3 (the down direction shown in FIG. 1, i.e., a negative y-axis direction), which is vertically long and approximately plate-shaped, is a transverse and horizontal direction, a side surface of the left controller 3 that is in contact with the main body apparatus 2 when the left controller 3 is attached to the main body apparatus 2 (the side surface on which a slider 40 is provided) is directed forward, and the main surface of the left controller 3 (the surface on which the analog stick 32 and the like are provided) is directed upward. That is, the left controller 3 held with both hands of the first user is in the state where a negative x-axis direction is directed in the forward direction of the user, and a positive z-axis direction is directed upward. Further, the second user holds the right controller 4 with both hands such that the longitudinal direction of the right controller 4 (the down direction shown in FIG. 1, i.e., a negative y-axis direction), which is vertically long and approximately plate-shaped, is a transverse and horizontal direction, a side surface of the right controller 4 that is in contact with the main body apparatus 2 when the right controller 4 is attached to the main body apparatus 2 (the side surface on which a slider 62 is provided) is directed forward, and the main surface of the right controller 4 (the surface on which the analog stick 52 and the like are provided) is directed upward. That is, the right controller 4 held with both hands of the second user is in the state where a positive x-axis direction is directed in the forward direction of the user, and a positive z-axis direction is directed upward.

As described above, game play is performed in accordance with operations on the operation buttons or the stick of the left controller 3 or the right controller 4 held with both hands. In the state where the left controller 3 or the right controller 4 is held with both hands (hereinafter, such an operation method will occasionally be referred to as a “horizontally-held operation method”), each controller can be moved in the up, down, left, right, front, and back directions, rotated, or swung, whereby game play can also be performed in accordance with the motion or the orientation of the controller. Then, in the above game play, the acceleration sensor 104 of the left controller 3 can detect accelerations in the xyz-axis directions as operation inputs, and the angular velocity sensor 105 can detect angular velocities about the xyz-axis directions as operation inputs. Further, the acceleration sensor 114 of the right controller 4 can detect accelerations in the xyz-axis directions as operation inputs, and the angular velocity sensor 115 can detect angular velocities about the xyz-axis directions as operation inputs.

FIGS. 8 to 13 show examples of game images displayed in a game played by operating the left controller 3 or the right controller 4. As shown in FIGS. 8 and 9, in this exemplary game, an image of a game where a plurality of player characters (a first player character PC1 and a second player character PC2 in the examples of FIGS. 8 and 9) compete against each other (e.g., a board game or a baseball pinball where the team of the first player character PC1 and the team of the second player character PC2 play baseball against each other) is displayed on the main body apparatus 2. Then, the first user operating the left controller 3 can operate the first player character PC1 by operating the analog stick 32 and the operation buttons 33 to 36. Further, the second user operating the right controller 4 can operate the second player character PC2 by operating the analog stick 52 and the operation buttons 53 to 56. Further, the first user and the second user may perform operations by moving the left controller 3 and the right controller 4 themselves. It should be noted that each team may include a non-player character of which the action is automatically controlled by a computer (the processor 81 of the main body apparatus 2). It should be noted that the first player character PC1 corresponds to an example of a first operation target, and the second player character PC2 corresponds to an example of a second operation target.

In this exemplary game, a horizontal board is set in a virtual space, and a baseball ground is provided on the board. Then, a player (the first player character PC1 in FIG. 8) belonging to one team, which is the defensive team, rolls a ball-like virtual object OBJ on the board by performing a pitching action, and a player (the second player character PC2 in FIG. 8) belonging to the other team, which is the offensive team, hits back the virtual object OBJ with a bat on the board, whereby the game progresses. That is, in this exemplary game, the first player character PC1 and the second player character PC2 face each other on the board in the virtual space, and the second player character PC2 hits back the virtual object OBJ thrown by the first player character PC1. It should be noted that the pitching action may be the action of the character throwing a ball. However, since this exemplary game imitates a baseball board game, the pitching action may be, for example, a more board-game-like action in which the character moves a pitching machine on the board, thereby discharging a ball.

For example, the first user operating the defensive team performs a pressing operation on the operation buttons 33 to 36 of the left controller 3 or performs a tilt operation on the analog stick 32, and thereby can cause the first player character PC1 to perform the action of throwing the virtual object OBJ by rolling the virtual object OBJ on the board. As an example, after tilting the analog stick 32 in the up or down direction, the first user releases the analog stick 32 and thereby can cause the first player character PC1 to perform the pitching action. Further, the first user tilts the analog stick 32 in the left or right direction and thereby can cause the first player character PC1 to perform the pitching action by, for example, shifting or breaking the virtual object OBJ in the left or right direction as viewed from the first user.

On the other hand, the second user operating the offensive team performs a pressing operation on the operation buttons 53 to 56 of the right controller 4 or performs a tilt operation on the analog stick 52, and thereby can cause the second player character PC2 to perform the action of hitting back the virtual object OBJ on the board. As an example, the second user presses the A-button 53 and thereby can cause the second player character PC2 to hit back with a bat the virtual object OBJ thrown by the first player character PC1.

Here, in the exemplary embodiment, the game in a first mode and the game in a second mode can be performed. For example, in the exemplary embodiment, in a case where the main body apparatus 2 is in the orientation in which the display 12 is closer to vertical than a predetermined reference (the depth direction of the display 12 is close to horizontal) (hereinafter referred to as a “vertically-placed state”), the game is performed in the first mode. Further, in the exemplary embodiment, in a case where the main body apparatus 2 is in the orientation in which the display 12 is closer to horizontal than the predetermined reference (the depth direction of the display 12 is close to vertical) (hereinafter referred to as a “horizontally-placed state”), the game is performed in the second mode. It should be noted that the orientation of the information processing apparatus in which the screen is closer to vertical than a predetermined reference corresponds to, as an example, the vertically-placed state of the main body apparatus 2. Further, the orientation of the information processing apparatus in which the screen is closer to horizontal than the predetermined reference corresponds to, as an example, the horizontally-placed state of the main body apparatus 2.
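
One plausible way to realize the vertical/horizontal decision described above is to compare the angle between the screen normal and the vertical (gravity) direction against the predetermined reference. The sketch below assumes a gravity vector expressed in the display's coordinate system, with the z axis normal to the screen, and a 45-degree reference; both are illustrative assumptions, not values taken from the patent:

```python
import math

REFERENCE_ANGLE_DEG = 45.0  # hypothetical; the patent only speaks of a "predetermined reference"

def select_mode(gravity):
    """Return 'first' (vertically-placed state) or 'second' (horizontally-placed state).

    gravity -- (gx, gy, gz) gravity direction in the display's coordinate frame,
               with the z axis normal to the screen (an assumed convention).
    """
    gx, gy, gz = gravity
    norm = math.sqrt(gx * gx + gy * gy + gz * gz)
    # Angle between the screen normal and the vertical direction.
    tilt_deg = math.degrees(math.acos(abs(gz) / norm))
    # A small angle means the screen normal is nearly vertical, i.e. the screen
    # itself is close to horizontal -> second mode; otherwise -> first mode.
    return "second" if tilt_deg < REFERENCE_ANGLE_DEG else "first"
```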

It should be noted that when an image to be displayed on the display 12 is output to an external device, the first mode may be set. In this case, when the main body apparatus 2 is in the vertically-placed state, or when the main body apparatus 2 is connected to an external device and the image is output to that external device, the game is performed in the first mode. Further, when the main body apparatus 2 is in the horizontally-placed state, is not connected to an external device, and does not output the image to an external device, the game is performed in the second mode. As described above, when the main body apparatus 2 is mounted on the cradle, the game system 1 can display on the stationary monitor an image generated by and output from the main body apparatus 2, and can switch between the first mode and the second mode based on the presence or absence of a connection between the main body apparatus 2 and the cradle. When the main body apparatus 2 outputs an image to an external device, the external device displays a game image in the first mode. In the following description, however, an example is used where, in accordance with the orientation of the main body apparatus 2, a game image is displayed on the display 12 of the main body apparatus 2 by switching between the first mode and the second mode. It should be noted that when an external video device different from the display 12 is connected to the main body apparatus 2, a video output section outputs a video to the external video device. The video output section corresponds to, as an example, the lower terminal 27, which is connected to the cradle and outputs an image to an external device.
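
Combining the two criteria in this paragraph (video output connection status and orientation), a hedged sketch of the mode decision might be written as follows (the function and argument names are illustrative; `select_mode` refers to the orientation sketch above):

```python
def select_display_mode(video_output_connected, gravity):
    """Pick the game mode from the cradle/monitor connection status and the orientation test.

    video_output_connected -- True when an external video device is detected on the
                              video output section (e.g., via the cradle)
    gravity                -- gravity vector passed to the orientation check sketched earlier
    """
    if video_output_connected or select_mode(gravity) == "first":
        return "first"   # image shown on the stationary monitor or on an upright screen
    return "second"      # lying flat on a table, displayed only on the built-in display
```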

FIGS. 8 and 9 show examples of operations using the main body apparatus 2 placed in the vertically-placed state, in which a game image in the first mode is displayed on the display 12 of the main body apparatus 2. As shown in FIG. 8, in the first mode, it is assumed that the first user and the second user perform operations side by side at a location near the front of the display 12 of the main body apparatus 2 placed in the vertically-placed state. Then, as shown in FIGS. 8 and 9, in the first mode, a game image including the first player character PC1 and the second player character PC2 is displayed on the main body apparatus 2. Specifically, in the first mode, a virtual camera is placed at the position where the first player character PC1 and the second player character PC2 are viewed from behind the second player character PC2 belonging to the offensive team, and a game image viewed from the virtual camera is displayed on the main body apparatus 2. For example, in the case of a baseball board game (a table baseball game or a baseball pinball), in the first mode, a virtual camera viewing the first player character PC1 as a pitcher and the second player character PC2 as a batter from a backstop direction is set. Consequently, in the first mode, the virtual camera is placed on an extension of the path of the virtual object OBJ thrown by the first player character PC1, or above that extension. Thus, a game image along the line of sight from the second player character PC2 as the batter to the first player character PC1 as the pitcher is displayed. Thus, in the first mode, a game image in which the second player character PC2, placed on the near side of the game image, hits back the virtual object OBJ while opposed to the first player character PC1 throwing the virtual object OBJ from the far side of the game image is displayed on the main body apparatus 2. As an example, when the main body apparatus 2 is placed such that the negative y-axis direction of the main body apparatus 2 is the vertical direction, the virtual object OBJ thrown by the first player character PC1 from near the center of the display screen is displayed moving to the near side, in a lower direction of the display screen and toward the second player character PC2 (the negative y-axis direction and a pitching direction shown in FIGS. 8 and 9).
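
As a loose illustration of the first-mode camera placement described above (behind the batter, looking toward the pitcher from the backstop direction), the following sketch computes a camera position and look-at point. The offsets and the assumption that the y axis is the world "up" direction are illustrative only and not taken from the patent:

```python
def place_first_mode_camera(pc1_pos, pc2_pos, height=2.0, back_offset=3.0):
    """Place the first-mode camera behind PC2 (batter), looking toward PC1 (pitcher).

    pc1_pos, pc2_pos -- (x, y, z) world positions; y is assumed to be 'up'.
    """
    dx = pc1_pos[0] - pc2_pos[0]
    dz = pc1_pos[2] - pc2_pos[2]
    length = (dx * dx + dz * dz) ** 0.5
    ux, uz = dx / length, dz / length          # unit vector from batter toward pitcher
    cam_pos = (pc2_pos[0] - ux * back_offset,  # a little behind the batter
               pc2_pos[1] + height,            # slightly above the board
               pc2_pos[2] - uz * back_offset)
    look_at = pc1_pos                          # line of sight runs from batter toward pitcher
    return cam_pos, look_at
```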

Further, as shown in FIGS. 8 and 9, an information image I is displayed in the game image in the first mode. The information image I indicates information regarding the played game, the player characters, and the like, using a letter, a number, a sign, an indicator, and the like. The information image I may inform both the first user and the second user of the information, or may inform only one of the first user and the second user of the information. For example, in the examples shown in FIGS. 8 and 9, two information images I1 and I2 indicating the situation of the played game are displayed. Specifically, the information image I1 is an image which informs the first user and the second user of the score situations of both teams using a letter and a number and to which an indicator indicating the currently offensive team is assigned. The information image I2 is an image that informs the first user and the second user of the situation of the current inning (e.g., the number of strikes, the number of outs, and the like) using a letter and a sign. Then, both the information images I1 and I2 are displayed such that the direction from top to bottom of a letter or a number is the down direction of the main body apparatus 2 (the negative y-axis direction, and the pitching direction shown in FIGS. 8 and 9). Thus, the information images I1 and I2 are displayed so as to be easy to read for the first user and the second user performing operations at the location near the front of the display 12 of the main body apparatus 2. It should be noted that a first information image corresponds to, as an example, the information images I1 and I2.

Further, in the first mode, the left direction as viewed from the first user and the second user is the positive x-axis direction of the main body apparatus 2, and the right direction as viewed from the first user and the second user is the negative x-axis direction of the main body apparatus 2. Then, the directions of direction inputs using the left controller 3 and the right controller 4 (e.g., tilt operation inputs using the analog stick 32 and the analog stick 52) are also associated based on these directions. That is, when the first user and the second user provide direction inputs of the left direction using the left controller 3 and the right controller 4 (e.g., inputs of tilting the analog stick 32 and the analog stick 52 to the left), these are direction inputs of the positive x-axis direction of the main body apparatus 2 and are associated with the direction from a first base to a third base in the virtual space (the right direction as viewed from the first player character PC1 placed facing the virtual camera, and the left direction as viewed from the second player character PC2 placed with its back against the virtual camera). When the first user wishes to pitch the virtual object OBJ from the right as viewed from the first user, the first user provides a direction input of the right direction using the left controller 3 (e.g., an input of tilting the analog stick 32 to the right). Consequently, a movement start position of the virtual object OBJ moves in the left direction as viewed from the first player character PC1. Further, when the first user wishes to pitch the virtual object OBJ from the left as viewed from the first user, the first user provides a direction input of the left direction using the left controller 3 (e.g., an input of tilting the analog stick 32 to the left). Consequently, the movement start position of the virtual object OBJ moves in the right direction as viewed from the first player character PC1. In the case of a full-fledged baseball game, not a baseball board game, when the first user wishes to break the virtual object OBJ thrown by the first player character PC1 to the right as viewed from the first user, the first user provides a direction input of the right direction using the left controller 3 (e.g., an input of tilting the analog stick 32 to the right). Consequently, the virtual object OBJ moves along the trajectory in which the virtual object OBJ breaks in the left direction as viewed from the first player character PC1 that is right-handed (a curveball or a slider). Further, when the first user wishes to break the virtual object OBJ thrown by the first player character PC1 to the left as viewed from the first user, the first user provides a direction input of the left direction using the left controller 3 (e.g., an input of tilting the analog stick 32 to the left). Consequently, the virtual object OBJ moves along the trajectory in which the virtual object OBJ breaks in the right direction as viewed from the first player character PC1 that is right-handed (a shootball or a screwball).
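
The first-mode association just described (the users' left corresponds to the positive x-axis direction of the apparatus, i.e., the first-base-to-third-base direction) could be captured, purely as an illustration with hypothetical names and sign conventions, as:

```python
def first_mode_world_direction(stick_x):
    """Map the first user's stick tilt to a direction along the base line in the first mode.

    stick_x -- tilt in [-1.0, 1.0]; negative = left, positive = right (assumed convention).
    Returns 'third_base' (PC1's right), 'first_base' (PC1's left), or None for no tilt.
    """
    if stick_x < 0.0:
        return "third_base"   # user's left -> apparatus positive x -> from first base toward third base
    if stick_x > 0.0:
        return "first_base"   # user's right -> apparatus negative x -> from third base toward first base
    return None
```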

FIGS. 10 and 11 show examples of operations using the main body apparatus 2 placed in the horizontally-placed state, in which a game image in the second mode is displayed on the display 12 of the main body apparatus 2. As shown in FIG. 10, in the second mode, it is assumed that the first user performs an operation on the right side (further in the negative x-axis direction) of the display 12 of the main body apparatus 2 placed in the horizontally-placed state, the second user performs an operation on the left side (further in the positive x-axis direction), and the first user and the second user perform operations facing each other across the main body apparatus 2. Then, as shown in FIGS. 10 and 11, also in the second mode, a game image including the first player character PC1 and the second player character PC2 is displayed on the main body apparatus 2.

In the second mode, a virtual camera is placed such that the line-of-sight direction of the virtual camera is further downward in the virtual space than the line-of-sight direction set in the first mode. Specifically, in the second mode, the virtual camera is placed at a viewpoint looking down on the first player character PC1 and the second player character PC2 (e.g., a bird's-eye viewpoint or an overhead viewpoint), and a game image viewed from the virtual camera is displayed on the main body apparatus 2. For example, in the case of a baseball board game (a table baseball game), in the second mode, a virtual camera looking down on the first player character PC1 as a pitcher and the second player character PC2 as a batter, with the entirety of the ballpark as its field of view, is set. Consequently, in the second mode, a game image in which the virtual camera is placed over the head of the first player character PC1 or the second player character PC2 is displayed. Thus, in the second mode, a game image in which the second player character PC2, placed near one end (e.g., near the end in the positive x-axis direction) of the game image, hits back the virtual object OBJ while opposed to the first player character PC1 throwing the virtual object OBJ from near the center of the game image is displayed on the main body apparatus 2. As an example, the virtual object OBJ thrown by the first player character PC1 from near the center of the display screen is displayed moving parallel to the display screen, toward the left of the display screen and toward the second player character PC2 (the positive x-axis direction and the pitching direction shown in FIGS. 10 and 11).
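
A comparable sketch for the second-mode camera, whose line of sight is set further downward than in the first mode, might simply raise the camera above the center of the board and point it straight down; the height value and the up-axis convention are illustrative assumptions:

```python
def place_second_mode_camera(board_center, height=12.0):
    """Overhead (bird's-eye) camera: positioned above the board, looking straight down."""
    cx, cy, cz = board_center            # y is assumed to be the world 'up' axis
    cam_pos = (cx, cy + height, cz)      # well above the board so the whole ballpark is in view
    look_at = board_center               # line of sight points further downward than in the first mode
    return cam_pos, look_at
```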

Further, as shown in FIGS. 10 and 11, the information image I is also displayed in the game image in the second mode. The information image I indicates information regarding the played game, the player characters, and the like, using a letter, a number, a sign, an indicator, and the like. Similarly to the first mode, the information image I displayed in the second mode may inform both the first user and the second user of the information, or may inform only one of the first user and the second user of the information. For example, in the examples shown in FIGS. 10 and 11, sets of information images I1 and I2 indicating the situation of the played game are displayed on the side where the first user plays and on the side where the second user plays. Specifically, in the second mode, one set including an information image I1a and an information image I2a indicating the situation of the played game is displayed in one end portion of the display screen that is on the side where the first user plays (e.g., further in the negative x-axis direction of the main body apparatus 2). Further, the other set, including an information image I1b, which is the same image as the information image I1a, and an information image I2b, which is the same image as the information image I2a, is displayed in the other end portion of the display screen that is on the side where the second user plays (e.g., further in the positive x-axis direction of the main body apparatus 2). Specifically, similarly to the first mode, the information images I1a and I1b are images which inform the first user and the second user of the score situations of both teams using a letter and a number and to which an indicator indicating the currently offensive team is assigned. Then, the information images I2a and I2b are images that inform the first user and the second user of the situation of the current inning (e.g., the number of strikes, the number of outs, and the like) using a letter and a sign.

Here, in the second mode, the information images I1a and I1b are placed in different directions in the respective end portions of the display screen. More specifically, the information image I1a is displayed in a right end portion of the display screen such that the direction from top to bottom of a letter or a number is the right direction of the main body apparatus 2 (the negative x-axis direction, and a direction opposite to the pitching direction shown in FIGS. 10 and 11). Thus, the information image I1a is displayed so as to be easy to read for the first user performing an operation from the right side of the main body apparatus 2 in the horizontally-placed state. Further, the information image I1b is displayed in a left end portion of the display screen such that the direction from top to bottom of a letter or a number is the left direction of the main body apparatus 2 (the positive x-axis direction, and the pitching direction shown in FIGS. 10 and 11). Thus, the information image I1b is displayed so as to be easy to read for the second user performing an operation from the left side of the main body apparatus 2 in the horizontally-placed state. Similarly, in the second mode, the information images I2a and I2b are also placed in different directions in the respective end portions of the display screen. More specifically, the information image I2a is displayed in the right end portion of the display screen such that the direction from top to bottom of a letter or a number is the right direction of the main body apparatus 2 (the negative x-axis direction, and the direction opposite to the pitching direction shown in FIGS. 10 and 11). Thus, the information image I2a is displayed so as to be easy to read for the first user performing an operation from the right side of the main body apparatus 2 in the horizontally-placed state. Further, the information image I2b is displayed in the left end portion of the display screen such that the direction from top to bottom of a letter or a number is the left direction of the main body apparatus 2 (the positive x-axis direction, and the pitching direction shown in FIGS. 10 and 11). Thus, the information image I2b is displayed so as to be easy to read for the second user performing an operation from the left side of the main body apparatus 2 in the horizontally-placed state.
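
The per-mode placement and orientation of the information images described above could be summarized, as a hedged sketch with hypothetical anchor names and rotation conventions, as follows:

```python
def info_image_layout(mode):
    """Return (image, anchor, rotation_degrees) entries for the information images.

    Rotation 0 keeps the text upright for users in front of the screen (first mode);
    +90 / -90 turn the text toward the right or left long edge so that the user seated
    on that side of the flat screen can read it (second mode). Values are illustrative.
    """
    if mode == "first":
        # Single copies of I1 and I2, readable by both users facing the upright screen.
        return [("I1", "top_center", 0), ("I2", "top_center", 0)]
    # Second mode: duplicated copies, one set per long edge, each rotated toward its user.
    return [("I1a", "right_edge", 90), ("I2a", "right_edge", 90),    # first user's side
            ("I1b", "left_edge", -90), ("I2b", "left_edge", -90)]    # second user's side
```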

Further, in the second mode, the left direction as viewed from the first user is the negative y-axis direction of the main body apparatus 2, and the right direction as viewed from the first user is the positive y-axis direction of the main body apparatus 2. Then, the direction of a direction input using the left controller 3 by the first user (e.g., a tilt operation input using the analog stick 32) is also associated based on this direction. That is, when the first user provides a direction input of the left direction using the left controller 3 (e.g., an input of tilting the analog stick 32 to the left), a direction input of the negative y-axis direction of the main body apparatus 2 is provided and associated with the direction from the third base to the first base in the virtual space (the left direction as viewed from the first player character PC1, and the right direction as viewed from the second player character PC2 placed facing the first player character PC1). When the first user wishes to pitch the virtual object OBJ from the left as viewed from the first user, the first user provides a direction input of the left direction using the left controller 3 (e.g., an input of tilting the analog stick 32 to the left). Consequently, the movement start position of the virtual object OBJ moves in the left direction as viewed from the first player character PC1. Further, when the first user wishes to pitch the virtual object OBJ from the right as viewed from the first user, the first user provides a direction input of the right direction using the left controller 3 (e.g., an input of tilting the analog stick 32 to the right). Consequently, the movement start position of the virtual object OBJ moves in the right direction as viewed from the first player character PC1. In the case of a full-fledged baseball game, not a baseball board game, when the first user wishes to break the virtual object OBJ thrown by the first player character PC1 to the left as viewed from the first user, the first user provides a direction input of the left direction using the left controller 3 (e.g., an input of tilting the analog stick 32 to the left). Consequently, the virtual object OBJ moves along the trajectory in which the virtual object OBJ breaks in the left direction as viewed from the first player character PC1 that is right-handed (a curveball or a slider). Further, when the first user wishes to break the virtual object OBJ thrown by the first player character PC1 to the right as viewed from the first user, the first user provides a direction input of the right direction using the left controller 3 (e.g., an input of tilting the analog stick 32 to the right). Consequently, the virtual object OBJ moves along the trajectory in which the virtual object OBJ breaks in the right direction as viewed from the first player character PC1 that is right-handed (a shootball or a screwball).

As is clear from the relationship between an operation in the left direction performed by the first user and the virtual space, the association between a left direction input to the direction input section (e.g., the analog stick32) of the left controller3and a direction in the virtual space by the first user differs between the first mode and the second mode. Specifically, in the first mode, the first user provides a direction input of the left direction using the left controller3, whereby, for example, the virtual object OBJ shifts or breaks in the right direction as viewed from the first player character PC1. In contrast, in the second mode, the first user provides a direction input of the left direction using the left controller3, whereby, for example, the virtual object OBJ shifts or breaks in the left direction as viewed from the first player character PC1. Thus, opposite directions are associated in the virtual space between the first mode and the second mode.

Further, when the first user provides a direction input of the right direction using the left controller3(e.g., an input of tilting the analog stick32to the right), a direction input of the positive y-axis direction of the main body apparatus2is provided and associated with the direction from the first base to the third base in the virtual space (the right direction as viewed from the first player character PC1, and the left direction as viewed from the second player character PC2placed facing the first player character PC1). Thus, when the first user wishes to shift or break the virtual object OBJ thrown by the first player character PC1to the right as viewed from the first user, the first user provides a direction input of the right direction using the left controller3(e.g., an input of tilting the analog stick32to the right). Consequently, the virtual object OBJ moves along the trajectory in which the virtual object OBJ shifts or breaks in the right direction as viewed from the first player character PC1that is right-handed (a shootball or a screwball).

As is clear from the relationship between an operation in the right direction performed by the first user and the virtual space, the association between a direction input to the direction input section (e.g., the analog stick32) of the left controller3by the first user and a direction in the virtual space differs between the first mode and the second mode. Specifically, in the first mode, the first user provides a direction input of the right direction using the left controller3, whereby, for example, the virtual object OBJ shifts or breaks in the left direction as viewed from the first player character PC1. In contrast, in the second mode, the first user provides a direction input of the right direction using the left controller3, whereby, for example, the virtual object OBJ shifts or breaks in the right direction as viewed from the first player character PC1. Thus, opposite directions are associated in the virtual space between the first mode and the second mode.

It should be noted that in the above description, an example is used where the association between a direction input to the direction input section (e.g., the analog stick32) of the left controller3by the first user and a direction in the virtual space differs between the first mode and the second mode. In the second mode, however, for the second user performing an operation while facing the first user across the main body apparatus2, the association may be the same. That is, in the exemplary embodiment, regarding at least one of an operation on the direction input section (e.g., the analog stick32) of the left controller3by the first user and an operation on the direction input section (e.g., the analog stick52) of the right controller4by the second user, the association between a direction input to the direction input section and a direction in the virtual space may be changed between the first mode and the second mode. For example, it is understood that in the above exemplary embodiment, the association between a direction input to the direction input section (e.g., the analog stick52) of the right controller4by the second user and a direction in the virtual space is the same between the first mode and the second mode. Specifically, in the first mode, the second user provides a direction input of the right direction using the right controller4, whereby a direction in the virtual space in the right direction as viewed from the second player character PC2is provided. Then, also in the second mode, the second user provides a direction input of the right direction using the right controller4, whereby a direction in the virtual space in the same right direction as viewed from the second player character PC2is provided. Thus, regarding the second user, the association is defined in the same direction in the virtual space between the first mode and the second mode.
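The mode-dependent association described above can be pictured with a small sketch. The following C++ fragment is illustrative only and is not the claimed implementation; the enum names, the function name mapStickToPc1View, and the sign conventions are assumptions introduced here. It maps a left/right tilt of a direction input section to a direction as viewed from the first player character PC1, inverting the mapping for the first user between the first and second modes while keeping it unchanged for the second user.

#include <iostream>

// Minimal sketch, not the claimed implementation.
// Output convention: +1.0 means the right direction as viewed from the first
// player character PC1, -1.0 means the left direction as viewed from PC1.
enum class Mode { First, Second };   // vertically placed / horizontally placed
enum class User { First, Second };   // left controller side / right controller side

// stickX: -1.0 = stick tilted left, +1.0 = stick tilted right, as seen by the user.
double mapStickToPc1View(Mode mode, User user, double stickX) {
    if (user == User::Second) {
        // For the second user the association stays the same in both modes:
        // the second user's right is PC1's left, since PC2 is placed facing PC1.
        return -stickX;
    }
    // For the first user the association is inverted between the two modes.
    return (mode == Mode::First) ? -stickX : stickX;
}

int main() {
    // The first user tilts the stick to the left (-1.0):
    std::cout << mapStickToPc1View(Mode::First,  User::First, -1.0) << '\n'; // 1  (right of PC1)
    std::cout << mapStickToPc1View(Mode::Second, User::First, -1.0) << '\n'; // -1 (left of PC1)
}

The flipped branch for the first user simply reflects that, in the second mode, the first user views the field from the side of the horizontally placed screen rather than from in front of it.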

Further, as an example of the association between a direction input to the above direction input section and a direction in the virtual space, the direction in which the virtual object OBJ shifts or breaks is used. However, it goes without saying that the same association is also applicable to a direction in another type of control in the virtual space. For example, a similar change may be applied to the association between a direction input to the above direction input section and the moving direction or the placement direction of the first player character PC1 or the second player character PC2 in the virtual space.

Further, in the above exemplary embodiment, as an example of the information image I displayed in the first mode and the second mode, the information images I1and I2that inform both the first user and the second user of the information are used. Alternatively, the information image I may be displayed in the state where the information image I includes an information image that informs one of the first user and the second user of the information in the first mode and the second mode. Yet alternatively, only an information image that informs one of the first user and the second user of the information may be displayed. With reference toFIGS. 12 and 13, a description is given below of an example where the information image I is displayed in the state where the information image I includes an information image that informs one of the first user and the second user of the information in the first mode and the second mode.

In FIG. 12, in the game image in the first mode, in addition to the above information images I1 and I2, information images I3 and I4 are displayed. Similarly to the information images I1 and I2, the information images I3 and I4 also indicate information regarding the played game, the player characters, and the like using a letter, a number, a sign, an indicator, and the like. It should be noted that, as another example, a first information image corresponds to the information image I3. Further, when the first information image corresponds to the information image I3, a second information image corresponds to the information image I4 as an example.

For example, the information image I3 informs the first user of information. Specifically, the information image I3 informs the first user of an operation method for operating the left controller 3. The information image I3 informs the first user that, by tilting the analog stick 32 of the left controller 3 to the right, the first player character PC1 can throw a curveball in which the virtual object OBJ shifts or breaks to the right as viewed from the first user (the left as viewed from the first player character PC1). Further, the information image I3 informs the first user that, by tilting the analog stick 32 of the left controller 3 to the left, the first player character PC1 operated by the first user can throw a shootball in which the virtual object OBJ shifts or breaks to the left as viewed from the first user (the right as viewed from the first player character PC1).

Further, the information image I4 informs the second user of information. Specifically, the information image I4 informs the second user of an operation method for operating the right controller 4. The information image I4 informs the second user that, by performing a pressing operation on the A-button 53 of the right controller 4, the second player character PC2 operated by the second user swings.

In the first mode, similarly to the information images I1 and I2, the information images I3 and I4 are displayed such that the direction from top to bottom of a letter or a number is the down direction of the main body apparatus 2 (the negative y-axis direction, and the pitching direction shown in FIG. 12). Thus, in the first mode, similarly to the information images I1 and I2, the information image I3 is displayed so as to be easy for the first user, performing an operation at the location near the front of the display 12 of the main body apparatus 2, to read. Further, in the first mode, similarly to the information images I1 and I2, the information image I4 is displayed so as to be easy for the second user, performing an operation at the location near the front of the display 12 of the main body apparatus 2, to read.

Further, in FIG. 13, also in the game image in the second mode, in addition to the above information images I1 and I2, the information images I3 and I4 are displayed. Similarly to the first mode, the information image I3 displayed in the second mode informs the first user of the information. For example, in the example shown in FIG. 13, in addition to the set of the information images I1a and I2a, the information image I3 is displayed on the side where the first user plays. Specifically, in the second mode, an information image group including the information images I1a and I2a and the information image I3 is displayed in one end portion of the display screen that is on the side where the first user plays (e.g., further in the negative x-axis direction of the main body apparatus 2).

Similarly to the first mode, the information image I4 displayed in the second mode informs the second user of the information. For example, in the example shown in FIG. 13, in addition to the set of the information images I1b and I2b, the information image I4 is displayed on the side where the second user plays. Specifically, in the second mode, an information image group including the information images I1b and I2b and the information image I4 is displayed in the other end portion of the display screen that is on the side where the second user plays (e.g., further in the positive x-axis direction of the main body apparatus 2).

Here, in the second mode, the information images I3 and I4 are placed in different directions in the respective end portions of the display screen. More specifically, the information image I3 is displayed in a right end portion of the display screen such that the direction from top to bottom of a letter or a number is the right direction of the main body apparatus 2 (the negative x-axis direction, and a direction opposite to the pitching direction shown in FIGS. 12 and 13). Thus, the information image I3 is displayed so as to be easy for the first user, performing an operation from the right side of the main body apparatus 2 in the horizontally-placed state, to read. Further, the information image I4 is displayed in a left end portion of the display screen such that the direction from top to bottom of a letter or a number is the left direction of the main body apparatus 2 (the positive x-axis direction, and the pitching direction shown in FIGS. 12 and 13). Thus, the information image I4 is displayed so as to be easy for the second user, performing an operation from the left side of the main body apparatus 2 in the horizontally-placed state, to read.

It should be noted that the above information image I (the information images I1, I2, I3, and I4) may be displayed by being placed in the virtual space, or may be displayed by being combined with a virtual space image. In the first case, in the virtual space where the first player character PC1, the second player character PC2, the virtual object OBJ, and the like are placed, a plate-shaped polygon having a main surface perpendicular to the line-of-sight direction of the virtual camera is placed, and the information image I is pasted to the main surface of the polygon. An image including the information image I is then rendered, whereby it is possible to display on the display 12 a game image in which the information image I is placed. In the second case, the information image I is combined in a superimposed manner with a virtual space image in which the first player character PC1, the second player character PC2, the virtual object OBJ, and the like are rendered, whereby it is possible to display a game image on the display 12.
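As a rough illustration of the second approach (superimposing the information image on an already-rendered virtual space image), the following sketch composites one RGBA buffer onto another. The Image struct, the pixel layout, and the overlayInfoImage function are assumptions made for this example and are not taken from the description above.

#include <cstdint>
#include <vector>

// Minimal sketch of compositing an information image over a rendered frame.
struct Image {
    int width = 0, height = 0;
    std::vector<std::uint32_t> rgba;                 // width * height pixels
    std::uint32_t& at(int x, int y) { return rgba[y * width + x]; }
};

// Copies the information image onto the frame at (dstX, dstY), skipping fully
// transparent pixels (alpha stored in the low byte is an assumption here).
void overlayInfoImage(Image& frame, const Image& info, int dstX, int dstY) {
    for (int y = 0; y < info.height; ++y) {
        for (int x = 0; x < info.width; ++x) {
            std::uint32_t px = info.rgba[y * info.width + x];
            if ((px & 0xFF) == 0) continue;          // transparent: keep the 3D scene
            int fx = dstX + x, fy = dstY + y;
            if (fx >= 0 && fx < frame.width && fy >= 0 && fy < frame.height)
                frame.at(fx, fy) = px;
        }
    }
}

Rotating the information image before compositing it (as required in the second mode) would happen upstream of this step.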

Further, in the above second mode, when the offense and defense of the player characters switch, i.e., when the first player character PC1 switches from the pitcher to the batter and the second player character PC2 switches from the batter to the pitcher, as an example, the game may be performed by maintaining the direction of the virtual space displayed on the main body apparatus 2 and rotating the main body apparatus 2 by 180 degrees while it remains in the horizontally-placed state. As another example, the game may be performed with the main body apparatus 2 remaining in the same state by rotating the front, back, left, and right directions of the virtual space displayed on the main body apparatus 2 by 180 degrees. In either example, the setting of the association between a direction input to the direction input section of each controller and a direction in the virtual space may be changed in accordance with the placement positions or the directions of the player characters after the offense and defense switch.

Next, with reference toFIGS. 14 and 15, a description is given of an example where the first mode or the second mode is set based on the state of the main body apparatus2. It should be noted thatFIG. 14is a diagram showing an example of a state determination method in a case where the orientation of the main body apparatus2is changed from the horizontally-placed state to the vertically-placed state.FIG. 15is a diagram showing an example of a state determination method in a case where the orientation of the main body apparatus2is changed from the vertically-placed state to the horizontally-placed state. BothFIGS. 14 and 15show the states where the surface of the display12of the main body apparatus2is directed upward, and show diagrams viewed from the side surface further in the positive x-axis direction (the side surface on the left side in the front view inFIG. 3).

In the exemplary embodiment, using a gravitational acceleration acting on the main body apparatus 2, it is determined whether the main body apparatus 2 is in the horizontally-placed state or the vertically-placed state. Here, as described above, the main body apparatus 2 includes inertial sensors (the acceleration sensor 89 and/or the angular velocity sensor 90). Based on the detection results of the inertial sensors (accelerations along the xyz axis directions detected by the acceleration sensor 89 and/or angular velocities about the xyz axes detected by the angular velocity sensor 90), and using any method, the main body apparatus 2 can calculate a gravity vector indicating the gravitational acceleration acting on the main body apparatus 2, as well as a y-axis direction component and a z-axis direction component of the gravity vector. For example, as shown in FIG. 14, in the exemplary embodiment, the gravity vector acting on the main body apparatus 2 is calculated, and the y-axis direction component and the z-axis direction component of the gravity vector are extracted. Then, in a case where the main body apparatus 2 is determined to be in the horizontally-placed state, when the magnitude of the y-axis direction component of the gravity vector changes from less than a first threshold to greater than or equal to the first threshold, it is determined that the main body apparatus 2 has changed from the horizontally-placed state to the vertically-placed state. It should be noted that, as in the example described above, the determination of whether the main body apparatus 2 is in the horizontally-placed state or the vertically-placed state may be made based on the magnitude of the y-axis direction component of the gravity vector, i.e., the absolute value of the y-axis direction component. As another example, it is possible to make the determination that the main body apparatus 2 has changed from the horizontally-placed state to the vertically-placed state by taking into account the sign (positive or negative) of the value of the y-axis direction component of the gravity vector. As shown in FIG. 14, in a case where the positive y-axis direction is the up direction when the main body apparatus 2 is in the vertically-placed state, the y-axis direction component of the gravity vector has a negative value. Thus, when the value of the y-axis direction component of the gravity vector changes to less than or equal to the first threshold, it is determined that the main body apparatus 2 has changed from the horizontally-placed state to the vertically-placed state.

Further, as shown in FIG. 15, in a case where the main body apparatus 2 is determined to be in the vertically-placed state, when the magnitude of the y-axis direction component of the gravity vector changes from greater than a second threshold to less than or equal to the second threshold, it is determined that the main body apparatus 2 has changed from the vertically-placed state to the horizontally-placed state. It should be noted that, in the other example described above in which the sign of the value of the y-axis direction component of the gravity vector is taken into account, the y-axis direction component of the gravity vector has a negative value while the main body apparatus 2 is in the vertically-placed state. Thus, in that example, when the value of the y-axis direction component of the gravity vector changes to greater than or equal to the second threshold, it is determined that the main body apparatus 2 has changed from the vertically-placed state to the horizontally-placed state.

It should be noted that the magnitude (the absolute value) of the first threshold may be set to be greater than the magnitude (the absolute value) of the second threshold. When the magnitude (the absolute value) of the first threshold and the magnitude (the absolute value) of the second threshold are set to be the same, and the determination is made using a single threshold, then in the state where the y-axis direction component of the gravity vector transitions near that threshold, it is possible that the vertically-placed state and the horizontally-placed state frequently switch. In contrast, when the magnitude (the absolute value) of the first threshold is set to be greater than the magnitude (the absolute value) of the second threshold, after it is determined once based on one of the thresholds that the main body apparatus 2 is in one of the states, the determination switches to a determination based on the other threshold. Thus, it is possible to prevent the determination result from frequently switching between the vertically-placed state and the horizontally-placed state.
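The following is a minimal sketch of this hysteresis, assuming the gravity vector is normalized so that its magnitude is 1; the struct name, member names, and threshold values are illustrative and are not taken from the description above.

#include <cmath>

// Hysteresis on the y-axis component of the gravity vector: the first threshold
// (used to enter the vertically-placed state) is larger in magnitude than the
// second threshold (used to return to the horizontally-placed state), so small
// wobbles near a single boundary do not make the determination flip rapidly.
struct OrientationDetector {
    bool verticallyPlaced = false;
    double firstThreshold  = 0.8;   // enter vertical (illustrative value)
    double secondThreshold = 0.5;   // return to horizontal (illustrative value)

    // gravityY: y-axis direction component of the normalized gravity vector.
    void update(double gravityY) {
        double magnitude = std::fabs(gravityY);
        if (!verticallyPlaced && magnitude >= firstThreshold) {
            verticallyPlaced = true;          // horizontal -> vertical
        } else if (verticallyPlaced && magnitude <= secondThreshold) {
            verticallyPlaced = false;         // vertical -> horizontal
        }
    }
};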

Further, in accordance with the assumed play style, some latitude may be allowed in the range where it is determined that the main body apparatus 2 is in the vertically-placed state, or the range where it is determined that the main body apparatus 2 is in the horizontally-placed state. As an example, when a stand for placing the main body apparatus 2 in the vertically-placed state is prepared, and the main body apparatus 2 is brought into the vertically-placed state using the stand, the display 12 may be inclined by a predetermined angle from vertical. In such a case, the range where it is determined that the main body apparatus 2 is in the vertically-placed state may be set such that the inclination due to the use of the stand is allowed. As another example, when the main body apparatus 2 is placed in the horizontally-placed state, it may be placed on a table. The top of a table, however, may be inclined from horizontal. Thus, if the main body apparatus 2 is placed in the horizontally-placed state on such an inclined top, the display 12 may also be inclined from horizontal. In such a case, the range where it is determined that the main body apparatus 2 is in the horizontally-placed state may be set such that the inclination of the top is allowed.

Further, the vertically-placed state and the horizontally-placed state of the main body apparatus 2 described above may be determined using another method. For example, it is also possible to set a threshold based on the angle of the display 12 to the vertical direction and to evaluate, using that threshold, the orientation of the main body apparatus 2 calculated based on angular velocities about the xyz axes detected by the angular velocity sensor 90, thereby determining the state of the main body apparatus 2. Further, the gravity vector itself may be calculated using any method. As an example, after a gravitational acceleration acting on the main body apparatus 2 is detected based on accelerations along the xyz axis directions detected by the acceleration sensor 89, the direction of the gravitational acceleration relative to the main body apparatus 2 may be sequentially calculated using angular velocities about the xyz axes detected by the angular velocity sensor 90. As another example, acceleration components generated on average in the main body apparatus 2 may be sequentially calculated using accelerations along the xyz axis directions detected by the acceleration sensor 89, and those acceleration components may be extracted as a gravitational acceleration.
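As an illustration of the last approach (extracting the slowly varying, averaged acceleration as the gravitational acceleration), the following sketch applies an exponential moving average to the accelerometer readings. The class name, smoothing factor, and vector layout are assumptions for this example only.

#include <array>

// Minimal sketch: the slow (averaged) part of the measured acceleration is
// treated as the gravitational acceleration; motion-induced accelerations are
// suppressed by the low-pass behavior of the moving average.
class GravityEstimator {
public:
    explicit GravityEstimator(double alpha = 0.05) : alpha_(alpha) {}

    // accel: accelerations along the x, y, and z axes from the acceleration sensor.
    // Returns the current estimate of the gravity vector in the same frame.
    std::array<double, 3> update(const std::array<double, 3>& accel) {
        for (int i = 0; i < 3; ++i) {
            gravity_[i] = (1.0 - alpha_) * gravity_[i] + alpha_ * accel[i];
        }
        return gravity_;
    }

private:
    double alpha_;
    std::array<double, 3> gravity_{0.0, 0.0, 0.0};
};

A smaller alpha averages over a longer period and therefore reacts more slowly but more smoothly to changes in the apparatus orientation.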

Next, with reference toFIGS. 16 to 19, a description is given of an example of specific processing executed by the game system1according to the exemplary embodiment.FIG. 16is a diagram showing an example of a data area set in the DRAM85of the main body apparatus2according to the exemplary embodiment. It should be noted that in the DRAM85, in addition to data shown inFIG. 16, data used for other processes is also stored, but is not described in detail here.

In a program storage area of the DRAM85, various programs Pa, which are executed by the game system1, are stored. In the exemplary embodiment, as the various programs Pa, a communication program for wirelessly communicating with the left controller3and the right controller4, an application program for performing information processing (e.g., game processing) based on data acquired from the left controller3and/or the right controller4, and the like are stored. It should be noted that the various programs Pa may be stored in advance in the flash memory84, or may be acquired from a storage medium attachable to and detachable from the game system1(e.g., a predetermined type of a storage medium attached to the slot23) and stored in the DRAM85, or may be acquired from another apparatus via a network such as the Internet and stored in the DRAM85. The processor81executes the various programs Pa stored in the DRAM85.

Further, in a data storage area of the DRAM85, various data used for processes such as a communication process and information processing executed by the game system1is stored. In the exemplary embodiment, in the DRAM85, operation data Da, angular velocity data Db, acceleration data Dc, orientation data Dd, first player character action data De, second player character action data Df, virtual object data Dg, orientation flag data Dh, virtual camera data Di, information image data Dj, image data Dk, and the like are stored.

The operation data Da is operation data appropriately acquired from each of the left controller3and/or the right controller4. As described above, operation data transmitted from each of the left controller3and/or the right controller4includes information regarding an input (specifically, information regarding an operation or the detection result of each sensor) from each of the input sections (specifically, each button, each analog stick, and each sensor). In the exemplary embodiment, operation data is transmitted in a predetermined cycle from each of the left controller3and/or the right controller4through wireless communication, and the operation data Da is appropriately updated using the received operation data. It should be noted that the update cycle of the operation data Da may be such that the operation data Da is updated every frame, which is the cycle of the processing described later executed by the main body apparatus2, or is updated every cycle in which the above operation data is transmitted through the wireless communication.

The angular velocity data Db is data indicating angular velocities generated in the main body apparatus2. For example, the angular velocity data Db includes data indicating angular velocities about the xyz axes generated in the main body apparatus2, and the like.

The acceleration data Dc is data indicating accelerations generated in the main body apparatus2. For example, the acceleration data Dc includes data indicating accelerations in the xyz axis directions generated in the main body apparatus2, and the like.

The orientation data Dd is data indicating the orientation of the main body apparatus2in real space. As an example, the orientation data Dd is data regarding a gravitational acceleration generated in the main body apparatus2and is data indicating the magnitude or the value of a y-axis direction component of a gravity vector indicating the gravitational acceleration generated in the main body apparatus2.

The first player character action data De is data indicating the position, the direction, the orientation, the action, and the like of the first player character PC1in the virtual space. The second player character action data Df is data indicating the position, the direction, the orientation, the action, and the like of the second player character PC2in the virtual space. The virtual object data Dg is data indicating the position, the moving direction, and the like of the virtual object OBJ in the virtual space.

The orientation flag data Dh is data indicating an orientation flag indicating whether the main body apparatus2is in the vertically-placed state or the horizontally-placed state. The orientation flag is set to on when the main body apparatus2is in the vertically-placed state, and is set to off when the main body apparatus2is in the horizontally-placed state.

The virtual camera data Di is data indicating the position and the direction of the virtual camera in the virtual space set in accordance with whether the first mode or the second mode is set.

The information image data Dj is data indicating the content of the information image I displayed on the display screen (e.g., the display12of the main body apparatus2).

The image data Dk is data for displaying images (e.g., an image of a player character, an image of a virtual object, an information image, a field image, a background image, and the like) on the display screen (e.g., the display12of the main body apparatus2) when a game is performed.
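Purely for orientation, the data items listed above might be grouped as in the rough sketch below; the struct itself and the concrete field types are assumptions and do not reflect the actual memory layout in the DRAM 85.

#include <cstdint>

struct Vec3 { double x = 0, y = 0, z = 0; };

// Rough, assumed grouping of the data items Da to Dk described above.
struct GameWorkData {
    // Da: latest operation data received from the left and/or right controller.
    std::uint32_t leftControllerButtons = 0, rightControllerButtons = 0;
    Vec3 leftStick, rightStick;

    Vec3 angularVelocity;            // Db: angular velocities about the xyz axes
    Vec3 acceleration;               // Dc: accelerations in the xyz axis directions
    double gravityY = 0.0;           // Dd: y-axis direction component of the gravity vector

    Vec3 pc1Position, pc2Position, objPosition;   // De, Df, Dg (positions only here)

    bool verticallyPlaced = false;   // Dh: orientation flag (on = vertically placed)
    Vec3 cameraPosition, cameraDirection;         // Di: virtual camera
    // Dj (information image content) and Dk (image data) are omitted for brevity.
};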

Next, with reference toFIGS. 17 to 19, a detailed example of information processing (game processing) according to the exemplary embodiment is described.FIG. 17is a flow chart showing an example of game processing executed by the game system1.FIG. 18is a subroutine showing a detailed example of a first mode game process performed in step S125inFIG. 17.FIG. 19is a subroutine showing a detailed example of a second mode game process performed in step S126inFIG. 17. In the exemplary embodiment, a series of processes shown inFIGS. 17 to 19is performed by the processor81executing a communication program or a predetermined application program (a game program) included in the various programs Pa. Further, the information processing shown inFIGS. 17 to 19is started at any timing.

It should be noted that the processes of all of the steps in the flow charts shown inFIGS. 17 to 19are merely illustrative. Thus, the processing order of the steps may be changed, or another process may be performed in addition to (or instead of) the processes of all of the steps, so long as similar results are obtained. Further, in the exemplary embodiment, descriptions are given on the assumption that the processor81performs the processes of all of the steps in the flow charts. Alternatively, a processor or a dedicated circuit other than the processor81may perform the processes of some of the steps in the flow charts. Yet alternatively, part of the processing performed by the main body apparatus2may be executed by another information processing apparatus capable of communicating with the main body apparatus2(e.g., a server capable of communicating with the main body apparatus2via a network). That is, all the processes shown inFIGS. 17 to 19may be executed by the cooperation of a plurality of information processing apparatuses including the main body apparatus2.

InFIG. 17, the processor81performs initialization in the game processing (step S120), and the processing proceeds to the next step. For example, in the initialization, the processor81initializes parameters for performing the processing described below.

Next, the processor81acquires operation data from each of the left controller3and/or the right controller4and updates the operation data Da (step S121), and the processing proceeds to the next step.

Next, the processor81acquires inertial data (acceleration data and/or angular velocity data) from the inertial sensors (the acceleration sensor89and/or the angular velocity sensor90) provided in the main body apparatus2and updates the acceleration data Dc and/or the angular velocity data Db (step S122), and the processing proceeds to the next step.

Next, the processor81determines the orientation of the main body apparatus2(step S123), and the processing proceeds to the next step. For example, using the acceleration data and/or the angular velocity data stored in the angular velocity data Db and/or the acceleration data Dc, the processor81calculates a gravity vector of a gravitational acceleration acting on the main body apparatus2and calculates a y-axis direction component of the gravity vector. Then, when the orientation flag indicated by the orientation flag data Dh is set to on, i.e., the vertically-placed state, and if the magnitude of the y-axis direction component is less than or equal to a second threshold (or a value including the positivity and negativity of the y-axis direction component is greater than or equal to the second threshold), the processor81sets the orientation flag to off, i.e., the horizontally-placed state, thereby updating the orientation flag data Dh (seeFIG. 15). Further, when the orientation flag indicated by the orientation flag data Dh is set to off, i.e., the horizontally-placed state, and if the magnitude of the y-axis direction component is greater than or equal to a first threshold (or a value including the positivity and negativity of the y-axis direction component is less than or equal to the first threshold), the processor81sets the orientation flag to on, i.e., the vertically-placed state, thereby updating the orientation flag data Dh (seeFIG. 14). It should be noted that when the main body apparatus2connected to the cradle outputs an image to an external device (e.g., the stationary monitor), the processor81sets the orientation flag to on, thereby updating the orientation flag data Dh. It should be noted that based on inertial data of the inertial sensors, a computer for calculating an orientation calculates the orientation of the information processing apparatus, and as an example, corresponds to the processor81that performs the process of step S123.

Next, the processor81determines whether or not the orientation of the main body apparatus2is in the vertically-placed state (step S124). For example, when the orientation flag indicated by the orientation flag data Dh is set to on, the determination is affirmative in the above step S124. Then, when the orientation of the main body apparatus2is in the vertically-placed state, the processing proceeds to step S125. On the other hand, when the orientation of the main body apparatus2is in the horizontally-placed state, the processing proceeds to step S126. It should be noted that a computer for setting a mode sets to a first mode a case where the information processing apparatus is in an orientation in which the screen is closer to vertical than a predetermined reference, and sets to a second mode a case where an external video device is not connected to a video output section, and the information processing apparatus is in an orientation in which the screen is closer to horizontal than the reference, and as an example, corresponds to the processor81that performs the process of step S124.
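A compact sketch of the decision performed here (step S124, together with the note in step S123 that output to an external video device forces the vertically-placed setting) might look as follows; the enum and function names are assumptions.

// First mode while an external video device is connected or the apparatus is
// determined to be vertically placed; otherwise the second (tabletop) mode.
enum class GameMode { First, Second };

GameMode selectMode(bool externalVideoConnected, bool verticallyPlaced) {
    if (externalVideoConnected) return GameMode::First;   // image output to a monitor
    return verticallyPlaced ? GameMode::First : GameMode::Second;
}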

In step S125, the processor 81 performs a first mode game process, and the processing proceeds to step S127. With reference to FIG. 18, a description is given below of the first mode game process performed in the above step S125.

InFIG. 18, the processor81performs an action process on the first player character PC1(step S131), and the processing proceeds to the next step. For example, with reference to operation data acquired from the controller for operating the first player character PC1(e.g., the left controller3operated by the first user) in the operation data acquired in the above step S121, the processor81sets the motion of the first player character PC1in the first mode corresponding to the operation data. Then, based on the set motion of the first player character PC1, the processor81sets the position, the direction, the orientation, the action, and the like of the first player character PC1in the virtual space, thereby updating the first player character action data De. It should be noted that in the process of the above step S131, for the association between a direction input to the direction input section (e.g., the analog stick32) of the controller for operating the first player character PC1(e.g., the left controller3) and the motion direction of the first player character PC1in the virtual space, as described above, settings based on the orientation of the main body apparatus2in the first mode (the vertically-placed state; seeFIGS. 8, 9, and 12) are used.

Next, the processor 81 performs an action process on the second player character PC2 (step S132), and the processing proceeds to the next step. For example, with reference to operation data acquired from the controller for operating the second player character PC2 (e.g., the right controller 4 operated by the second user) in the operation data acquired in the above step S121, the processor 81 sets the motion of the second player character PC2 in the first mode corresponding to the operation data. Then, based on the set motion of the second player character PC2, the processor 81 sets the position, the direction, the orientation, the action, and the like of the second player character PC2 in the virtual space, thereby updating the second player character action data Df. It should be noted that in the process of the above step S132, for the association between a direction input to the direction input section (e.g., the analog stick 52) of the controller for operating the second player character PC2 (e.g., the right controller 4) and the motion direction of the second player character PC2 in the virtual space, as described above, settings based on the orientation of the main body apparatus 2 in the first mode (the vertically-placed state; see FIGS. 8, 9, and 12) are used.

Next, the processor 81 performs an action process on the virtual object OBJ (step S133), and the processing proceeds to the next step. For example, in accordance with the actions of the first player character PC1 and/or the second player character PC2 and the operation data acquired in the above step S121, the processor 81 sets the motion of the virtual object OBJ in the first mode. Then, based on the set motion of the virtual object OBJ, the processor 81 sets the position, the moving direction, and the like of the virtual object OBJ in the virtual space, thereby updating the virtual object data Dg. It should be noted that in the process of the above step S133, for the association between directions input to the direction input sections (e.g., the analog sticks 32 and 52) of the controllers for operating the first player character PC1 and/or the second player character PC2 and the motion direction of the virtual object OBJ in the virtual space, as described above, settings based on the orientation of the main body apparatus 2 in the first mode (the vertically-placed state; see FIGS. 8, 9, and 12) are used.

Next, the processor81generates an information image (step S134), and the processing proceeds to the next step. As an example, the processor81generates an information image (e.g., the information image I1or the information image I2) for informing both the first user and the second user of information regarding the played game, the player characters, and the like. As another example, the processor81generates an information image (e.g., the information images I3or the information images I4) for informing one of the first user and the second user of information regarding the played game, the player characters, and the like. It should be noted that a computer for executing game processing executes predetermined game processing by controlling the first operation target based on first operation data acquired from a first operation device and controlling the second operation target based on second operation data acquired from a second operation device, and as an example, corresponds to the processor81that performs the processes of steps S131to S134.

Next, the processor81performs a first display control process for generating a display image for the first mode and displaying the display image on the display screen (step S135), and the processing of this subroutine ends. For example, based on the first player character action data De, the second player character action data Df, and the virtual object data Dg, the processor81places the first player character PC1, the second player character PC2, and the virtual object OBJ in the virtual space. Further, the processor81places the information image I generated in the above step S134at a predetermined position in the virtual space such that the up-down direction of the information image is the up-down direction of the virtual space. Then, the processor81places the virtual camera such that the line-of-sight direction of the virtual camera is a first line-of-sight direction (e.g., the direction in which the first player character PC1, the second player character PC2, the virtual object OBJ, and the information image I are viewed from behind the first player character PC1or the second player character PC2), generates a virtual space image viewed from the virtual camera, and displays the virtual space image on the display12. It should be noted that the information image I may be combined in a superimposed manner with a virtual space image in which the virtual space where the first player character PC1, the second player character PC2, the virtual object OBJ, and the like are placed is viewed from the virtual camera. It should be noted that a computer for generating a game image, based on a virtual camera placed in the virtual space, generates a game image including the first operation target and the second operation target and further including, between a first information image indicating first information and a second information image indicating second information, at least the first information image, and as an example, corresponds to the processor81that performs the process of step S135.
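The first display control process can be pictured roughly as follows. The math types, the offsets behind the player character, and the function name are assumptions introduced for illustration; only the overall idea (a roughly horizontal first line-of-sight direction with the information image kept upright) comes from the description above.

#include <cstdio>

struct Vec3 { double x, y, z; };

struct CameraSetup {
    Vec3 position;
    Vec3 lookAt;
    double infoImageRollDeg;   // on-screen rotation of the information image
};

// Camera behind PC2 looking toward PC1 (one of the placements described above);
// the information image is left unrotated so its text reads top-to-bottom.
CameraSetup firstModeCamera(const Vec3& pc1Pos, const Vec3& pc2Pos) {
    CameraSetup cam;
    cam.position = { pc2Pos.x, pc2Pos.y + 2.0, pc2Pos.z + 6.0 };  // offsets assumed
    cam.lookAt   = pc1Pos;
    cam.infoImageRollDeg = 0.0;   // up-down direction matches the up direction of the virtual space
    return cam;
}

int main() {
    CameraSetup cam = firstModeCamera({0, 0, -10}, {0, 0, 10});
    std::printf("camera at (%.1f, %.1f, %.1f)\n", cam.position.x, cam.position.y, cam.position.z);
}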

Referring back toFIG. 17, when it is determined in the above step S124that the orientation of the main body apparatus2is in the horizontally-placed state, the processor81performs a second mode game process (step S126), and the processing proceeds to step S127.

With reference toFIG. 19, a description is given below of the second mode game process performed in the above step S126.

InFIG. 19, the processor81performs an action process on the first player character PC1(step S141), and the processing proceeds to the next step. For example, with reference to operation data acquired from the controller for operating the first player character PC1(e.g., the left controller3operated by the first user) in the operation data acquired in the above step S121, the processor81sets the motion of the first player character PC1in the second mode corresponding to the operation data. Then, based on the set motion of the first player character PC1, the processor81sets the position, the direction, the orientation, the action, and the like of the first player character PC1in the virtual space, thereby updating the first player character action data De. It should be noted that in the process of the above step S141, for the association between a direction input to the direction input section (e.g., the analog stick32) of the controller for operating the first player character PC1(e.g., the left controller3) and the motion direction of the first player character PC1in the virtual space, as described above, settings based on the orientation of the main body apparatus2in the second mode (the horizontally-placed state; seeFIGS. 10, 11, and 13) are used.

Next, the processor 81 performs an action process on the second player character PC2 (step S142), and the processing proceeds to the next step. For example, with reference to operation data acquired from the controller for operating the second player character PC2 (e.g., the right controller 4 operated by the second user) in the operation data acquired in the above step S121, the processor 81 sets the motion of the second player character PC2 in the second mode corresponding to the operation data. Then, based on the set motion of the second player character PC2, the processor 81 sets the position, the direction, the orientation, the action, and the like of the second player character PC2 in the virtual space, thereby updating the second player character action data Df. It should be noted that in the process of the above step S142, for the association between a direction input to the direction input section (e.g., the analog stick 52) of the controller for operating the second player character PC2 (e.g., the right controller 4) and the motion direction of the second player character PC2 in the virtual space, as described above, settings based on the orientation of the main body apparatus 2 in the second mode (the horizontally-placed state; see FIGS. 10, 11, and 13) are used.

Next, the processor 81 performs an action process on the virtual object OBJ (step S143), and the processing proceeds to the next step. For example, in accordance with the actions of the first player character PC1 and/or the second player character PC2 and the operation data acquired in the above step S121, the processor 81 sets the motion of the virtual object OBJ in the second mode. Then, based on the set motion of the virtual object OBJ, the processor 81 sets the position, the moving direction, and the like of the virtual object OBJ in the virtual space, thereby updating the virtual object data Dg. It should be noted that in the process of the above step S143, for the association between directions input to the direction input sections (e.g., the analog sticks 32 and 52) of the controllers for operating the first player character PC1 and/or the second player character PC2 and the motion direction of the virtual object OBJ in the virtual space, as described above, settings based on the orientation of the main body apparatus 2 in the second mode (the horizontally-placed state; see FIGS. 10, 11, and 13) are used.

Next, the processor81generates an information image (step S144), and the processing proceeds to the next step. As an example, the processor81generates an information image (e.g., the information image I1or the information image I2) for informing both the first user and the second user of information regarding the played game, the player characters, and the like. As another example, the processor81generates an information image (e.g., the information images I3or the information images I4) for informing one of the first user and the second user of information regarding the played game, the player characters, and the like. It should be noted that a computer for executing game processing executes predetermined game processing by controlling the first operation target based on first operation data acquired from a first operation device and controlling the second operation target based on second operation data acquired from a second operation device, and as another example, corresponds to the processor81that performs the processes of steps S141to S144.

Next, the processor 81 performs a second display control process for generating a display image for the second mode and displaying the display image on the display screen (step S145), and the processing of this subroutine ends. For example, based on the first player character action data De, the second player character action data Df, and the virtual object data Dg, the processor 81 places the first player character PC1, the second player character PC2, and the virtual object OBJ in the virtual space. Then, among the information images I generated in the above step S144, the processor 81 places an information image for informing only the first user of information at a predetermined position in the virtual space such that the up-down direction of the information image is a horizontal direction of the virtual space and is one direction (e.g., the right direction) of the main body apparatus 2. Further, among the information images I generated in the above step S144, the processor 81 places an information image for informing only the second user of information at a predetermined position in the virtual space such that the up-down direction of the information image is a horizontal direction of the virtual space and is the direction (e.g., the left direction) opposite to the one direction of the main body apparatus 2. Further, among the information images I generated in the above step S144, for an information image that informs both the first user and the second user of information, the processor 81 prepares one such information image for the first user and another for the second user, and places them at predetermined positions in the virtual space such that the up-down direction of the information image for the first user is the one direction, and the up-down direction of the information image for the second user is the opposite direction. Then, the processor 81 places the virtual camera such that the line-of-sight direction of the virtual camera is a second line-of-sight direction (e.g., a bird's-eye viewpoint or an overhead viewpoint) looking down on the first player character PC1, the second player character PC2, the virtual object OBJ, and the information images I, which is further downward in the virtual space than the first line-of-sight direction. Then, the processor 81 generates a virtual space image viewed from the virtual camera and displays the virtual space image on the display 12. It should be noted that the information image I may be combined in a superimposed manner with a virtual space image in which the virtual space where the first player character PC1, the second player character PC2, the virtual object OBJ, and the like are placed is viewed from the virtual camera. It should be noted that a computer for generating a game image, based on a virtual camera placed in the virtual space, generates a game image including the first operation target and the second operation target and further including, between a first information image indicating first information and a second information image indicating second information, at least the first information image, and as another example, corresponds to the processor 81 that performs the process of step S145.
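A rough sketch of this camera and information image setup is shown below; the type names, the camera height, and the specific roll angles are assumptions chosen for illustration, not values from the description.

struct Vec3 { double x, y, z; };

struct SecondModeSetup {
    Vec3 cameraPosition;
    Vec3 cameraLookAt;
    double firstUserInfoRollDeg;    // information image on the first user's side of the screen
    double secondUserInfoRollDeg;   // information image on the opposite side, rotated the other way
};

// Overhead (bird's-eye) virtual camera looking down on the field, with each
// user's information image rotated so that its top-to-bottom direction points
// toward that user's side of the horizontally placed screen.
SecondModeSetup secondModeCamera(const Vec3& fieldCenter) {
    SecondModeSetup s;
    s.cameraPosition = { fieldCenter.x, fieldCenter.y + 15.0, fieldCenter.z }; // height assumed
    s.cameraLookAt   = fieldCenter;               // second line-of-sight direction: looking down
    s.firstUserInfoRollDeg  = -90.0;              // readable from the first user's side
    s.secondUserInfoRollDeg = +90.0;              // readable from the second user's side
    return s;
}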

Referring back toFIG. 17, in step S127, the processor81determines whether or not the game is to be ended. In the above step S127, examples of a condition for ending the game include: the fact that the result of the game is finalized; and the fact that the user performs the operation of ending the game. When the game is not to be ended, the processing returns to the above step S121, and the process of step S121is repeated. When the game is to be ended, the processing of the flow chart ends. Hereinafter, the series of processes of steps S121to S127are repeatedly executed until it is determined in step S127that the game is to be ended.

As described above, in the exemplary embodiment, it is possible to perform a game in different play styles in accordance with the state of the main body apparatus2including the display12and increase the variety when a game is performed by placing the main body apparatus2. Further, the information image I displayed on the display12is displayed in the state where the information image I is easy for each user to view in accordance with the state of the main body apparatus2. Thus, it is easy for the users to understand the game.

It should be noted that in the above exemplary embodiment, in the first mode, the virtual camera is placed at the position where the first player character PC1and the second player character PC2are viewed from behind the second player character PC2(i.e., in front of the first player character PC1). Alternatively, the virtual camera may be placed at another viewpoint. As an example, when the first player character PC1and the second player character PC2are placed in the virtual space while facing in the same direction, the virtual camera may be placed at the position where the first player character PC1and the second player character PC2are viewed from behind the first player character PC1and the second player character PC2. As another example, in the first mode, the virtual camera may be placed at the position where the first player character PC1and the second player character PC2are viewed from the sides of the first player character PC1and the second player character PC2. In this case, it is possible to display a game image suitable for a case where the game progresses while the first player character PC1and the second player character PC2proceed in the same direction.

Further, in the above exemplary embodiment, an exemplary game where a baseball board game (a table baseball game or a baseball pinball) is played is used. Alternatively, the exemplary embodiment may be applied to another game. For example, the exemplary embodiment can be applied to various games such as a table soccer game, a table athletic game, and a sugoroku game. As an example, when the exemplary embodiment is applied to the table soccer game, it is possible that a user is urged to play in either of the first mode and the second mode in accordance with the game scene. Specifically, in the table soccer game, it is possible that in a particular scene where the first player character PC1and the second player character PC2confront each other while facing each other (e.g., the scene of a penalty kick where a kicker and a goalkeeper confront each other one-on-one), the user is urged to play in the first mode so as to play viewing a game image from behind the goalkeeper. It is possible that in a normal scene, the user is urged to play in the second mode so as to play viewing a game image looking down on the entirety of a soccer field.

Further, in the above description, a form is used in which the first user and the second user perform game play. Alternatively, the number of users performing game play using the game system1may be three or more, or may be one. When three or more users perform game play, three or more controllers wirelessly connected to the main body apparatus2may be used, and in the second mode, the direction of the information image I for each user may be adjusted in accordance with the play position of the user relative to the main body apparatus2.

Further, in the above exemplary embodiment, a method for detecting the orientation of the main body apparatus2is a mere example. Alternatively, the orientation of the main body apparatus2may be detected using another method or another piece of data. Further, the controller for controlling the action of the first player character PC1or the second player character PC2may be not only the left controller3or the right controller4, but also another controller.

Further, an additional apparatus (e.g., a cradle) may be any additional apparatus attachable to and detachable from the main body apparatus2. The additional apparatus may or may not have the function of charging the main body apparatus2as in the exemplary embodiment.

Further, the game system1may be any apparatus, and may be a mobile game apparatus, any mobile electronic device (a PDA (Personal Digital Assistant), a mobile phone, a personal computer, a camera, a tablet, or the like) or the like.

Further, the above descriptions have been given using an example where the game system 1 performs information processing (game processing) and a communication process. Alternatively, another apparatus may perform at least some of the processing steps. For example, if the game system 1 is further configured to communicate with another apparatus (e.g., another server, another image display device, another game apparatus, or another mobile terminal), the other apparatus may perform the processing steps in conjunction with the game system 1. Another apparatus may thus perform at least some of the processing steps, thereby enabling processing similar to that described above. Further, the above information processing (game processing) can be performed by a processor, or by the cooperation of a plurality of processors, included in an information processing system including at least one information processing apparatus. Further, in the above exemplary embodiment, information processing can be performed by the processor 81 of the game system 1 executing a predetermined program. Alternatively, part or all of the processing of the flow charts may be performed by a dedicated circuit included in the game system 1.

Here, according to the above variations, it is possible to achieve the exemplary embodiment also by a system form such as cloud computing, or a system form such as a distributed wide area network or a local area network. For example, in a system form such as a distributed local area network, it is possible to execute the processing between a stationary information processing apparatus (a stationary game apparatus) and a mobile information processing apparatus (a mobile game apparatus) by the cooperation of the apparatuses. It should be noted that, in these system forms, there is no particular limitation on which apparatus performs the above processing. Thus, it goes without saying that it is possible to achieve the exemplary embodiment by sharing the processing in any manner.

Further, the processing orders, the setting values, the conditions used in the determinations, and the like that are used in the information processing described above are merely illustrative. Thus, it goes without saying that the exemplary embodiment can be achieved also with other orders, other values, and other conditions.

Further, the above program may be supplied to the game system1not only through an external storage medium such as an external memory, but also through a wired or wireless communication link. Further, the program may be stored in advance in a non-volatile storage device included in the apparatus. It should be noted that examples of an information storage medium having stored therein the program may include CD-ROMs, DVDs, optical disk storage media similar to these, flexible disks, hard disks, magneto-optical disks, and magnetic tapes, as well as non-volatile memories. Alternatively, an information storage medium having stored therein the program may be a volatile memory for storing the program. It can be said that such a storage medium is a storage medium readable by a computer or the like. For example, it is possible to provide the various functions described above by causing a computer or the like to load a program from the storage medium and execute it.

While some exemplary systems, exemplary methods, exemplary devices, and exemplary apparatuses have been described in detail above, the above descriptions are merely illustrative in all respects and do not limit the scope of the systems, the methods, the devices, and the apparatuses. It goes without saying that the systems, the methods, the devices, and the apparatuses can be improved and modified in various manners without departing from the spirit and scope of the appended claims. It is understood that the scope of the systems, the methods, the devices, and the apparatuses should be interpreted only by the scope of the appended claims. Further, it is understood that the specific descriptions of the exemplary embodiment enable a person skilled in the art to implement an equivalent scope on the basis of the descriptions of the exemplary embodiment and general technical knowledge. When used in the specification, components and the like described in the singular with the word “a” or “an” preceding them do not exclude the plural forms of those components. Furthermore, it should be understood that, unless otherwise stated, the terms used in the specification are used in their common meanings in the field. Thus, unless otherwise defined, all the technical terms and jargon used in the specification have the same meanings as those generally understood by a person skilled in the art in the field of the exemplary embodiment. If there is a conflict, the specification (including definitions) takes precedence.

As described above, the exemplary embodiment can be used as a game system, a game program, a game apparatus, a game processing method, and the like that are capable of, for example, increasing the variety of ways in which a game can be played.
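To make the mode-dependent behavior summarized above more concrete, the following C++ sketch illustrates one possible implementation of the mode selection and virtual-camera adjustment recited in the claims that follow. It is a minimal sketch only: every identifier, threshold, and angle value is an assumption made for illustration and is not taken from the patent text.

```cpp
// Hypothetical sketch of the mode selection and camera adjustment; all names,
// thresholds, and angles are illustrative assumptions, not the patented implementation.
#include <iostream>

enum class GameMode { First, Second };

struct VirtualCamera {
    float pitchDegrees;  // 0 = horizontal line of sight, negative = looking downward
};

// Angle (degrees) between the screen normal and the world up-vector, assumed to be
// derived elsewhere from the inertial sensor data. 0 means the screen faces straight
// up; 90 means it faces straight forward.
bool screenFacesUpwardMoreThanForward(float screenNormalToUpDeg) {
    return screenNormalToUpDeg < 45.0f;  // hypothetical 45-degree reference
}

GameMode selectMode(bool externalVideoConnected, float screenNormalToUpDeg) {
    // First mode: an external video device is connected to the video output, or the
    // screen faces forward more than upward.
    // Second mode: no external video device is connected and the screen faces upward
    // more than forward.
    if (externalVideoConnected || !screenFacesUpwardMoreThanForward(screenNormalToUpDeg)) {
        return GameMode::First;
    }
    return GameMode::Second;
}

VirtualCamera setVirtualCamera(GameMode mode) {
    if (mode == GameMode::First) {
        // First line-of-sight direction: roughly horizontal, e.g. from behind one of
        // the operation targets.
        return VirtualCamera{-10.0f};
    }
    // Second line-of-sight direction: further downward than the first, i.e. a
    // bird's-eye viewpoint over both operation targets.
    return VirtualCamera{-70.0f};
}

int main() {
    // Example: tabletop-style play, no external video device, screen laid nearly flat.
    GameMode mode = selectMode(/*externalVideoConnected=*/false, /*screenNormalToUpDeg=*/20.0f);
    VirtualCamera cam = setVirtualCamera(mode);
    std::cout << (mode == GameMode::Second ? "second mode" : "first mode")
              << ", camera pitch " << cam.pitchDegrees << " deg\n";
    return 0;
}
```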

Claims

  1. A game system including an information processing apparatus, and a first operation device and a second operation device wirelessly connectable to the information processing apparatus, the information processing apparatus comprising:
  a screen on which an image is displayable;
  an inertial sensor;
  a video output configured to, when an external video device different from the screen is connected to the video output, output a video to the external video device; and
  at least one computer configured to:
  based on inertial data of the inertial sensor, calculate an orientation of the information processing apparatus;
  execute game processing by, in a virtual space, controlling a first operation target based on first operation data acquired from the first operation device and controlling a second operation target based on second operation data acquired from the second operation device;
  set the game processing to operate in one of at least first and second modes, the first mode being set in a case where an external video device is connected to the video output, the first mode being set in a case where the information processing apparatus is in an orientation in which the screen faces forward more than upward, and the second mode being set in a case where an external video device is not connected to the video output and the information processing apparatus is in an orientation in which the screen faces upward more than forward; and
  based on a virtual camera placed in the virtual space, generate a game image including the first operation target and the second operation target and further including, between a first information image indicating first information and a second information image indicating second information, at least the first information image, wherein
  in the generating of the game image,
  when the first mode is set, the virtual camera is set such that a line-of-sight direction of the virtual camera is a first line-of-sight direction, and a game image in which the first information image is placed in a first direction or a game image in which the first information image and the second information image are placed in the same direction is generated, and
  when the second mode is set, the virtual camera is set such that the line-of-sight direction of the virtual camera is a second line-of-sight direction that is further downward in the virtual space than the first line-of-sight direction, and a game image in which the plurality of first information images are placed in different directions or a game image in which the first information image and the second information image are placed in different directions is generated.
  2. The game system according to claim 1, wherein: the first operation device and the second operation device each include a directional input, the first operation data includes first direction input data acquired from the directional input of the first operation device, the second operation data includes second direction input data acquired from the directional input of the second operation device, in the executing of the game processing, the first operation target is moved in the virtual space based on the first direction input data, and the second operation target is moved in the virtual space based on the second direction input data, and in the executing of the game processing, regarding at least one of the first direction input data and the second direction input data, an association between a direction input to the directional input and a direction in the virtual space is changed between the first mode and the second mode.
  3. The game system according to claim 2, wherein in the executing of the game processing, when the first mode is set, in accordance with the fact that the directional input of the first operation device is operated in a first direction, the operation target is controlled as an indication indicating a second direction in the virtual space, when the second mode is set, in accordance with the fact that the directional input of the first operation device is operated in the first direction, the operation target is controlled as an indication indicating a third direction opposite to the second direction in the virtual space, when the first mode is set, in accordance with the fact that the directional input of the second operation device is operated in a fourth direction, the operation target is controlled as an indication indicating a fifth direction in the virtual space, and when the second mode is set, in accordance with the fact that the directional input of the second operation device is operated in the fourth direction, the operation target is controlled as an indication indicating the fifth direction in the virtual space.
  4. The game system according to claim 1, wherein: the first information includes a letter and/or a number for a user operating the first operation device, and the second information includes a letter and/or a number for a user operating the second operation device.
  5. The game system according to claim 1, wherein in the generating of the game image, when the first mode is set, the virtual camera is placed behind one of the first operation target and the second operation target, and a game image including the first operation target and the second operation target is generated.
  6. The game system according to claim 1, wherein in the generating of the game image, when the second mode is set, the virtual camera is set at a bird's-eye viewpoint, and a game image including the first operation target and the second operation target is generated.
  7. The game system according to claim 1, wherein the game processing provides, in the virtual space, a game where the first operation target is associated with flying a virtual object based on the first operation data, and the second operation target is associated with hitting back the virtual object based on the second operation data.
  8. The game system according to claim 1, wherein the game processing provides, in the virtual space, a game where the first operation target and the second operation target compete against each other while facing each other.
  9. A non-transitory computer-readable storage medium having stored therein instructions executable by a computer included in an information processing apparatus wirelessly connectable to a first operation device and a second operation device, the information processing apparatus comprising:
  a screen on which an image is displayable;
  an inertial sensor; and
  a video output configured to, when an external video device different from the screen is connected to the video output, output a video to the external video device;
  the instructions, when executed, causing the computer to perform operations comprising:
  based on inertial data of the inertial sensor, calculating an orientation of the information processing apparatus;
  executing game processing by, in a virtual space, controlling a first operation target based on first operation data acquired from the first operation device and controlling a second operation target based on second operation data acquired from the second operation device;
  setting the game processing to operate in one of at least first and second modes, the first mode being set in a case where an external video device is connected to the video output, the first mode being set in a case where the information processing apparatus is in an orientation in which the screen faces forward more than upward, and the second mode being set in a case where an external video device is not connected to the video output and the information processing apparatus is in an orientation in which the screen faces upward more than forward; and
  based on a virtual camera placed in the virtual space, generating a game image including the first operation target and the second operation target and further including, between a first information image indicating first information and a second information image indicating second information, at least the first information image, wherein
  in the generating of the game image,
  when the first mode is set, the virtual camera is set such that a line-of-sight direction of the virtual camera is a first line-of-sight direction, and a game image in which the first information image is placed in a first direction or a game image in which the first information image and the second information image are placed in the same direction is generated, and
  when the second mode is set, the virtual camera is set such that the line-of-sight direction of the virtual camera is a second line-of-sight direction that is further downward in the virtual space than the first line-of-sight direction, and a game image in which the plurality of first information images are placed in different directions or a game image in which the first information image and the second information image are placed in different directions is generated.
  10. The non-transitory computer-readable storage medium according to claim 9, wherein: the first operation device and the second operation device each include a directional input, the first operation data includes first direction input data acquired from the directional input of the first operation device, the second operation data includes second direction input data acquired from the directional input of the second operation device, in the executing of the game processing, the first operation target is moved in the virtual space based on the first direction input data, and the second operation target is moved in the virtual space based on the second direction input data, and in the executing of the game processing, regarding at least one of the first direction input data and the second direction input data, an association between a direction input to the directional input and a direction in the virtual space is changed between the first mode and the second mode.
  11. The non-transitory computer-readable storage medium according to claim 10, wherein in the executing of the game processing, when the first mode is set, in accordance with the fact that the directional input of the first operation device is operated in a first direction, the operation target is controlled as an indication indicating a second direction in the virtual space, when the second mode is set, in accordance with the fact that the directional input of the first operation device is operated in the first direction, the operation target is controlled as an indication indicating a third direction opposite to the second direction in the virtual space, when the first mode is set, in accordance with the fact that the directional input of the second operation device is operated in a fourth direction, the operation target is controlled as an indication indicating a fifth direction in the virtual space, and when the second mode is set, in accordance with the fact that the directional input of the second operation device is operated in the fourth direction, the operation target is controlled as an indication indicating the fifth direction in the virtual space.
  12. The non-transitory computer-readable storage medium according to claim 9, wherein: the first information includes a letter and/or a number for a user operating the first operation device, and the second information includes a letter and/or a number for a user operating the second operation device.
  13. The non-transitory computer-readable storage medium according to claim 9, wherein in the generating of the game image, when the second mode is set, the virtual camera is placed behind one of the first operation target and the second operation target, and a game image including the first operation target and the second operation target is generated.
  14. The non-transitory computer-readable storage medium according to claim 9, wherein in the generating of the game image, when the second mode is set, the virtual camera is set at a bird's-eye viewpoint, and a game image including the first operation target and the second operation target is generated.
  15. The non-transitory computer-readable storage medium according to claim 9, wherein the game processing provides, in the virtual space, a game where the first operation target is associated with flying a virtual object based on the first operation data, and the second operation target is associated with hitting back the virtual object based on the second operation data.
  16. The non-transitory computer-readable storage medium according to claim 9, wherein the game processing provides, in the virtual space, a game where the first operation target and the second operation target compete against each other while facing each other.
  17. A game apparatus wirelessly connectable to a first operation device and a second operation device, the game apparatus comprising:
  a screen on which an image is displayable;
  an inertial sensor;
  a video output configured to, when an external video device different from the screen is connected to the video output, output a video to the external video device; and
  at least one computer configured to:
  based on inertial data of the inertial sensor, calculate an orientation of the game apparatus;
  execute game processing by, in a virtual space, controlling a first operation target based on first operation data acquired from the first operation device and controlling a second operation target based on second operation data acquired from the second operation device;
  set the game processing to operate in one of at least first and second modes, the first mode being set in a case where an external video device is connected to the video output, the first mode being set in a case where the game apparatus is in an orientation in which the screen faces forward more than upward, and the second mode being set in a case where an external video device is not connected to the video output and the game apparatus is in an orientation in which the screen faces upward more than forward; and
  based on a virtual camera placed in the virtual space, generate a game image including the first operation target and the second operation target and further including, between a first information image indicating first information and a second information image indicating second information, at least the first information image, wherein
  in the generating of the game image,
  when the first mode is set, the virtual camera is set such that a line-of-sight direction of the virtual camera is a first line-of-sight direction, and a game image in which the first information image is placed in a first direction or a game image in which the first information image and the second information image are placed in the same direction is generated, and
  when the second mode is set, the virtual camera is set such that the line-of-sight direction of the virtual camera is a second line-of-sight direction that is further downward in the virtual space than the first line-of-sight direction, and a game image in which the plurality of first information images are placed in different directions or a game image in which the first information image and the second information image are placed in different directions is generated.
  18. A game processing method using an information processing apparatus, and a first operation device and a second operation device wirelessly connectable to the information processing apparatus, the information processing apparatus comprising:
  a screen on which an image is displayable;
  an inertial sensor; and
  a video output configured to, when an external video device different from the screen is connected to the video output, output a video to the external video device,
  the game processing method comprising:
  based on inertial data of the inertial sensor, calculating an orientation of the information processing apparatus;
  executing game processing by, in a virtual space, controlling a first operation target based on first operation data acquired from the first operation device and controlling a second operation target based on second operation data acquired from the second operation device;
  setting the game processing to operate in one of at least first and second modes, the first mode being set in a case where an external video device is connected to the video output, the first mode being set in a case where the information processing apparatus is in an orientation in which the screen faces forward more than upward, and the second mode being set in a case where an external video device is not connected to the video output and the information processing apparatus is in an orientation in which the screen faces upward more than forward; and
  based on a virtual camera placed in the virtual space, generating a game image including the first operation target and the second operation target and further including, between a first information image indicating first information and a second information image indicating second information, at least the first information image, wherein
  in the generating of the game image,
  when the first mode is set, the virtual camera is set such that a line-of-sight direction of the virtual camera is a first line-of-sight direction, and a game image in which the first information image is placed in a first direction or a game image in which the first information image and the second information image are placed in the same direction is generated, and
  when the second mode is set, the virtual camera is set such that the line-of-sight direction of the virtual camera is a second line-of-sight direction that is further downward in the virtual space than the first line-of-sight direction, and a game image in which the plurality of first information images are placed in different directions or a game image in which the first information image and the second information image are placed in different directions is generated.
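As a further non-authoritative illustration, the sketch below shows one hypothetical way the mode-dependent association between a directional input and a virtual-space direction, as recited in claims 2, 3, 10, and 11, could be realized: the first operation device's input is associated with the opposite direction in the second mode, while the second operation device's association is left unchanged. All names and the sign-flip mechanism are assumptions for illustration only; they are not drawn from the patent.

```cpp
// Hypothetical sketch of the mode-dependent directional-input association;
// identifiers and the sign-flip approach are illustrative assumptions only.
#include <iostream>

enum class GameMode { First, Second };

struct StickInput { float x, y; };       // directional input from an operation device
struct WorldDirection { float x, y; };   // corresponding direction in the virtual space

// First operation device: in the first mode an input in a given direction is associated
// with one virtual-space direction; in the second mode the same input is associated with
// the opposite direction.
WorldDirection mapFirstDevice(const StickInput& in, GameMode mode) {
    float sign = (mode == GameMode::First) ? 1.0f : -1.0f;
    return WorldDirection{in.x * sign, in.y * sign};
}

// Second operation device: the association is the same in both modes.
WorldDirection mapSecondDevice(const StickInput& in, GameMode /*mode*/) {
    return WorldDirection{in.x, in.y};
}

int main() {
    StickInput toward{0.3f, 1.0f};
    WorldDirection a = mapFirstDevice(toward, GameMode::First);
    WorldDirection b = mapFirstDevice(toward, GameMode::Second);
    std::cout << "first device, first mode:  (" << a.x << ", " << a.y << ")\n"
              << "first device, second mode: (" << b.x << ", " << b.y << ")\n";
    return 0;
}
```

Flipping the association for only one of the devices is presumably useful when the screen lies flat and two players face each other across it (as in the competing-while-facing-each-other game of claims 8 and 16), so that each player's input remains intuitive from that player's side of the screen; this rationale is an inference, not a statement from the patent.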
