U.S. Pat. No. 11,498,004

COMPUTER-READABLE NON-TRANSITORY STORAGE MEDIUM HAVING INSTRUCTIONS STORED THEREIN, GAME APPARATUS, GAME SYSTEM, AND GAME PROCESSING METHOD

Assignee: NINTENDO CO., LTD.

Issue Date: May 26, 2021

Illustrative Figure

Abstract

In an exemplary embodiment, on the basis of a first direction input performed on an operation device, a player object is caused to perform a posturing action of holding an item object in an orientation corresponding to an input direction according to the first direction input. Further, in accordance with cancelation of the first direction input, the player object is caused to perform a swinging action of swinging the item object.

Description

DETAILED DESCRIPTION OF NON-LIMITING EXAMPLE EMBODIMENTS

Hereinafter, the exemplary embodiment will be described.

A game system according to an example of the exemplary embodiment is described below. An example of a game system 1 according to the exemplary embodiment includes a main body apparatus (an information processing apparatus, which functions as a game apparatus main body in the exemplary embodiment) 2, a left controller 3, and a right controller 4. Each of the left controller 3 and the right controller 4 is attachable to and detachable from the main body apparatus 2. That is, the game system 1 can be used as a unified apparatus obtained by attaching each of the left controller 3 and the right controller 4 to the main body apparatus 2. Further, in the game system 1, the main body apparatus 2, the left controller 3, and the right controller 4 can also be used as separate bodies (see FIG. 2). Hereinafter, first, the hardware configuration of the game system 1 according to the exemplary embodiment is described, and then, the control of the game system 1 according to the exemplary embodiment is described.

FIG. 1 shows an example of the state where the left controller 3 and the right controller 4 are attached to the main body apparatus 2. As shown in FIG. 1, each of the left controller 3 and the right controller 4 is attached to and unified with the main body apparatus 2. The main body apparatus 2 is an apparatus for performing various processes (e.g., game processing) in the game system 1. The main body apparatus 2 includes a display 12. Each of the left controller 3 and the right controller 4 is an apparatus including operation sections with which a user provides inputs.

FIG. 2 shows an example of the state where each of the left controller 3 and the right controller 4 is detached from the main body apparatus 2. As shown in FIGS. 1 and 2, the left controller 3 and the right controller 4 are attachable to and detachable from the main body apparatus 2. It should be noted that hereinafter, the left controller 3 and the right controller 4 will occasionally be referred to collectively as a "controller".

FIG. 3 is six orthogonal views showing an example of the main body apparatus 2. As shown in FIG. 3, the main body apparatus 2 includes an approximately plate-shaped housing 11. In the exemplary embodiment, a main surface (in other words, a surface on a front side, i.e., the surface on which the display 12 is provided) of the housing 11 has a substantially rectangular shape.

It should be noted that the shape and the size of the housing 11 are discretionary. As an example, the housing 11 may be of a portable size. Further, the main body apparatus 2 alone or the unified apparatus obtained by attaching the left controller 3 and the right controller 4 to the main body apparatus 2 may function as a mobile apparatus. The main body apparatus 2 or the unified apparatus may function as a handheld apparatus or a portable apparatus.

As shown in FIG. 3, the main body apparatus 2 includes the display 12, which is provided on the main surface of the housing 11. The display 12 displays an image generated by the main body apparatus 2. In the exemplary embodiment, the display 12 is a liquid crystal display device (LCD). The display 12, however, may be a display device of any type.

The main body apparatus 2 includes a touch panel 13 on the screen of the display 12. In the exemplary embodiment, the touch panel 13 is of a type capable of receiving a multi-touch input (e.g., an electrical capacitance type). However, the touch panel 13 may be of any type, and may be, for example, of a type capable of receiving a single-touch input (e.g., a resistive film type).

The main body apparatus 2 includes speakers (i.e., speakers 88 shown in FIG. 6) within the housing 11. As shown in FIG. 3, speaker holes 11a and 11b are formed on the main surface of the housing 11. Sounds outputted from the speakers 88 are outputted through the speaker holes 11a and 11b.

Further, the main body apparatus 2 includes a left terminal 17, which is a terminal for the main body apparatus 2 to perform wired communication with the left controller 3, and a right terminal 21, which is a terminal for the main body apparatus 2 to perform wired communication with the right controller 4.

As shown in FIG. 3, the main body apparatus 2 includes a slot 23. The slot 23 is provided on an upper side surface of the housing 11. The slot 23 is so shaped as to allow a predetermined type of storage medium to be attached to the slot 23. The predetermined type of storage medium is, for example, a dedicated storage medium (e.g., a dedicated memory card) for the game system 1 and an information processing apparatus of the same type as the game system 1. The predetermined type of storage medium is used to store, for example, data (e.g., saved data of an application or the like) used by the main body apparatus 2 and/or a program (e.g., a program for an application or the like) executed by the main body apparatus 2. Further, the main body apparatus 2 includes a power button 28.

The main body apparatus 2 includes a lower terminal 27. The lower terminal 27 is a terminal for the main body apparatus 2 to communicate with a cradle. In the exemplary embodiment, the lower terminal 27 is a USB connector (more specifically, a female connector). Further, when the unified apparatus or the main body apparatus 2 alone is mounted on the cradle, the game system 1 can display, on a stationary monitor, an image generated by and outputted from the main body apparatus 2. Further, in the exemplary embodiment, the cradle has the function of charging the unified apparatus or the main body apparatus 2 alone mounted on the cradle. Further, the cradle has the function of a hub device (specifically, a USB hub).

FIG. 4 is six orthogonal views showing an example of the left controller 3. As shown in FIG. 4, the left controller 3 includes a housing 31. In the exemplary embodiment, the housing 31 has a vertically long shape, i.e., is shaped to be long in an up-down direction (i.e., the y-axis direction shown in FIGS. 1 and 4). In the state where the left controller 3 is detached from the main body apparatus 2, the left controller 3 can also be held in the orientation in which the left controller 3 is vertically long. The housing 31 has such a shape and size that, when held in the orientation in which the housing 31 is vertically long, the housing 31 can be held with one hand, particularly the left hand. Further, the left controller 3 can also be held in the orientation in which the left controller 3 is horizontally long. When held in the orientation in which the left controller 3 is horizontally long, the left controller 3 may be held with both hands.

The left controller 3 includes a left analog stick (hereinafter referred to as a "left stick") 32. As shown in FIG. 4, the left stick 32 is provided on a main surface of the housing 31. The left stick 32 can be used as a direction input section with which a direction can be inputted. The user tilts the left stick 32 and thereby can input a direction corresponding to the direction of the tilt (and input a magnitude corresponding to the angle of the tilt). It should be noted that the left controller 3 may include a directional pad, a slide stick that allows a slide input, or the like as the direction input section, instead of the analog stick. Further, in the exemplary embodiment, it is possible to provide an input by pressing the left stick 32.
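The mapping from a stick tilt to an input direction and magnitude can be pictured as follows. This is a hedged illustrative sketch, not code from the patent: the function name, the dead-zone threshold, and the degree-based angle convention are all assumptions made for the example.

```python
import math

# Dead-zone threshold below which a tilt is treated as "no direction input".
# The value 0.15 is an arbitrary illustrative choice, not from the patent.
DEAD_ZONE = 0.15

def read_stick(x, y):
    """Convert raw stick coordinates (each in [-1.0, 1.0]) into an input
    angle in degrees (0 = right, counterclockwise) and a magnitude in
    [0.0, 1.0]. Returns None when the tilt is inside the dead zone."""
    magnitude = min(math.hypot(x, y), 1.0)
    if magnitude < DEAD_ZONE:
        return None
    angle = math.degrees(math.atan2(y, x)) % 360.0
    return angle, magnitude

# A full tilt to the right yields an angle of 0 degrees at full magnitude.
print(read_stick(1.0, 0.0))  # (0.0, 1.0)
```

A real controller would additionally calibrate for stick drift, but the direction-plus-magnitude decomposition shown here is the essential step.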

The left controller 3 includes various operation buttons. The left controller 3 includes four operation buttons 33 to 36 (specifically, a right direction button 33, a down direction button 34, an up direction button 35, and a left direction button 36) on the main surface of the housing 31. Further, the left controller 3 includes a record button 37 and a "−" (minus) button 47. The left controller 3 includes a first L-button 38 and a ZL-button 39 in an upper left portion of a side surface of the housing 31. Further, the left controller 3 includes a second L-button 43 and a second R-button 44 on the side surface of the housing 31 on which the left controller 3 is attached to the main body apparatus 2. These operation buttons are used to give instructions depending on various programs (e.g., an OS program and an application program) executed by the main body apparatus 2.

Further, the left controller 3 includes a terminal 42 for the left controller 3 to perform wired communication with the main body apparatus 2.

FIG. 5 is six orthogonal views showing an example of the right controller 4. As shown in FIG. 5, the right controller 4 includes a housing 51. In the exemplary embodiment, the housing 51 has a vertically long shape, i.e., is shaped to be long in the up-down direction. In the state where the right controller 4 is detached from the main body apparatus 2, the right controller 4 can also be held in the orientation in which the right controller 4 is vertically long. The housing 51 has such a shape and size that, when held in the orientation in which the housing 51 is vertically long, the housing 51 can be held with one hand, particularly the right hand. Further, the right controller 4 can also be held in the orientation in which the right controller 4 is horizontally long. When held in the orientation in which the right controller 4 is horizontally long, the right controller 4 may be held with both hands.

Similarly to the left controller 3, the right controller 4 includes a right analog stick (hereinafter referred to as a "right stick") 52 as a direction input section. In the exemplary embodiment, the right stick 52 has the same configuration as that of the left stick 32 of the left controller 3. Further, the right controller 4 may include a directional pad, a slide stick that allows a slide input, or the like, instead of the analog stick. Further, similarly to the left controller 3, the right controller 4 includes four operation buttons 53 to 56 (specifically, an A-button 53, a B-button 54, an X-button 55, and a Y-button 56) on a main surface of the housing 51. Further, the right controller 4 includes a "+" (plus) button 57 and a home button 58. Further, the right controller 4 includes a first R-button 60 and a ZR-button 61 in an upper right portion of a side surface of the housing 51. Further, similarly to the left controller 3, the right controller 4 includes a second L-button 65 and a second R-button 66.

Further, the right controller 4 includes a terminal 64 for the right controller 4 to perform wired communication with the main body apparatus 2.

FIG. 6 is a block diagram showing an example of the internal configuration of the main body apparatus 2. The main body apparatus 2 includes components 81 to 91, 97, and 98 shown in FIG. 6 in addition to the components shown in FIG. 3. Some of the components 81 to 91, 97, and 98 may be mounted as electronic components on an electronic circuit board and accommodated in the housing 11.

The main body apparatus 2 includes a processor 81. The processor 81 is an information processing section for executing various types of information processing to be executed by the main body apparatus 2. For example, the processor 81 may be composed only of a CPU (Central Processing Unit), or may be composed of an SoC (System-on-a-Chip) having a plurality of functions such as a CPU function and a GPU (Graphics Processing Unit) function. The processor 81 executes an information processing program (e.g., a game program) stored in a storage section (specifically, an internal storage medium such as a flash memory 84, an external storage medium attached to the slot 23, or the like), thereby performing the various types of information processing.

The main body apparatus 2 includes the flash memory 84 and a DRAM (Dynamic Random Access Memory) 85 as examples of internal storage media built into the main body apparatus 2. The flash memory 84 and the DRAM 85 are connected to the processor 81. The flash memory 84 is a memory mainly used to store various data (or programs) to be saved in the main body apparatus 2. The DRAM 85 is a memory used to temporarily store various data used for information processing.

The main body apparatus 2 includes a slot interface (hereinafter abbreviated as "I/F") 91. The slot I/F 91 is connected to the processor 81. The slot I/F 91 is connected to the slot 23, and in accordance with an instruction from the processor 81, reads and writes data from and to the predetermined type of storage medium (e.g., a dedicated memory card) attached to the slot 23.

The processor 81 appropriately reads and writes data from and to the flash memory 84, the DRAM 85, and each of the above storage media, thereby performing the above information processing.

The main body apparatus 2 includes a network communication section 82. The network communication section 82 is connected to the processor 81. The network communication section 82 communicates (specifically, through wireless communication) with an external apparatus via a network. In the exemplary embodiment, as a first communication form, the network communication section 82 connects to a wireless LAN and communicates with an external apparatus using a method compliant with the Wi-Fi standard. Further, as a second communication form, the network communication section 82 wirelessly communicates with another main body apparatus 2 of the same type, using a predetermined method for communication (e.g., communication based on a unique protocol or infrared light communication). It should be noted that the wireless communication in the second communication form achieves the function of enabling so-called "local communication", in which the main body apparatus 2 can wirelessly communicate with another main body apparatus 2 placed in a closed local network area, and the plurality of main body apparatuses 2 directly communicate with each other to transmit and receive data.

The main body apparatus 2 includes a controller communication section 83. The controller communication section 83 is connected to the processor 81. The controller communication section 83 wirelessly communicates with the left controller 3 and/or the right controller 4. The communication method between the main body apparatus 2 and the left controller 3 and the right controller 4 is discretionary. In the exemplary embodiment, the controller communication section 83 performs communication compliant with the Bluetooth (registered trademark) standard with the left controller 3 and with the right controller 4.

The processor 81 is connected to the left terminal 17, the right terminal 21, and the lower terminal 27. When performing wired communication with the left controller 3, the processor 81 transmits data to the left controller 3 via the left terminal 17 and also receives operation data from the left controller 3 via the left terminal 17. Further, when performing wired communication with the right controller 4, the processor 81 transmits data to the right controller 4 via the right terminal 21 and also receives operation data from the right controller 4 via the right terminal 21. Further, when communicating with the cradle, the processor 81 transmits data to the cradle via the lower terminal 27. As described above, in the exemplary embodiment, the main body apparatus 2 can perform both wired communication and wireless communication with each of the left controller 3 and the right controller 4. Further, when the unified apparatus obtained by attaching the left controller 3 and the right controller 4 to the main body apparatus 2, or the main body apparatus 2 alone, is attached to the cradle, the main body apparatus 2 can output data (e.g., image data or sound data) to the stationary monitor or the like via the cradle.

Here, the main body apparatus 2 can communicate with a plurality of left controllers 3 simultaneously (in other words, in parallel). Further, the main body apparatus 2 can communicate with a plurality of right controllers 4 simultaneously (in other words, in parallel). Thus, a plurality of users can simultaneously provide inputs to the main body apparatus 2, each using a set of the left controller 3 and the right controller 4. As an example, a first user can provide an input to the main body apparatus 2 using a first set of the left controller 3 and the right controller 4, and simultaneously, a second user can provide an input to the main body apparatus 2 using a second set of the left controller 3 and the right controller 4.

The main body apparatus 2 includes a touch panel controller 86, which is a circuit for controlling the touch panel 13. The touch panel controller 86 is connected between the touch panel 13 and the processor 81. On the basis of a signal from the touch panel 13, the touch panel controller 86 generates data indicating the position at which a touch input has been performed, for example, and outputs the data to the processor 81.

Further, the display 12 is connected to the processor 81. The processor 81 displays a generated image (e.g., an image generated by executing the above information processing) and/or an externally acquired image on the display 12.

The main body apparatus 2 includes a codec circuit 87 and speakers (specifically, a left speaker and a right speaker) 88. The codec circuit 87 is connected to the speakers 88 and a sound input/output terminal 25 and is also connected to the processor 81. The codec circuit 87 is a circuit for controlling the input and output of sound data to and from the speakers 88 and the sound input/output terminal 25.

The main body apparatus 2 includes a power control section 97 and a battery 98. The power control section 97 is connected to the battery 98 and the processor 81. Further, although not shown in FIG. 6, the power control section 97 is connected to components of the main body apparatus 2 (specifically, components that receive power supplied from the battery 98, the left terminal 17, and the right terminal 21). Based on a command from the processor 81, the power control section 97 controls the supply of power from the battery 98 to the above components.

Further, the battery 98 is connected to the lower terminal 27. When an external charging device (e.g., the cradle) is connected to the lower terminal 27 and power is supplied to the main body apparatus 2 via the lower terminal 27, the battery 98 is charged with the supplied power.

FIG. 7 is a block diagram showing examples of the internal configurations of the main body apparatus 2, the left controller 3, and the right controller 4. It should be noted that the details of the internal configuration of the main body apparatus 2 are shown in FIG. 6 and therefore are omitted in FIG. 7.

The left controller 3 includes a communication control section 101, which communicates with the main body apparatus 2. As shown in FIG. 7, the communication control section 101 is connected to components including the terminal 42. In the exemplary embodiment, the communication control section 101 can communicate with the main body apparatus 2 through both wired communication via the terminal 42 and wireless communication not via the terminal 42. The communication control section 101 controls the method for communication performed by the left controller 3 with the main body apparatus 2. That is, when the left controller 3 is attached to the main body apparatus 2, the communication control section 101 communicates with the main body apparatus 2 via the terminal 42. Further, when the left controller 3 is detached from the main body apparatus 2, the communication control section 101 wirelessly communicates with the main body apparatus 2 (specifically, the controller communication section 83). The wireless communication between the communication control section 101 and the controller communication section 83 is performed in accordance with the Bluetooth (registered trademark) standard, for example.

Further, the left controller 3 includes a memory 102 such as a flash memory. The communication control section 101 includes, for example, a microcomputer (or a microprocessor) and executes firmware stored in the memory 102, thereby performing various processes.

The left controller 3 includes buttons 103 (specifically, the buttons 33 to 39, 43, 44, and 47). Further, the left controller 3 includes the left stick 32. Each of the buttons 103 and the left stick 32 outputs information regarding an operation performed on itself to the communication control section 101 repeatedly at appropriate timings.

The left controller 3 includes inertial sensors. Specifically, the left controller 3 includes an acceleration sensor 104. Further, the left controller 3 includes an angular velocity sensor 105. In the exemplary embodiment, the acceleration sensor 104 detects the magnitudes of accelerations along three predetermined axial directions (e.g., the xyz axes shown in FIG. 4). It should be noted that the acceleration sensor 104 may detect an acceleration along one axial direction or accelerations along two axial directions. In the exemplary embodiment, the angular velocity sensor 105 detects angular velocities about three predetermined axes (e.g., the xyz axes shown in FIG. 4). It should be noted that the angular velocity sensor 105 may detect an angular velocity about one axis or angular velocities about two axes. Each of the acceleration sensor 104 and the angular velocity sensor 105 is connected to the communication control section 101. The detection results of the acceleration sensor 104 and the angular velocity sensor 105 are outputted to the communication control section 101 repeatedly at appropriate timings.
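To make the role of the angular velocity sensor concrete, the following is a minimal sketch, not from the patent, of how per-sample angular velocities could be integrated into an orientation estimate. Simple per-axis Euler integration is an assumption made for brevity; a production implementation would more likely use quaternions to avoid gimbal lock.

```python
def integrate_angular_velocity(orientation, angular_velocity, dt):
    """Advance an (x, y, z) orientation estimate, in degrees, by one sample
    of angular velocity (degrees per second per axis) over dt seconds."""
    return tuple(o + w * dt for o, w in zip(orientation, angular_velocity))

# Rotating at 90 deg/s about the y axis, sampled 100 times at 10 ms
# intervals, accumulates a 90-degree rotation about that axis.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = integrate_angular_velocity(pose, (0.0, 90.0, 0.0), 0.01)
print(round(pose[1], 6))  # 90.0
```

The acceleration sensor would typically be used alongside this, e.g., to correct long-term drift using the gravity direction.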

The communication control section 101 acquires information regarding an input (specifically, information regarding an operation or the detection result of a sensor) from each of the input sections (specifically, the buttons 103, the left stick 32, and the sensors 104 and 105). The communication control section 101 transmits operation data including the acquired information (or information obtained by performing predetermined processing on the acquired information) to the main body apparatus 2. It should be noted that the operation data is transmitted repeatedly, once every predetermined time. It should be noted that the interval at which the information regarding an input is transmitted from each of the input sections to the main body apparatus 2 may or may not be the same.
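The operation data described above can be pictured as a small per-cycle packet combining button, stick, and sensor samples. The sketch below is purely illustrative; the field names and sample values are assumptions, since the patent does not specify a data layout.

```python
def build_operation_data(buttons, stick, accel, gyro):
    """Assemble one transmission cycle's worth of operation data.
    Field names are hypothetical, chosen only for this illustration."""
    return {
        "buttons": buttons,            # set of currently pressed button names
        "stick": stick,                # (x, y) tilt of the analog stick
        "acceleration": accel,         # one 3-axis acceleration sample
        "angular_velocity": gyro,      # one 3-axis angular velocity sample
    }

# Example: A-button held while the stick is tilted down-right,
# with the controller at rest (gravity on the y axis).
packet = build_operation_data({"A"}, (0.3, -0.8),
                              (0.0, -9.8, 0.0), (0.0, 0.0, 0.0))
print(sorted(packet))  # ['acceleration', 'angular_velocity', 'buttons', 'stick']
```

The main body side would consume one such packet per predetermined interval, matching the repeated transmission described above.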

The above operation data is transmitted to the main body apparatus 2, whereby the main body apparatus 2 can obtain the inputs provided to the left controller 3. That is, the main body apparatus 2 can determine operations on the buttons 103 and the left stick 32 based on the operation data. Further, the main body apparatus 2 can calculate information regarding the motion and/or the orientation of the left controller 3 based on the operation data (specifically, the detection results of the acceleration sensor 104 and the angular velocity sensor 105).

The left controller 3 includes a power supply section 108. In the exemplary embodiment, the power supply section 108 includes a battery and a power control circuit. Although not shown in FIG. 7, the power control circuit is connected to the battery and is also connected to components of the left controller 3 (specifically, components that receive power supplied from the battery).

As shown in FIG. 7, the right controller 4 includes a communication control section 111, which communicates with the main body apparatus 2. Further, the right controller 4 includes a memory 112, which is connected to the communication control section 111. The communication control section 111 is connected to components including the terminal 64. The communication control section 111 and the memory 112 have functions similar to those of the communication control section 101 and the memory 102, respectively, of the left controller 3. Thus, the communication control section 111 can communicate with the main body apparatus 2 through both wired communication via the terminal 64 and wireless communication not via the terminal 64 (specifically, communication compliant with the Bluetooth (registered trademark) standard). The communication control section 111 controls the method for communication performed by the right controller 4 with the main body apparatus 2.

The right controller 4 includes input sections similar to the input sections of the left controller 3. Specifically, the right controller 4 includes buttons 113, the right stick 52, and inertial sensors (an acceleration sensor 114 and an angular velocity sensor 115). These input sections have functions similar to those of the input sections of the left controller 3 and operate similarly.

The right controller 4 includes a power supply section 118. The power supply section 118 has a function similar to that of the power supply section 108 of the left controller 3 and operates similarly.

[Outline of Game Processing of Exemplary Embodiment]

Next, the outline of the operation of game processing executed by the game system 1 according to the exemplary embodiment will be described. Here, in the above game system 1, the left controller 3 and the right controller 4 are attachable to and detachable from the main body apparatus 2. When a game is played in the state where the left controller 3 and the right controller 4 are attached to the main body apparatus 2, the game image is outputted to the display 12. When the main body apparatus 2 alone, in the state where the left controller 3 and the right controller 4 are detached from the main body apparatus 2, is attached to the cradle, the main body apparatus 2 can also output the game image to a stationary monitor or the like via the cradle. In the exemplary embodiment, description is given of an example case where the game is played in the latter form, i.e., the form in which the main body apparatus 2 alone, in the state where the left controller 3 and the right controller 4 are detached, is attached to the cradle, and the main body apparatus 2 outputs the game image and the like to the stationary monitor or the like via the cradle.

The game (hereinafter referred to as "this game") assumed in the exemplary embodiment is an action adventure game in which a player object is operated in a virtual three-dimensional space. FIG. 8 is an example of a game screen assumed in the exemplary embodiment. In FIG. 8, how the three-dimensional virtual game space is seen is shown on the game screen. In the three-dimensional virtual game space, a player object 201 is present, and an image seen from behind and captured by a virtual camera is displayed. The entire body of the player object 201 is displayed on the game screen. Further, the player object 201 carries a sword object 204 on its back. The sword object 204 is in a state of being in a sheath, and the sword object 204 can be drawn from the sheath through an operation described later. Hereinafter, drawing the sword object 204 from the sheath will be referred to as "sword drawing". FIG. 9 shows an example of the screen regarding the player object 201 after sword drawing has been performed. FIG. 9 shows a state where the sword object 204 is gripped with the right hand of the player object 201. In this state, the player can move the sword object 204 within a range based on the movable range of the right arm of the player object 201.

In this game, the player object 201 can be caused to perform a movement of "swinging" the sword object 204. Thus, the player object 201 can attack a predetermined enemy object (not shown) by swinging the sword object 204. Further, the player object 201 can also "slash" a predetermined object other than an enemy, such as a plant object. The processing according to the exemplary embodiment is processing regarding an operation for causing the player object 201 to perform a movement of "swinging" the sword object 204. In the exemplary embodiment, the movement of "swinging" the sword object 204 includes two actions, i.e., a "posturing action" and a "swinging action" described later.

Next, before describing specific operation contents, the "state (in the game)" of the player object 201 related to the movement of "swinging" as described above, and the transitions thereof, will be described. Hereinafter, the state in the game of the player object 201 will be referred to as the "PO state". First, in this game, the PO state where the sword object 204 is in the sheath as shown in FIG. 8 is referred to as the "non-sword-drawn state". In this state, when a predetermined sword drawing operation is performed, a sword drawing motion is displayed, and then, as shown in FIG. 9, the player object 201 enters a state where the player object 201 holds the sword object 204 with the right hand. In the exemplary embodiment, this PO state is referred to as the "posturing state". In the posturing state, the player can cause the player object 201 to perform a posturing action. The posturing action is an action of changing the position and orientation of the sword object 204 within the movable range of the right arm of the player object 201. That is, the posturing action is an action in which the player object 201 takes a posture of holding the sword before swinging it. FIG. 10 shows an example in which the player object 201 takes a posture of holding the sword object 204 with its tip directed toward the right. In the posturing state, the orientation of the sword object 204 can be changed by an operation described later. Then, in the posturing state, when a predetermined condition has been satisfied (i.e., when a predetermined operation has been performed), a motion of actually swinging the sword object 204 is displayed. Hereinafter, the PO state in the middle of swinging the sword object 204 will be referred to as the "swinging state", and the motion of swinging the sword will be referred to as the "swinging action". FIG. 11 shows an example of the player object 201 in the "swinging state". In the example in FIG. 11, a swinging action like a horizontal slash from the right toward the left is shown.

FIG. 12 shows a transition relationship between the PO states described above. In FIG. 12, first, when a "sword drawing operation" as described later is performed in the non-sword-drawn state, the PO state transitions to the posturing state described above. Next, in the posturing state, when a condition for transitioning to the swinging action described above has been satisfied, the PO state transitions to the swinging state. Details of this condition will be described later; hereinafter, this condition will be referred to as a "sword-swinging execution condition". When the PO state has transitioned to the swinging state, a motion related to the swinging action as described above is displayed. Then, when the swinging action ends, the PO state returns (transitions) to the posturing state. Then, in the posturing state, when a "sword sheathing operation" for putting the sword object 204 into the sheath is performed, the PO state transitions to the above non-sword-drawn state. Therefore, for example, in the exemplary embodiment, when an enemy object is attacked by using the sword object 204, a series of actions comprising two actions, i.e., the above-described posturing action and swinging action, is performed. In other words, in the exemplary embodiment, the movement of swinging the sword object 204 is realized as a series of actions "from posturing to swinging".
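The PO-state transitions described above can be sketched as a small table-driven state machine. The state and event names below are illustrative assumptions, not terms from the specification:

```python
# Minimal sketch of the PO-state transitions (non-sword-drawn -> posturing
# -> swinging -> posturing -> non-sword-drawn). Names are illustrative.
TRANSITIONS = {
    ("non_sword_drawn", "sword_draw"): "posturing",
    ("posturing", "sword_swing_execution"): "swinging",
    ("swinging", "swing_end"): "posturing",
    ("posturing", "sword_sheathe"): "non_sword_drawn",
}

def next_po_state(state: str, event: str) -> str:
    """Return the next PO state; an event that does not apply leaves the state unchanged."""
    return TRANSITIONS.get((state, event), state)
```

For example, a full attack cycle would step through `sword_draw`, `sword_swing_execution`, and `swing_end`, ending back in the posturing state.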

[Operation Mode]

Next, specific operations for causing the movements described above are described. In the exemplary embodiment, two operation modes are provided for performing these movements. A first operation mode is an operation mode that mainly uses the right stick 52 provided to the right controller 4. A second operation mode is an operation mode that uses the inertial sensors of the right controller 4. It should be noted that, in the exemplary embodiment, only one of these operation modes is set at a time. For example, on a game setting screen, an item for setting an operation mode is provided, and either one of the operation modes is designated. Therefore, when the first operation mode is set, an operation according to the second operation mode cannot be performed, and vice versa.

[Second Operation Mode]

Next, each operation mode will be described. For convenience of description, the second operation mode is described first. As described above, the second operation mode uses the inertial sensors of the right controller 4. Specifically, in the second operation mode, the player regards the right controller 4 as the sword object 204, and can change the orientation of, or swing, the sword object 204 by tilting or swinging the right controller 4.

In an example of the operation, first, in the non-sword-drawn state, when the player presses a predetermined button, the player object 201 performs sword drawing and the PO state transitions to the posturing state. At this time, on the basis of outputs from the inertial sensors, the orientation of the right controller 4 is calculated, and the orientation is reflected in the orientation of the sword object 204. For example, the inertial sensors include a gyro sensor, and the orientation of the right controller 4 is calculated on the basis of the angular velocity detected by the gyro sensor. Here, a case where the orientation as shown in FIG. 9 has been established as a result of a sword drawing operation is assumed. Next, it is assumed that, when the screen is in the state as shown in FIG. 9, the player has horizontally stretched the right arm and the right controller 4 has also taken a substantially horizontal orientation. In this case, the orientation of the right controller 4 is calculated on the basis of outputs from the inertial sensors, and this orientation is reflected in the position and orientation of the sword object 204. As a result, the position and orientation of the sword object 204 (and the player object 201) are changed to those shown in FIG. 10.
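The gyro-based orientation calculation can be sketched, very roughly, as integrating angular velocity over each frame. The actual system tracks full three-dimensional orientation; this single-axis version is only an assumed simplification to show the idea:

```python
def update_orientation(angle_rad: float, angular_velocity_rad_s: float, dt_s: float) -> float:
    """Integrate a single-axis angular velocity (from a gyro sensor) over one frame.

    A one-axis sketch: the real orientation calculation would accumulate
    rotations about all three axes, typically as a quaternion or matrix.
    """
    return angle_rad + angular_velocity_rad_s * dt_s
```

Each frame, the new angle would then be reflected in the orientation of the sword object 204.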

Further, it is assumed that, when the game screen is in the state as shown in FIG. 10, the player swings the right controller 4 in the left direction at not less than a predetermined speed. In this case, a motion of swinging the controller is detected on the basis of outputs from the inertial sensors. For example, the inertial sensors include an acceleration sensor, and it is determined that the controller has been swung when an acceleration of not less than a predetermined value has been detected. As a result, the PO state transitions to the swinging state, and the change in the orientation of the right controller 4 in this swinging motion is reflected in the change in the position and orientation of the sword object 204. As a result, the player object 201 performs the swinging action as shown in FIG. 11.
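The swing determination described above can be sketched as a threshold test on the magnitude of the measured acceleration. The threshold value is an assumption; the specification only says "not less than a predetermined value":

```python
import math

# Assumed threshold (in g); the specification leaves the value unspecified.
SWING_ACCEL_THRESHOLD = 3.0

def is_swing(accel_xyz) -> bool:
    """Return True if the acceleration magnitude reaches the swing threshold."""
    ax, ay, az = accel_xyz
    return math.sqrt(ax * ax + ay * ay + az * az) >= SWING_ACCEL_THRESHOLD
```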

As described above, in the second operation mode, on the basis of outputs of the inertial sensors, the orientation of the right controller 4 is calculated, and swinging is determined. Then, the calculated orientation of the right controller 4 is reflected in the sword object 204, whereby a series of actions comprising the above-described posturing action and swinging action is realized. Accordingly, the sword object 204 can be swung by moving the right controller 4 itself, and thus, an intuitive operation can be performed. In addition, a user experience as if the user were actually swinging a sword can be provided.

[First Operation Mode]

As described above, in the second operation mode, the inertial sensors are used, whereby an operation method of moving the right controller 4 itself is provided. Here, in the game system 1 according to the exemplary embodiment, as described above, the left controller 3 and the right controller 4 are attachable to and detachable from the main body apparatus 2. Therefore, there are also cases where the game screen is displayed on the display 12 when the game is played in the state where the left controller 3 and the right controller 4 are attached to the main body apparatus 2 as shown in FIG. 1. In such a case, the operations described above using the inertial sensors of the right controller 4 are considered difficult to perform. In addition, the player may simply not prefer an operation method such as the second operation mode in the first place. Therefore, the first operation mode is provided in the exemplary embodiment. In this operation mode, it is possible to perform a series of actions comprising the posturing action and the swinging action, as in the case of the second operation mode, through a simple direction input operation using the right stick 52. Therefore, even in the state where the left controller 3 and the right controller 4 are attached to the main body apparatus 2, it is possible to cause the player object 201 to perform the movement of swinging the sword object 204, as in the case of the second operation mode.

Next, the outline of operations and processing in the first operation mode is described. First, the handling of operation input data of the right stick 52 is described. In the first operation mode, operation input data from the right stick 52 is obtained as a two-dimensional value. That is, it is assumed that operation input data from the right stick 52 is obtained as a two-dimensional value of (x, y). Then, in the exemplary embodiment, a two-dimensional plane (hereinafter, stick plane) of a coordinate system in which, with the right stick 52 viewed from immediately above, the right direction is the x-axis positive direction and the up direction is the y-axis positive direction, is assumed. In this stick plane, the position of the right stick 52 in a neutral state (the state where the right stick 52 is not tilted in any direction) is defined as the origin. In this case, the vector (hereinafter referred to as an input vector) connecting this origin and the coordinate represented by the above-described two-dimensional value indicates the input strength and the input direction. That is, the length of the input vector indicates the degree of tilt of the right stick 52, and the orientation of the input vector indicates the input direction.
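The input vector described above can be computed directly from the two-dimensional stick value. A minimal sketch, assuming the stick plane conventions above (right = +x, up = +y, neutral at the origin):

```python
import math

def input_vector(x: float, y: float):
    """Return (length, direction) of the input vector on the stick plane.

    length encodes the degree of tilt of the stick; direction is the
    input direction in radians (0 = right, pi/2 = up).
    """
    length = math.hypot(x, y)
    direction = math.atan2(y, x)
    return length, direction
```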

Next, the relationship between the input content of the right stick 52 and the posturing action and swinging action is described. In the exemplary embodiment, the stick plane is associated with an x-y plane that is defined with the player object 201 seen from behind. FIG. 13 shows the correspondence between the stick plane and the x-y plane (hereinafter referred to as a player object plane). In FIG. 13, a circular stick plane is shown on the left side. Further, in FIG. 13, the player object 201 is shown on the right, and the player object plane is indicated by a dotted line surrounding the player object 201. It should be noted that the outer circumference of the circle of the stick plane indicates the input limit (where the right stick 52 is tilted to the maximum) of the right stick 52. In FIG. 13, a plane in which the player object 201 is seen from behind is assumed, and a correspondence in which a substantially central portion of the player object 201 matches the origin of the stick plane is assumed. It should be noted that, in another example, the root portion of the right arm and the origin of the stick plane may be associated with each other.

On the assumption of the above correspondence relationship, the following operation can be performed in the first operation mode. First, it is assumed that the sword drawing operation is the same as that in the second operation mode described above. When sword drawing has been performed, a predetermined sword drawing motion is displayed, and then, the PO state transitions to the posturing state. In this posturing state, the orientation of the sword object 204 can be changed in accordance with an input of the right stick 52. In the case where the input of the right stick 52 is neutral, a predetermined orientation corresponding thereto is taken.

FIG. 14 shows an example of a case where the right stick 52 has been tilted to the maximum in the right direction. In FIG. 14, the stick plane and the input vector thereof (arrow) are shown on the left side, and the orientation of the sword object 204 (and the player object 201) corresponding to this input is shown on the right side. As shown in FIG. 14, when the right stick 52 has been tilted to the maximum in the right direction, the orientation of the sword object 204 is also changed, in accordance with this, to an orientation in which the tip of the sword object 204 is directed to the right. It should be noted that the position in the depth direction (on the z axis) may be any position corresponding to the game content, but basically is a position that is influenced by the movable range of the right arm of the player object 201 holding the sword object 204.

A case in which, from the orientation shown in FIG. 14, only the input direction of the right stick 52 is further rotated counter-clockwise while the right stick 52 remains tilted to the maximum is assumed. FIGS. 15 to 18 show examples of changes in the orientation in this case. FIG. 15 shows a case where the input direction of the right stick 52 is the upper right direction. In this case, the tip of the sword object 204 is also in an orientation directed to the upper right direction on the player object plane. Further, when the input direction of the right stick 52 is changed to the straight up direction, the orientation of the tip of the sword object 204 is changed to an orientation directed to the straight up direction in the player object plane, as shown in FIG. 16. Further, when the input direction of the right stick 52 is changed to the left direction, the tip of the sword object 204 is changed to an orientation directed to the left direction in the player object plane, as shown in FIG. 17. Further, when the input direction of the right stick 52 is changed to the straight down direction, the orientation of the tip of the sword object 204 is changed to an orientation directed to the straight down direction in the player object plane, as shown in FIG. 18.

In this manner, in the first operation mode, in the posturing state, the posturing action can be performed by an input operation of the right stick 52. That is, the orientation of the sword object 204 can be changed such that (the tip of) the sword object 204 is directed to a direction corresponding to the direction input of the right stick 52.
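Because the stick plane and the player object plane share the same axes, the posturing mapping amounts to pointing the sword tip in the direction of the input vector. A minimal sketch, assuming a neutral input falls back to a predetermined default posture:

```python
import math

def sword_tip_direction(x: float, y: float):
    """Map a right-stick input (x, y) to a unit vector for the sword tip
    on the player object plane (same direction as the input vector).

    Returns None for a neutral input, where a predetermined default
    orientation would be used instead.
    """
    length = math.hypot(x, y)
    if length == 0.0:
        return None
    return (x / length, y / length)
```

For instance, a maximum tilt to the right maps to a tip pointing right, matching FIG. 14.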

Next, the swinging action in the first operation mode is described. In the exemplary embodiment, in the posturing state, when the player releases the finger from the right stick 52 in a state where the right stick 52 is tilted to the maximum, the posturing state can be caused to transition to the swinging state. The right stick 52 is a stick that (naturally) returns to the neutral position when the finger is released in a state where the right stick 52 is tilted in a certain direction. Accordingly, it is possible to return the right stick 52 to the neutral state almost instantaneously by releasing the finger from the right stick 52 in a state where the right stick 52 is tilted. In the first operation mode, using the release of the finger (cancelation of the direction input of the right stick) in a state where the right stick 52 is tilted to the maximum as a trigger, the PO state of the player object 201 is transitioned from the posturing state to the swinging state, whereby the swinging action as described above is performed. Hereinafter, the operation of canceling the direction input serving as a trigger for the swinging action will be referred to as a "release operation".

Here, a supplementary description of the determination method of the release operation is given. Each of FIGS. 19 and 20 illustrates the determination method of the release operation, and each shows the stick plane. First, in FIG. 19, a first threshold 131 is shown at a position slightly inside the outer circumference of the stick plane. In the exemplary embodiment, in the posturing state, when the release operation is performed while the length of the input vector from the origin exceeds the first threshold 131, the swinging action can be performed. Conversely, when the finger is released from the right stick 52 in a state where the length of the input vector does not exceed the first threshold, the release operation is not established. That is, the player object 201 does not enter the swinging state and maintains the posturing state, and the orientation of the sword object 204 is merely changed (i.e., the sword object 204 takes the orientation corresponding to the neutral state). That is, in order to perform the swinging action, the right stick 52 is required to be tilted to a certain large extent. In the description below, the first threshold 131 will be referred to as a "swing occurrence threshold".

Next, in FIG. 20, a second threshold 132 is shown slightly outside the origin of the stick plane. In the exemplary embodiment, when the length of the input vector has changed, within a predetermined time, e.g., within three frames, from an input state where the length of the input vector exceeds the swing occurrence threshold 131 as in FIG. 19 to a state where the length of the input vector does not exceed the second threshold 132, it is determined that the above-described release operation has occurred. That is, in a case where the finger has been released from the stick, when the time until the stick returns to the vicinity of the neutral position is sufficiently short, it is determined that the above-described operation of releasing the finger from the stick has been performed. In the description below, the second threshold 132 will be referred to as a "release determination threshold". It should be noted that the aforementioned "within three frames" is merely an example, and an appropriate number of frames may be used in accordance with the game content.
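The two-threshold determination can be sketched over a short history of input-vector lengths. The threshold values are assumptions (the specification only places them "slightly inside the outer circumference" and "slightly outside the origin"); the three-frame window follows the example above:

```python
# Assumed values on a stick plane of radius 1.0; only their relative
# placement (near the rim, near the origin) comes from the description.
SWING_OCCURRENCE_THRESHOLD = 0.9
RELEASE_DETERMINATION_THRESHOLD = 0.1
RELEASE_WINDOW_FRAMES = 3  # "within three frames" in the example above

def release_operation_occurred(lengths) -> bool:
    """lengths: input-vector lengths for recent frames, oldest first.

    True if the input dropped from above the swing occurrence threshold
    to below the release determination threshold within the window.
    """
    for i, length in enumerate(lengths):
        if length > SWING_OCCURRENCE_THRESHOLD:
            window = lengths[i + 1 : i + 1 + RELEASE_WINDOW_FRAMES]
            if any(w < RELEASE_DETERMINATION_THRESHOLD for w in window):
                return True
    return False
```

A slow return to neutral (taking more frames than the window) is thus not treated as a release operation.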

As described above, by performing the release operation while the player object 201 is in the posturing state, the player can cause the player object 201 to perform the swinging action. The direction in which the sword object 204 is swung is a direction opposite to the direction in which the right stick 52 has been tilted. For example, as shown in FIG. 14, when the finger is released in a state where the stick is tilted to the maximum in the right direction, a swinging action of fully swinging the sword toward the left direction, as shown in FIG. 11, is performed. According to this operation method, by tilting the right stick 52 to the maximum in a direction opposite to the direction in which the player wants to swing the sword, and then releasing the finger, the player can cause the sword object 204 to be swung in the desired direction. In addition, although it is necessary to tilt the right stick 52 to a substantially maximum extent in order to perform the swinging action, the tilting direction of the right stick 52 can be changed along the outer circumference portion surrounding the shaft of the right stick 52. That is, the outer circumference portion around the shaft of the right stick 52 plays a kind of guide role, whereby aiming the direction of swinging the sword is facilitated. Therefore, the flow of the series of actions from the posturing action to the swinging action as described above facilitates accurate direction input with respect to the direction of swinging the sword object 204.
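The "swing opposite to the tilt" rule can be sketched as negating the input vector held at the moment of release. A minimal illustration, assuming the stick plane axes defined earlier:

```python
import math

def swing_direction(tilt_x: float, tilt_y: float):
    """Return the swing direction as a unit vector: the opposite of the
    direction in which the right stick was tilted at release."""
    length = math.hypot(tilt_x, tilt_y)
    return (-tilt_x / length, -tilt_y / length)
```

For example, a release from a maximum tilt to the right yields a leftward swing, matching the FIG. 14 to FIG. 11 example.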

The ease of accurate direction input due to the flow of the series of actions from the posturing action to the swinging action as described above is particularly useful when, for example, there is a gimmick that requires swinging the sword object 204 in a predetermined direction in the progress of the game. Examples of such a gimmick include an enemy object that is defeated only when the sword is swung in a predetermined direction, a trap object that is released only by being hit by the sword object 204 from a predetermined direction, and the like. In such a case, the following game processing is performed: the swing direction in the swinging action is determined; and then, whether the attack on the enemy has been successful, whether release of the trap has been successful, or the like is determined.

In the first operation mode, since the processing as described above is performed, it is also possible to perform the posturing action and the swinging action consecutively and instantaneously through an operation of flicking the right stick 52 (hereinafter, a flicking operation). As the flicking operation, for example, a case where the right stick 52 is flicked in the right direction in a state where the sword is drawn and the right stick 52 is in the neutral state is assumed. In this case, an input of the right stick 52 is provided as follows: an input vector exceeding the swing occurrence threshold 131 occurs from the neutral state, and then, the input vector changes to be smaller than the release determination threshold 132. As a result, the occurrence of the input vector exceeding the swing occurrence threshold 131 in the right direction, and the change into an input vector smaller than the release determination threshold 132 (and the determination that a release operation has occurred), occur in a very short time of about several frames. Therefore, by simply performing such a flicking operation, the player can cause the player object 201 to perform a swinging action in a direction opposite to the direction of the flicking, within a time that can be regarded as substantially "instant". Accordingly, it is possible to provide the player with an operation of instantaneously executing a swinging action through the flicking operation, in addition to an operation of executing a swinging action after the swing direction is carefully aimed through the posturing action.

In the exemplary embodiment, an example in which the player presses a predetermined button in the non-sword-drawn state, thereby transitioning the PO state to the posturing state, has been described. However, in another exemplary embodiment, in the first operation mode, sword drawing may be performed when an input of the right stick52has been provided in the non-sword-drawn state. In this case, if the flicking operation is performed, it is also possible to cause the player object201to perform a motion of suddenly executing the swinging action from the non-sword-drawn state.

With respect to the release operation, an example in which the finger is released from the right stick 52 has been described. In this regard, as the determination processing, if it can be determined that the input state as shown in FIG. 19 has changed to the input state as shown in FIG. 20 within a predetermined time, the operation is determined to be the release operation. Therefore, in a case where the finger is not released from the right stick 52 and a direction input is quickly performed from a state where the right stick 52 is tilted to the maximum to the right to a state where the right stick 52 is tilted to the maximum to the left, for example, if the condition that "the input vector becomes smaller than the release determination threshold within three frames" is satisfied, it is determined that the release operation has been performed, and the swinging action can also be performed. Therefore, for example, if operations of alternately tilting the right stick 52 to the left and right are quickly repeated, swinging actions of right swinging and left swinging are alternately repeated. Accordingly, it is also possible to cause the player object 201 to perform an action as if the player object 201 were slashing consecutively. That is, when such consecutive operations are performed, a set of the posturing action and the swinging action is consecutively executed.

As described above, in the exemplary embodiment, with respect to the posturing action and the swinging action, an operation method using the inertial sensors is provided to the player as the second operation mode. Further, also in the first operation mode, a posturing action and a swinging action similar to those in the second operation mode can be performed simply through an operation of the right stick 52. Accordingly, an appropriate operation method suitable for the player can be provided.

Further, in this game, in addition to the operations regarding the sword object 204 as described above, the following operations are also possible.

[Moving Operation]

In this game, the player object 201 can be moved by using the left stick 32. Therefore, while the player object 201 is moved by the left stick 32, an operation of the sword object 204 using the right controller 4 can also be performed. Accordingly, for example, attacking an enemy object while moving can also be performed.

[Virtual Camera Operation]

In this game, the imaging direction of the virtual camera can also be changed by operating the right stick 52 in a state where a predetermined button is pressed. For example, when the right stick 52 is operated while the first R-button 60 is pressed, the orientation of the virtual camera can be changed on the basis of the direction input therefrom. During the operation of the virtual camera, control (in each of the first and second operation modes) of the sword object 204 by the right stick 52 is not performed. That is, while the first R-button 60 is pressed, the right stick 52 is used for the virtual camera operation, and while the first R-button 60 is not pressed, the right stick 52 is used for controlling the sword object 204.
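The routing of right-stick input described above reduces to a single check on the R-button state. A minimal sketch; the target names are illustrative labels, not identifiers from the specification:

```python
def right_stick_target(first_r_button_pressed: bool) -> str:
    """While the first R-button is held, the right stick drives the
    virtual camera; otherwise it controls the sword object."""
    return "virtual_camera" if first_r_button_pressed else "sword_object"
```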

Other than these, various operations related to the progress of the game, such as operations of using various items, can be performed.

[Details of Game Processing of Exemplary Embodiment]

Next, with reference to FIGS. 21 to 27, the game processing according to the exemplary embodiment will be described in more detail.

[Data to be Used]

First, various data to be used in this game processing will be described. FIG. 21 is a memory map showing an example of various data stored in the DRAM 85 of the main body apparatus 2. The DRAM 85 of the main body apparatus 2 has stored therein a game program 301, operation data 302, operation mode information 307, PO state information 308, release operation determination data 309, sword swing parameters 310, virtual camera setting data 311, object data 312, and the like.

The game program 301 is a program for executing the game processing according to the exemplary embodiment.

The operation data 302 is data obtained from the left controller 3 and the right controller 4, and is data indicating the content of an operation performed by the player. The operation data 302 includes, at least, digital button data 303, right analog stick data 304, left analog stick data 305, and inertial sensor data 306. The digital button data 303 is data indicating the pressed states of various buttons of the controller. The right analog stick data 304 is data indicating the content of an operation performed on the right stick 52; specifically, two-dimensional (x, y) data is included. The left analog stick data 305 is data indicating the content of an operation performed on the left stick 32. The inertial sensor data 306 is data indicating detection results of the inertial sensors such as the above-described acceleration sensor and angular velocity sensor; specifically, acceleration data and angular velocity data are included.

The operation mode information 307 is information indicating whether the current operation mode is the first operation mode or the second operation mode.

The PO state information 308 is information indicating which of the above-described "non-sword-drawn state", "posturing state", and "swinging state" the player object 201 is in. The initial value is the "non-sword-drawn state".

The release operation determination data 309 is data for determining whether or not the above-described release operation has been performed. Specifically, the right analog stick data 304 corresponding to the last several frames is stored (in another example, data indicating the above-described input vector may be stored). The data is sequentially replaced starting from the oldest data.
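The "last several frames, oldest replaced first" storage described above is naturally a fixed-size ring buffer. A minimal sketch; the history length of four frames is an assumption, since the specification only says "several frames":

```python
from collections import deque

HISTORY_FRAMES = 4  # assumed; the specification says only "the last several frames"

class ReleaseOperationDeterminationData:
    """Keeps the most recent right-stick samples; storing a new frame's
    data automatically discards the oldest entry once the buffer is full."""

    def __init__(self):
        self.samples = deque(maxlen=HISTORY_FRAMES)

    def store(self, xy):
        self.samples.append(xy)
```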

The sword swing parameters 310 are various parameters for moving the sword object 204 in the swinging action. For example, the sword swing parameters 310 are parameters indicating the moving direction, the moving speed, and the like of the sword object 204.

The virtual camera setting data 311 is data for designating the imaging direction and the like of the virtual camera. In accordance with an operation of the right stick 52 with the first R-button 60 being pressed as described above, the imaging direction is changed.

The object data 312 is data indicating the appearances of various objects that appear in this game, including the player object 201. Model data, texture data, and the like are included in the object data 312.

Other than these, various data that are necessary for the game processing are also generated as appropriate and stored in the DRAM 85.

[Details of Processing Executed by the Processor 81]

Next, with reference to the flowchart in FIG. 22, details of the game processing according to the exemplary embodiment will be described. It should be noted that, in the following, processing regarding the operation of the sword object 204 as described above is mainly described, and detailed description of the other game processing is omitted.

FIG. 22 is a flowchart showing details of this game processing. In FIG. 22, first, in step S1, the processor 81 executes an initialization process for initializing data to be used in the processing thereafter. Specifically, data indicating the "non-sword-drawn state" is stored as the PO state information 308 in the DRAM 85. Further, the processor 81 sets, in the virtual camera setting data 311, various parameters (position, angle of view, gaze point) that enable display of a game image seen from behind the player object 201, as shown in FIG. 8. Further, the processor 81 constructs a three-dimensional virtual game space and places the player object 201 and the like as appropriate. A game image obtained by the virtual camera capturing the thus constructed game space is generated, and the generated game image is outputted to the stationary monitor or the like.

Next, in step S2, the processor 81 refers to the PO state information 308 and determines whether or not the PO state is the swinging state. As a result of this determination, when the PO state is the swinging state (YES in step S2), the processor 81 executes the process of step S10 described later. Meanwhile, when the PO state is not the swinging state (NO in step S2), the processor 81 determines, in step S3, whether or not the PO state is the posturing state. As a result, when the PO state is the posturing state (YES in step S3), the processor 81 advances to the process of step S9 described later.

Meanwhile, when the PO state is not the posturing state (NO in step S3), the PO state is the non-sword-drawn state, and thus, the processor 81 acquires the operation data 302 from the DRAM 85 in step S4.

Next, in step S5, the processor 81 executes a posture transition determination process. In this process, a determination and the like as to whether or not to transition the PO state from the non-sword-drawn state to the posturing state are performed. FIG. 23 is a flowchart showing details of the posture transition determination process. In FIG. 23, first, in step S21, whether or not a sword drawing motion is being displayed in the current frame is determined. In the exemplary embodiment, when an operation instructing sword drawing has been performed, a motion of the player object 201 drawing the sword object 204 is displayed over several frames. This determination is for determining whether or not this motion is being displayed. As a result of this determination, when the sword drawing motion is not being displayed (NO in step S21), the processor 81 determines, in step S22, whether or not a sword drawing operation has been performed, on the basis of the operation content indicated by the operation data 302. In the exemplary embodiment, whether or not a predetermined button assigned to the sword drawing operation has been pressed is determined. In addition, in another exemplary embodiment, it may be determined that the sword drawing operation has been performed also when, for example, the right controller 4 has been swung at not less than a predetermined acceleration, or when a direction input to the right stick 52 has occurred.

As a result of the determination above, when the sword drawing operation has been performed (YES in step S22), the processor 81 performs, in step S23, display setting for displaying the sword drawing motion. On the basis of this setting, the sword drawing motion is displayed in the game image output process of step S7 described later. In addition, due to this setting, it is determined in the determination in step S21 that the sword drawing motion is being displayed. This setting can be made by, for example, turning on a flag (not shown) indicating that the sword drawing motion is being displayed.

Meanwhile, when the sword drawing operation has not been performed (NO in step S22), the process of step S23 is skipped, and the posture transition determination process ends.

Next, a case where, as a result of the determination in step S21, it has been determined that the sword drawing motion is being displayed (YES in step S21) is described. In this case, in step S24, the processor 81 determines whether or not the sword drawing motion has ended. When the sword drawing motion has not ended (NO in step S24), the posture transition determination process ends. As a result, the sword drawing motion continues to be displayed. Meanwhile, when the sword drawing motion has ended (YES in step S24), the processor 81 sets, in step S25, ending of the display of the sword drawing motion. For example, the processor 81 turns off the above-described flag indicating that the sword drawing motion is being displayed. Next, in step S26, the processor 81 sets data indicating the posturing state in the PO state information 308. That is, in the exemplary embodiment, the PO state does not transition to the posturing state until the sword drawing motion ends. It should be noted that, in another exemplary embodiment, the PO state may be caused to transition to the posturing state at the time point when the sword drawing operation has been performed. Then, the posture transition determination process ends.
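The flag-based flow of steps S21 to S26 can be sketched as a small per-frame update. The frame counter standing in for the "motion being displayed" flag and the motion length are assumptions made for illustration:

```python
class SwordDrawState:
    """Sketch of the posture transition determination (steps S21-S26).

    A frame countdown stands in for the 'sword drawing motion is being
    displayed' flag; MOTION_FRAMES is an assumed motion length.
    """

    MOTION_FRAMES = 10  # assumed; the motion is displayed "over several frames"

    def __init__(self):
        self.drawing_motion_frames_left = 0
        self.po_state = "non_sword_drawn"

    def step(self, draw_operation_performed: bool):
        if self.drawing_motion_frames_left > 0:        # S21: motion being displayed
            self.drawing_motion_frames_left -= 1
            if self.drawing_motion_frames_left == 0:   # S24: motion has ended
                self.po_state = "posturing"            # S25-S26
        elif draw_operation_performed:                 # S22: sword drawing operation
            self.drawing_motion_frames_left = self.MOTION_FRAMES  # S23
```

Note that, as in the text, the PO state only becomes the posturing state once the motion finishes.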

With reference back to FIG. 22, next, in step S6, the processor 81 executes various game processes other than a posturing action process and a swinging action process, which are described later. Specifically, a process of moving the player object 201 on the basis of the operation content (the left analog stick data 305) of the left stick 32 is performed. In addition, on the basis of the operation content of the right stick 52 with the first R-button 60 pressed, a process of changing parameters of the virtual camera is also performed. Further, a process of storing, in the release operation determination data 309, the right analog stick data 304 in the operation data 302 acquired for the current frame is also executed. Further, in the case where the sword drawing motion is being displayed, a process of causing the player object 201 to perform a predetermined sword drawing motion is also executed. Besides these, various game processes related to the progress of the game are executed as appropriate even while the player object 201 is not performing the posturing action or the swinging action.

Next, in step S7, the processor 81 executes the game image output process. Specifically, the processor 81 generates a game image by causing the virtual camera to capture the virtual game space reflecting the results of the above processes and of the posturing action process and the swinging action process, which are described later. Then, the processor 81 outputs the game image to the stationary monitor.

Next, in step S8, the processor 81 determines whether or not an ending condition for the game according to the exemplary embodiment has been satisfied. Examples of the ending condition include the player having performed an operation for ending the game. As a result of this determination, when the determination is YES, the processor 81 ends the game processing, and when the determination is NO, the processor 81 returns to step S2 and repeats the game processing according to the exemplary embodiment. That is, until the game processing ends, the processes from step S2 to step S7 are repeated for each frame, which is the time unit for rendering.
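The per-frame structure of steps S2 to S8 can be sketched, for example, as follows. The callable stand-ins for the individual processes and the frame cap are illustrative assumptions.

```python
def game_loop(acquire_operation_data, process_frame, ending_condition_met,
              max_frames=100000):
    """Sketch of the main loop: the processes of steps S2-S7 repeat once
    per rendering frame until the ending condition of step S8 holds.

    acquire_operation_data -- stand-in for step S2
    process_frame          -- stand-in for steps S3-S7
    ending_condition_met   -- stand-in for the determination of step S8
    Returns the number of frames processed.
    """
    frame = 0
    while not ending_condition_met() and frame < max_frames:
        op = acquire_operation_data()  # step S2: acquire operation data
        process_frame(op)              # steps S3-S7: per-frame processing
        frame += 1
    return frame
```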

[Posturing Action Process]

Next, the posturing action process in step S9 is described. Each of FIGS. 24 and 25 is a flowchart showing details of the posturing action process. In FIG. 24, first, in step S31, the processor 81 acquires the operation data 302.

Next, in step S32, the processor 81 determines whether or not the current operation mode is the first operation mode (the operation mode using the right stick 52). As a result of this determination, when the current operation mode is the first operation mode (YES in step S32), the processor 81 determines, in step S33, whether or not an operation (sword sheathing operation) for ending the posturing state in the first operation mode has been performed. In the exemplary embodiment, this ending operation is pressing of a predetermined button. It should be noted that, in the exemplary embodiment, the operation for ending the posturing state is common between the first operation mode and the second operation mode. If different ending operations are adopted for the first operation mode and the second operation mode, the operation contents may be determined so as to suit the respective modes.

As a result of the determination above, when the operation for ending the posturing state has been performed (YES in step S33), the processor 81 sets the non-sword-drawn state in the PO state information 308 in step S34. In addition, the processor 81 also performs, as appropriate, display setting for a sword sheathing motion of putting the sword object 204 into the sheath. Then, the posturing action process ends.

Meanwhile, as a result of the determination in step S33, when the operation for ending the posturing state has not been performed (NO in step S33), the processor 81 next calculates, in step S35, the above-described input vector on the basis of the right analog stick data 304. Next, in step S36, the processor 81 determines whether or not the release operation as described above has occurred, on the basis of the calculated input vector and the release operation determination data 309. That is, whether or not a transition condition to the swinging state has been satisfied is determined. More specifically, the processor 81 determines whether or not the state has changed, within three frames, from the state where the input exceeds the swing occurrence threshold 131 as shown in FIG. 19 to the state where the input is smaller than the release determination threshold 132 as shown in FIG. 20. As a result of the determination, when the release operation has not occurred (NO in step S36), the processor 81 sets, in step S37, an orientation and a position of the sword object 204 on the basis of the input vector calculated in step S35. Accordingly, as the posturing action, the orientation and position of the sword object 204 can be changed in accordance with the direction input of the right stick 52. In a case where the input of the right stick 52 is neutral, a predetermined orientation and a predetermined position corresponding to this case are set. Then, the posturing action process ends.
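The release determination of step S36 can be sketched, for example, as follows. The embodiment checks whether the input fell from above the swing occurrence threshold 131 to below the release determination threshold 132 within three frames; the concrete threshold values and the buffer-based bookkeeping here are illustrative assumptions.

```python
import math
from collections import deque

SWING_OCCURRENCE_THRESHOLD = 0.9      # illustrative value for threshold 131
RELEASE_DETERMINATION_THRESHOLD = 0.2  # illustrative value for threshold 132
RELEASE_WINDOW_FRAMES = 3              # the "within three frames" window


class ReleaseDetector:
    """Detects the release operation: an input that exceeded the swing
    occurrence threshold drops below the release determination threshold
    within a small number of frames."""

    def __init__(self):
        # Recent per-frame input magnitudes (ring buffer).
        self.history = deque(maxlen=RELEASE_WINDOW_FRAMES + 1)

    def update(self, input_x, input_y):
        """Feed one frame of stick input; returns True when the release
        operation has occurred on this frame."""
        magnitude = math.hypot(input_x, input_y)
        self.history.append(magnitude)
        # Release: the newest sample is below the release threshold and
        # some earlier sample inside the window exceeded the swing
        # occurrence threshold.
        if magnitude < RELEASE_DETERMINATION_THRESHOLD:
            return any(m > SWING_OCCURRENCE_THRESHOLD
                       for m in list(self.history)[:-1])
        return False
```

A slow return to neutral that takes longer than the window does not count as a release, matching the time-bounded determination described above.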

Meanwhile, as a result of the determination in step S36, when the release operation has occurred (YES in step S36), the processor 81 sets, in step S38, data indicating a swinging state in the PO state information 308.

Next, in step S39, the processor 81 executes a swing content setting process. This process sets the contents of the sword swing parameters 310. On the basis of the contents set here, the orientation change and movement of the sword object 204 are performed in the processing for the next frame. FIG. 26 is a flowchart showing details of the swing content setting process. In FIG. 26, the processor 81 determines, in step S61, whether or not the operation mode is the first operation mode, on the basis of the operation mode information 307. When the operation mode is the first operation mode (YES in step S61), a process of setting the contents of the sword swing parameters 310 on the basis of the input of the right stick 52 is performed. First, in step S62, the processor 81 sets the moving direction of the sword object 204 on the basis of the calculated input vector. Specifically, the processor 81 calculates a direction opposite to the direction of the input vector immediately before the release operation. Then, the processor 81 sets the contents of the sword swing parameters 310 such that the calculated direction indicates the moving direction of the sword object 204.

Next, in step S63, the processor 81 sets the moving speed of the sword object 204 on the basis of the input vector. Specifically, the processor 81 calculates a moving speed on the basis of the length of the input vector immediately before the release operation. Then, the processor 81 sets the contents of the sword swing parameters 310 such that the calculated moving speed indicates the moving speed of the sword object 204. That is, the greater the tilt of the right stick 52, the faster the moving speed of the sword object 204 becomes. It should be noted that, in another exemplary embodiment, the moving speed of the sword object 204 in the swinging action in the first operation mode may be a preset fixed speed, irrespective of the length of the input vector. Then, the swing content setting process ends.
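The setting of the moving direction and moving speed in steps S62 and S63 can be sketched, for example, as follows. The function name and the optional fixed-speed variant mirror the description above, while the direct scaling of speed to vector length is an illustrative assumption.

```python
import math


def set_swing_parameters(input_vector, fixed_speed=None):
    """Derive sword swing parameters from the stick input recorded
    immediately before the release operation.

    The moving direction is opposite to the input direction (step S62),
    and the moving speed grows with the stick tilt, i.e., the vector
    length (step S63), unless a preset fixed speed is supplied (the
    variant mentioned for another exemplary embodiment).
    """
    x, y = input_vector
    length = math.hypot(x, y)
    if length == 0:
        raise ValueError("no direction input recorded before the release")
    # Normalized direction opposite to the input vector.
    direction = (-x / length, -y / length)
    speed = fixed_speed if fixed_speed is not None else length
    return {"direction": direction, "speed": speed}
```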

Meanwhile, as a result of the determination in step S61, when the operation mode is the second operation mode (NO in step S61), a process of setting the contents of the sword swing parameters 310 on the basis of the inertial sensor data 306 is performed. First, in step S64, the processor 81 calculates the direction in which the right controller 4 has been swung, on the basis of acceleration data included in the inertial sensor data 306. Then, the processor 81 sets the contents of the sword swing parameters 310 such that the calculated direction indicates the moving direction of the sword object 204.

Next, in step S65, the processor 81 calculates the speed at which the right controller 4 has been swung, on the basis of acceleration data included in the inertial sensor data 306. Then, the processor 81 sets the contents of the sword swing parameters 310 such that the calculated speed indicates the moving speed of the sword object 204. Then, the swing content setting process ends.

With reference back to FIG. 24, upon the ending of the swing content setting process, the posturing action process ends.

Next, a case where, as a result of the determination in step S32, the operation mode is the second operation mode (NO in step S32) is described. In this case, in step S40 in FIG. 25, the processor 81 determines whether or not an operation for ending the posturing state in the second operation mode has been performed. In the exemplary embodiment, this ending operation is pressing of a predetermined button, as in the case of the first operation mode.

As a result of the determination above, when the operation for ending the posturing state has been performed (YES in step S40), the processor 81 sets, in step S41, the non-sword-drawn state in the PO state information 308. Then, the posturing action process ends.

Meanwhile, as a result of the determination in step S40, when the operation for ending the posturing state has not been performed (NO in step S40), the processor 81 next determines, in step S42, whether or not an acceleration of not less than a predetermined value has occurred, on the basis of the inertial sensor data 306. This predetermined value is a magnitude of acceleration sufficient to allow determination that the right controller 4 has been swung. That is, whether or not the right controller 4 has been swung at not less than a predetermined speed is determined. As a result of this determination, when the value of the acceleration is less than the predetermined value (NO in step S42), the processor 81 calculates, in step S43, an orientation of the right controller 4 on the basis of acceleration data and angular velocity data included in the inertial sensor data 306.

Next, in step S44, the processor 81 calculates an orientation and a position of the sword object 204 so as to correspond to the calculated orientation of the right controller 4. Accordingly, the posturing action using the inertial sensors is realized. Then, the posturing action process ends.

Meanwhile, as a result of the determination in step S42, when the value of the acceleration is not less than the predetermined value (YES in step S42), the transition condition from the posturing state to the swinging state in the second operation mode is satisfied. In this case, in step S45, the processor 81 sets the swinging state in the PO state information 308. Then, in step S46, the processor 81 executes a swing content setting process. This process is the same as the process of step S39 described above, and thus detailed description thereof is omitted. However, in the case of the second operation mode, the determination in step S61 results in NO, and the processes thereafter are performed accordingly. Then, the posturing action process ends. Upon the ending of the posturing action process, the processing is advanced to step S6.
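The transition condition of step S42 can be sketched, for example, as follows. The acceleration threshold value is an illustrative assumption; the embodiment only states that it is a magnitude sufficient to determine that the controller has been swung.

```python
import math

# Illustrative threshold: acceleration magnitude (e.g., in m/s^2) large
# enough to regard the controller as having been swung.
SWING_ACCELERATION_THRESHOLD = 15.0


def exceeds_swing_acceleration(acceleration_xyz):
    """Step S42 of the second operation mode: returns True when the
    controller acceleration magnitude is not less than the predetermined
    value, i.e., the posturing state should transition to the swinging
    state."""
    ax, ay, az = acceleration_xyz
    return math.sqrt(ax * ax + ay * ay + az * az) >= SWING_ACCELERATION_THRESHOLD
```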

[Swinging Action Process]

Next, details of the swinging action process in step S10 are described. FIG. 27 is a flowchart showing details of the swinging action process. In FIG. 27, first, in step S81, the processor 81 acquires the operation data 302.

Next, in step S82, on the basis of the contents of the sword swing parameters 310, the processor 81 moves the sword object 204 and, in addition, changes the orientation of the sword object 204.

Next, in step S83, the processor 81 performs a hitting determination process. That is, whether or not the sword object 204 has contacted a predetermined object is determined. The determination result is stored as appropriate in the DRAM 85, and various game processes based on the hitting determination result are executed in the other game processes in step S6.

Next, in step S84, the processor 81 determines whether or not the swinging action has ended. For example, whether or not the swinging action (display of the sword swinging motion) has ended is determined on the basis of whether or not a time corresponding to a predetermined number of frames has elapsed since the start of the swinging action. As a result of this determination, when the swinging action has not ended (NO in step S84), a swing content setting process is executed in step S85. This process is the same as the above-described process in step S39, and thus detailed description thereof is omitted. Then, the swinging action process ends.

Meanwhile, as a result of the determination in step S84, when the swinging action has ended (YES in step S84), the processor 81 sets, in step S86, data indicating a posturing state in the PO state information 308. Accordingly, transition from the swinging state to the posturing state is performed. Then, the swinging action process ends. Upon the ending of the swinging action process, the processing is advanced to step S6.
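The end-of-swing handling of steps S84 to S86 can be sketched, for example, as follows. The swing duration and the state names are illustrative assumptions.

```python
class SwingState:
    """Sketch of steps S84-S86: a swinging action lasts a fixed number of
    frames, after which the player object returns to the posturing state."""

    SWING_DURATION_FRAMES = 20  # illustrative "predetermined number of frames"

    def __init__(self):
        self.state = "posturing"
        self.frames_in_swing = 0

    def start_swing(self):
        """Transition to the swinging state (e.g., on a release operation)."""
        self.state = "swinging"
        self.frames_in_swing = 0

    def tick(self):
        """Called once per rendered frame; returns the current state."""
        if self.state != "swinging":
            return self.state
        self.frames_in_swing += 1
        if self.frames_in_swing >= self.SWING_DURATION_FRAMES:
            self.state = "posturing"  # step S86: back to the posturing state
        return self.state
```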

This is the end of the detailed description of the game processing according to the exemplary embodiment.

As described above, in the exemplary embodiment, a series of actions comprising “posturing” and “swinging” of the sword object 204 can be performed through a simple operation of direction input of the analog stick. In addition, because the operation method executes “swinging” by canceling the direction input of the analog stick, the timing and direction of the swing can be easily determined. Further, the series of the above-described actions can also be performed on the basis of the motion and orientation of the controller itself, using the inertial sensors. In this case, whether the analog stick operation or the operation based on the inertial sensors is used, a common action comprising “posturing” and “swinging” of the sword object 204 can be realized. Accordingly, the player can be provided with a variety of operation methods for the same action, whereby the convenience for the player can be enhanced.

[Modification]

In the example in the above exemplary embodiment, at the time of the release operation, a direction opposite to the input direction of the right stick is set as the direction in which the sword object is swung. In another exemplary embodiment, the sword object 204 may be swung in a predetermined direction, not limited to such an opposite direction. For example, a specific direction may be set in advance as the direction in which the sword is swung, and when the above-described release operation has been performed, the sword may always be swung in this specific direction, irrespective of the input direction at that time.

In the example in the above exemplary embodiment, with respect to the posturing action, when a release operation is performed while an input exceeding the swing occurrence threshold 131 shown in FIG. 19 has occurred, a swinging action is performed. Further, in this example, the posturing action itself can be performed even when there is a direction input not exceeding the swing occurrence threshold 131. In another exemplary embodiment, the posturing action may be performed only when a direction input exceeding the swing occurrence threshold 131 has been provided. In this case, even when a direction input has been provided, if the direction input does not exceed the swing occurrence threshold 131 (e.g., an input in a range of not less than the release determination threshold 132 and less than the swing occurrence threshold 131), the posturing action as described above may be prevented from being performed. In this case, a state where “sword drawing has been performed but posturing is not performed” is added between the “non-sword-drawn state” and the “posturing state”. Further, in this state, although the player object 201 is not in the posture of holding the sword, the position and orientation of the sword may be changed in accordance with the input content. Conversely, in this state, without changing the position and orientation of the sword in accordance with the input content, the sword object 204 may be fixed to a position and an orientation defined as a default for sword drawing, for example.

Further, in the example in the above exemplary embodiment, with respect to the release operation, when the input state exceeding the swing occurrence threshold 131 has changed, within a predetermined time, to an input state not exceeding the release determination threshold 132, the swinging action is performed. The method for determining the release operation is not limited thereto, and cancelation of the direction input may be determined by another method. Then, when cancelation of the direction input has occurred, the swinging action may be performed.

In the above exemplary embodiment, while the swinging action is performed in the first operation mode, an operation regarding the sword object 204 is not substantially received until the swinging action ends. Not limited to this, in another exemplary embodiment, the next operation may be received during the swinging action. For example, assume a case in which the swinging action takes a time corresponding to a predetermined number of frames from its start to its end. In this case, a configuration may be adopted in which an operation regarding the sword object 204 is not received during an initial portion of the swinging action but is received thereafter. This makes it possible to provide the player with an operation in which the motion of swinging the sword by the player object 201 is canceled during the motion, and the next posturing action and swinging action are caused to be performed. Therefore, a greater variety of play can be provided to the player, and the entertainment characteristics of the game can be enhanced.

In the example in the above exemplary embodiment, the analog stick is used in the first operation mode. However, the input device is not limited to the analog stick. The above processes can also be applied when using an analog-input-type direction input device that returns to a predetermined neutral position when there is no input, such as a slide pad.

In the example in the above exemplary embodiment, a case where the sword object 204 is swung is described. However, the above processes can also be applied when a weapon object of another type is used, as long as the weapon is one that is swung while held in a hand. For example, the weapon may be a club, an ax, or the like.

Further, the processes in the first operation mode can be applied not only when using an analog-type input device but also when using a digital-type direction input button capable of providing up, down, left, and right direction inputs, for example. In this case, as the release operation, instead of the various thresholds described above, the timing at which the direction input is switched from ON to OFF may be used. Then, the swinging action may be performed such that the sword is swung in a direction opposite to the inputted direction, or in a predetermined direction other than the opposite direction.
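For a digital-type direction input button, the ON-to-OFF release determination described above can be sketched, for example, as follows. The direction encoding is an illustrative assumption.

```python
def dpad_release_swing(prev_direction, current_direction):
    """Release determination for a digital direction pad: a release occurs
    on the ON-to-OFF transition, i.e., a direction was input on the
    previous frame and no direction is input now. Returns the swing
    direction (opposite to the last input) or None when no release occurs.
    Directions are encoded as 'up', 'down', 'left', 'right', or None."""
    opposite = {"up": "down", "down": "up", "left": "right", "right": "left"}
    if prev_direction is not None and current_direction is None:
        return opposite[prev_direction]
    return None
```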

In the above exemplary embodiment, a case in which a series of processes regarding the game processing are executed in a single apparatus has been described. However, in another exemplary embodiment, the series of processes may be executed in an information processing system including a plurality of information processing apparatuses. For example, in an information processing system including a terminal-side apparatus and a server-side apparatus communicable with the terminal-side apparatus via a network, some of the series of processes above may be executed by the server-side apparatus. Alternatively, major processes among the series of processes above may be executed by the server-side apparatus, and some of the processes may be executed in the terminal-side apparatus. Further, in the above information processing system, the server-side system may be implemented by a plurality of information processing apparatuses, and processes that should be executed on the server side may be shared and executed by the plurality of information processing apparatuses. Further, a so-called cloud gaming configuration may be adopted. For example, a configuration may be adopted in which the main body apparatus 2 sends operation data indicating operations performed by the player to a predetermined server, various game processes are executed in the server, and the execution result is streamed as a moving image and sound to the main body apparatus 2.

While the exemplary embodiment has been described in detail, the foregoing description is in all aspects illustrative and not restrictive. It is to be understood that various modifications and variations can be made without departing from the scope of the exemplary embodiment.

Claims

  1. A computer-readable non-transitory storage medium having stored therein instructions that, when executed by a processor of an information processing system, cause the processor to: control, in a virtual space, a player object having an item object; cause, based on a first direction input performed on an operation device, the player object to perform a posturing action of holding the item object in an orientation corresponding to an input direction according to the first direction input; and cause, in accordance with cancelation of the first direction input, the player object to perform a swinging action of swinging the item object, wherein the swinging action causes the player object to swing the item object from a direction in which the player object holds the item object in the posturing action toward an opposite direction.
  2. The storage medium according to claim 1, wherein the instructions further cause the processor to perform game processing including a success determination process based on the direction in which the item object is swung in the swinging action.
  3. The storage medium according to claim 1, wherein the operation device includes a first stick for providing a first direction input, and outputs direction input data corresponding to a tilt of the first stick, and the instructions further cause the processor to: cause the player object to perform the posturing action in accordance with a tilting direction of the first stick while a tilting degree of the first stick exceeds a first reference; and cause, when a state where the tilting degree exceeds the first reference has transitioned to a state where the tilting degree does not exceed a second reference, the player object to perform the swinging action, assuming that the cancelation of the first direction input has been performed.
  4. The storage medium according to claim 3, wherein the second reference is a tilting degree smaller than the first reference, and when the state where the tilting degree exceeds the first reference has changed to the state where the tilting degree does not exceed the second reference within a predetermined time, the swinging action is caused to be performed, assuming that the cancelation of the first direction input has been performed.
  5. The storage medium according to claim 4, wherein when the tilting degree is between the first reference and the second reference, the instructions further cause the processor to cause the player object to perform an action of moving the item object, on the basis of the tilting degree and the tilting direction.
  6. The storage medium according to claim 1, wherein the operation device further includes a second stick for providing a second direction input, and outputs direction input data corresponding to a tilt of the second stick, and the instructions further cause the processor to move the player object in the virtual space on the basis of the second direction input.
  7. The storage medium according to claim 1, wherein the operation device further includes an inertial sensor, and the instructions further cause the processor to: in a first operation mode, cause the player object to perform the posturing action and the swinging action on the basis of the first direction input; and in a second operation mode, cause the player object to perform the posturing action on the basis of an orientation of the operation device, and cause the player object to perform the swinging action on the basis of a swing input performed on the operation device.
  8. The storage medium according to claim 1, wherein the item object is a weapon object for attacking an enemy object by the swinging action.
  9. The storage medium according to claim 1, wherein the instructions further cause the processor to, while an input on the operation device other than the first direction input is performed together with the first direction input, control a virtual camera on the basis of the first direction input, without causing the player object to perform the posturing action and the swinging action.
  10. A game apparatus comprising at least one processor, the processor being configured to: control, in a virtual space, a player object having an item object; cause, based on data indicating a first direction input which has been performed on, and outputted from, an operation device capable of providing a direction input, the player object to perform a posturing action of holding the item object in an orientation corresponding to an input direction according to the first direction input; and cause, in accordance with data indicating cancelation of the first direction input outputted from the operation device, the player object to perform a swinging action of swinging the item object, wherein the swinging action causes the player object to swing the item object from a direction in which the player object holds the item object in the posturing action toward an opposite direction.
  11. The game apparatus according to claim 10, wherein the operation device includes a first stick for providing a first direction input, and the processor acquires direction input data corresponding to a tilt of the first stick and outputted from the operation device, causes the player object to perform the posturing action in accordance with a tilting direction of the first stick while a tilting degree of the first stick exceeds a first reference, and causes, when a state where the tilting degree exceeds the first reference has transitioned to a state where the tilting degree does not exceed a second reference, the player object to perform the swinging action, assuming that the cancelation of the first direction input has been performed.
  12. The game apparatus according to claim 10, wherein the operation device further includes a second stick for providing a second direction input, and the processor acquires direction input data corresponding to a tilt of the second stick and outputted from the operation device, and moves the player object in the virtual space on the basis of the second direction input.
  13. A game system comprising at least one processor, the processor being configured to: control, in a virtual space, a player object having an item object; cause, based on a first direction input performed on an operation device capable of providing a direction input, the player object to perform a posturing action of holding the item object in an orientation corresponding to an input direction according to the first direction input; and cause, in accordance with cancelation of the first direction input, the player object to perform a swinging action of swinging the item object, wherein the swinging action causes the player object to swing the item object from a direction in which the player object holds the item object in the posturing action toward an opposite direction.
  14. The game system according to claim 13, wherein the processor further performs game processing including a success determination process based on the direction in which the item object is swung in the swinging action.
  15. The game system according to claim 13, wherein the operation device includes a first stick for providing a first direction input, and outputs direction input data corresponding to a tilt of the first stick, and the processor further causes the player object to perform the posturing action in accordance with a tilting direction of the first stick while a tilting degree of the first stick exceeds a first reference, and causes, when a state where the tilting degree exceeds the first reference has transitioned to a state where the tilting degree does not exceed a second reference, the player object to perform the swinging action, assuming that the cancelation of the first direction input has been performed.
  16. The game system according to claim 15, wherein the second reference is a tilting degree smaller than the first reference, and when the state where the tilting degree exceeds the first reference has changed to the state where the tilting degree does not exceed the second reference within a predetermined time, the processor causes the player object to perform the swinging action, assuming that the cancelation of the first direction input has been performed.
  17. The game system according to claim 16, wherein when the tilting degree is between the first reference and the second reference, the processor further causes the player object to perform an action of moving the item object, on the basis of the tilting degree and the tilting direction.
  18. The game system according to claim 13, wherein the operation device further includes a second stick for providing a second direction input, and outputs direction input data corresponding to a tilt of the second stick, and the processor further moves the player object in the virtual space on the basis of the second direction input.
  19. The game system according to claim 13, wherein the operation device further includes an inertial sensor, and the processor further, in a first operation mode, causes the player object to perform the posturing action and the swinging action on the basis of the first direction input, and, in a second operation mode, causes the player object to perform the posturing action on the basis of an orientation of the operation device, and causes the player object to perform the swinging action on the basis of a swing input performed on the operation device.
  20. The game system according to claim 19, wherein the game system includes the operation device and a main body apparatus which the operation device is attachable to and detachable from, when the operation device is in a state of being attached to the main body apparatus, only the first operation mode is usable, and when the operation device is in a state of not being attached to the main body apparatus, the first operation mode and the second operation mode are switchable with each other.
  21. The game system according to claim 13, wherein the item object is a weapon object for attacking an enemy object by the swinging action.
  22. The game system according to claim 13, wherein while an input on the operation device other than the first direction input is performed together with the first direction input, the processor further controls a virtual camera on the basis of the first direction input, without causing the player object to perform the posturing action and the swinging action.
  23. A game processing method executed by a processor configured to control an information processing system, the game processing method causing the processor to: control, in a virtual space, a player object having an item object; cause, based on a first direction input performed on an operation device capable of providing a direction input, the player object to perform a posturing action of holding the item object in an orientation corresponding to an input direction according to the first direction input; and cause, in accordance with cancelation of the first direction input, the player object to perform a swinging action of swinging the item object, wherein the swinging action causes the player object to swing the item object from a direction in which the player object holds the item object in the posturing action toward an opposite direction.
  24. The game processing method according to claim 23, wherein the operation device includes a first stick for providing a first direction input, and outputs direction input data corresponding to a tilt of the first stick, and the processor further causes the player object to perform the posturing action in accordance with a tilting direction of the first stick while a tilting degree of the first stick exceeds a first reference, and causes, when a state where the tilting degree exceeds the first reference has transitioned to a state where the tilting degree does not exceed a second reference, the player object to perform the swinging action, assuming that the cancelation of the first direction input has been performed.
  25. The game processing method according to claim 23, wherein the operation device further includes a second stick for providing a second direction input, and outputs direction input data corresponding to a tilt of the second stick, and the processor further moves the player object in the virtual space on the basis of the second direction input.
