U.S. Pat. No. 11,890,543

STORAGE MEDIUM, INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, AND GAME PROCESSING METHOD

Assignee: Nintendo Co., Ltd.; HAL LABORATORY, INC.

Issue Date: July 6, 2022

Illustrative Figure

Abstract

An example of an information processing system controls an action by a player character object in a virtual space in response to an action instruction based on an operation input. When the player character object performs the action, the information processing system defines a hit detection area, used for determining whether the action has hit an object other than the player character object, at a position that is determined based on the position and the orientation of the player character object in the virtual space, and expands the hit detection area in the depth direction of a virtual camera. If the expanded hit detection area is in contact with the other object, the information processing system performs a process based on the action against the other object.

Description


DETAILED DESCRIPTION OF NON-LIMITING EXAMPLE EMBODIMENTS

[1. Configuration of Game System]

A game system according to an example of an exemplary embodiment is described below. An example of a game system 1 according to the exemplary embodiment includes a main body apparatus (an information processing apparatus; which functions as a game apparatus main body in the exemplary embodiment) 2, a left controller 3, and a right controller 4. Each of the left controller 3 and the right controller 4 is attachable to and detachable from the main body apparatus 2. That is, the game system 1 can be used as a unified apparatus obtained by attaching each of the left controller 3 and the right controller 4 to the main body apparatus 2. Further, in the game system 1, the main body apparatus 2, the left controller 3, and the right controller 4 can also be used as separate bodies (see FIG. 2). Hereinafter, first, the hardware configuration of the game system 1 according to the exemplary embodiment is described, and then, the control of the game system 1 according to the exemplary embodiment is described.

FIG. 1 is a diagram showing an example of the state where the left controller 3 and the right controller 4 are attached to the main body apparatus 2. As shown in FIG. 1, each of the left controller 3 and the right controller 4 is attached to and unified with the main body apparatus 2. The main body apparatus 2 is an apparatus for performing various processes (e.g., game processing) in the game system 1. The main body apparatus 2 includes a display 12. Each of the left controller 3 and the right controller 4 is an apparatus including operation sections with which a user provides inputs.

FIG. 2 is a diagram showing an example of the state where each of the left controller 3 and the right controller 4 is detached from the main body apparatus 2. As shown in FIGS. 1 and 2, the left controller 3 and the right controller 4 are attachable to and detachable from the main body apparatus 2. It should be noted that hereinafter, the left controller 3 and the right controller 4 will occasionally be referred to collectively as a “controller”.

FIG. 3 is six orthogonal views showing an example of the main body apparatus 2. As shown in FIG. 3, the main body apparatus 2 includes an approximately plate-shaped housing 11. In the exemplary embodiment, a main surface (in other words, a surface on a front side, i.e., a surface on which the display 12 is provided) of the housing 11 has a generally rectangular shape.

It should be noted that the shape and the size of the housing 11 are optional. As an example, the housing 11 may be of a portable size. Further, the main body apparatus 2 alone or the unified apparatus obtained by attaching the left controller 3 and the right controller 4 to the main body apparatus 2 may function as a mobile apparatus. The main body apparatus 2 or the unified apparatus may function as a handheld apparatus or a portable apparatus.

As shown in FIG. 3, the main body apparatus 2 includes the display 12, which is provided on the main surface of the housing 11. The display 12 displays an image generated by the main body apparatus 2. In the exemplary embodiment, the display 12 is a liquid crystal display device (LCD). The display 12, however, may be a display device of any type.

Further, the main body apparatus 2 includes a touch panel 13 on a screen of the display 12. In the exemplary embodiment, the touch panel 13 is of a type that allows a multi-touch input (e.g., a capacitive type). The touch panel 13, however, may be of any type. For example, the touch panel 13 may be of a type that allows a single-touch input (e.g., a resistive type).

The main body apparatus 2 includes speakers (i.e., speakers 88 shown in FIG. 6) within the housing 11. As shown in FIG. 3, speaker holes 11a and 11b are formed on the main surface of the housing 11. Then, sounds output from the speakers 88 are output through the speaker holes 11a and 11b.

Further, the main body apparatus 2 includes a left terminal 17, which is a terminal for the main body apparatus 2 to perform wired communication with the left controller 3, and a right terminal 21, which is a terminal for the main body apparatus 2 to perform wired communication with the right controller 4.

As shown in FIG. 3, the main body apparatus 2 includes a slot 23. The slot 23 is provided on an upper side surface of the housing 11. The slot 23 is so shaped as to allow a predetermined type of storage medium to be attached to the slot 23. The predetermined type of storage medium is, for example, a dedicated storage medium (e.g., a dedicated memory card) for the game system 1 and an information processing apparatus of the same type as the game system 1. The predetermined type of storage medium is used to store, for example, data (e.g., saved data of an application or the like) used by the main body apparatus 2 and/or a program (e.g., a program for an application or the like) executed by the main body apparatus 2. Further, the main body apparatus 2 includes a power button 28.

The main body apparatus 2 includes a lower terminal 27. The lower terminal 27 is a terminal for the main body apparatus 2 to communicate with a cradle. In the exemplary embodiment, the lower terminal 27 is a USB connector (more specifically, a female connector). Further, when the unified apparatus or the main body apparatus 2 alone is mounted on the cradle, the game system 1 can display on a stationary monitor an image generated by and output from the main body apparatus 2. Further, in the exemplary embodiment, the cradle has the function of charging the unified apparatus or the main body apparatus 2 alone mounted on the cradle. Further, the cradle has the function of a hub device (specifically, a USB hub).

FIG. 4 is six orthogonal views showing an example of the left controller 3. As shown in FIG. 4, the left controller 3 includes a housing 31. In the exemplary embodiment, the housing 31 has a vertically long shape, i.e., is shaped to be long in an up-down direction (i.e., a y-axis direction shown in FIGS. 1 and 4). In the state where the left controller 3 is detached from the main body apparatus 2, the left controller 3 can also be held in the orientation in which the left controller 3 is vertically long. The housing 31 has such a shape and a size that when held in the orientation in which the housing 31 is vertically long, the housing 31 can be held with one hand, particularly the left hand. Further, the left controller 3 can also be held in the orientation in which the left controller 3 is horizontally long. When held in the orientation in which the left controller 3 is horizontally long, the left controller 3 may be held with both hands.

The left controller 3 includes an analog stick 32. As shown in FIG. 4, the analog stick 32 is provided on a main surface of the housing 31. The analog stick 32 can be used as a direction input section with which a direction can be input. The user tilts the analog stick 32 and thereby can input a direction corresponding to the direction of the tilt (and input a magnitude corresponding to the angle of the tilt). It should be noted that the left controller 3 may include a directional pad, a slide stick that allows a slide input, or the like as the direction input section, instead of the analog stick. Further, in the exemplary embodiment, it is possible to provide an input by pressing the analog stick 32.

The left controller 3 includes various operation buttons. The left controller 3 includes four operation buttons 33 to 36 (specifically, a right direction button 33, a down direction button 34, an up direction button 35, and a left direction button 36) on the main surface of the housing 31. Further, the left controller 3 includes a record button 37 and a “−” (minus) button 47. The left controller 3 includes a first L-button 38 and a ZL-button 39 in an upper left portion of a side surface of the housing 31. Further, the left controller 3 includes a second L-button 43 and a second R-button 44, on the side surface of the housing 31 on which the left controller 3 is attached to the main body apparatus 2. These operation buttons are used to give instructions depending on various programs (e.g., an OS program and an application program) executed by the main body apparatus 2.

Further, the left controller 3 includes a terminal 42 for the left controller 3 to perform wired communication with the main body apparatus 2.

FIG. 5 is six orthogonal views showing an example of the right controller 4. As shown in FIG. 5, the right controller 4 includes a housing 51. In the exemplary embodiment, the housing 51 has a vertically long shape, i.e., is shaped to be long in the up-down direction. In the state where the right controller 4 is detached from the main body apparatus 2, the right controller 4 can also be held in the orientation in which the right controller 4 is vertically long. The housing 51 has such a shape and a size that when held in the orientation in which the housing 51 is vertically long, the housing 51 can be held with one hand, particularly the right hand. Further, the right controller 4 can also be held in the orientation in which the right controller 4 is horizontally long. When held in the orientation in which the right controller 4 is horizontally long, the right controller 4 may be held with both hands.

Similarly to the left controller 3, the right controller 4 includes an analog stick 52 as a direction input section. In the exemplary embodiment, the analog stick 52 has the same configuration as that of the analog stick 32 of the left controller 3. Further, the right controller 4 may include a directional pad, a slide stick that allows a slide input, or the like, instead of the analog stick. Further, similarly to the left controller 3, the right controller 4 includes four operation buttons 53 to 56 (specifically, an A-button 53, a B-button 54, an X-button 55, and a Y-button 56) on a main surface of the housing 51. Further, the right controller 4 includes a “+” (plus) button 57 and a home button 58. Further, the right controller 4 includes a first R-button 60 and a ZR-button 61 in an upper right portion of a side surface of the housing 51. Further, similarly to the left controller 3, the right controller 4 includes a second L-button 65 and a second R-button 66.

Further, the right controller 4 includes a terminal 64 for the right controller 4 to perform wired communication with the main body apparatus 2.

FIG. 6 is a block diagram showing an example of the internal configuration of the main body apparatus 2. The main body apparatus 2 includes components 81 to 85, 87, 88, 91, 97, and 98 shown in FIG. 6 in addition to the components shown in FIG. 3. Some of the components 81 to 85, 87, 88, 91, 97, and 98 may be mounted as electronic components on an electronic circuit board and accommodated in the housing 11.

The main body apparatus 2 includes a processor 81. The processor 81 is an information processing section for executing various types of information processing to be executed by the main body apparatus 2. For example, the processor 81 may be composed only of a CPU (Central Processing Unit), or may be composed of a SoC (System-on-a-chip) having a plurality of functions such as a CPU function and a GPU (Graphics Processing Unit) function. The processor 81 executes an information processing program (e.g., a game program) stored in a storage section (specifically, an internal storage medium such as a flash memory 84, an external storage medium attached to the slot 23, or the like), thereby performing the various types of information processing.

The main body apparatus 2 includes a flash memory 84 and a DRAM (Dynamic Random Access Memory) 85 as examples of internal storage media built into the main body apparatus 2. The flash memory 84 and the DRAM 85 are connected to the processor 81. The flash memory 84 is a memory mainly used to store various data (or programs) to be saved in the main body apparatus 2. The DRAM 85 is a memory used to temporarily store various data used for information processing.

The main body apparatus 2 includes a slot interface (hereinafter abbreviated as “I/F”) 91. The slot I/F 91 is connected to the processor 81. The slot I/F 91 is connected to the slot 23, and in accordance with an instruction from the processor 81, reads and writes data from and to the predetermined type of storage medium (e.g., a dedicated memory card) attached to the slot 23.

The processor 81 appropriately reads and writes data from and to the flash memory 84, the DRAM 85, and each of the above storage media, thereby performing the above information processing.

The main body apparatus 2 includes a network communication section 82. The network communication section 82 is connected to the processor 81. The network communication section 82 communicates (specifically, through wireless communication) with an external apparatus via a network. In the exemplary embodiment, as a first communication form, the network communication section 82 connects to a wireless LAN and communicates with an external apparatus, using a method compliant with the Wi-Fi standard. Further, as a second communication form, the network communication section 82 wirelessly communicates with another main body apparatus 2 of the same type, using a predetermined communication method (e.g., communication based on a unique protocol or infrared light communication). It should be noted that the wireless communication in the above second communication form achieves the function of enabling so-called “local communication”, in which the main body apparatus 2 can wirelessly communicate with another main body apparatus 2 placed in a closed local network area, and the plurality of main body apparatuses 2 directly communicate with each other to transmit and receive data.

The main body apparatus 2 includes a controller communication section 83. The controller communication section 83 is connected to the processor 81. The controller communication section 83 wirelessly communicates with the left controller 3 and/or the right controller 4. The communication method between the main body apparatus 2 and the left controller 3 and the right controller 4 is optional. In the exemplary embodiment, the controller communication section 83 performs communication compliant with the Bluetooth (registered trademark) standard with the left controller 3 and with the right controller 4.

The processor 81 is connected to the left terminal 17, the right terminal 21, and the lower terminal 27. When performing wired communication with the left controller 3, the processor 81 transmits data to the left controller 3 via the left terminal 17 and also receives operation data from the left controller 3 via the left terminal 17. Further, when performing wired communication with the right controller 4, the processor 81 transmits data to the right controller 4 via the right terminal 21 and also receives operation data from the right controller 4 via the right terminal 21. Further, when communicating with the cradle, the processor 81 transmits data to the cradle via the lower terminal 27. As described above, in the exemplary embodiment, the main body apparatus 2 can perform both wired communication and wireless communication with each of the left controller 3 and the right controller 4. Further, when the unified apparatus obtained by attaching the left controller 3 and the right controller 4 to the main body apparatus 2 or the main body apparatus 2 alone is attached to the cradle, the main body apparatus 2 can output data (e.g., image data or sound data) to the stationary monitor or the like via the cradle.

Here, the main body apparatus 2 can communicate with a plurality of left controllers 3 simultaneously (in other words, in parallel). Further, the main body apparatus 2 can communicate with a plurality of right controllers 4 simultaneously (in other words, in parallel). Thus, a plurality of users can simultaneously provide inputs to the main body apparatus 2, each using a set of the left controller 3 and the right controller 4. As an example, a first user can provide an input to the main body apparatus 2 using a first set of the left controller 3 and the right controller 4, and simultaneously, a second user can provide an input to the main body apparatus 2 using a second set of the left controller 3 and the right controller 4.

Further, the display 12 is connected to the processor 81. The processor 81 displays a generated image (e.g., an image generated by executing the above information processing) and/or an externally acquired image on the display 12.

The main body apparatus 2 includes a codec circuit 87 and speakers (specifically, a left speaker and a right speaker) 88. The codec circuit 87 is connected to the speakers 88 and a sound input/output terminal 25 and also connected to the processor 81. The codec circuit 87 is a circuit for controlling the input and output of sound data to and from the speakers 88 and the sound input/output terminal 25.

The main body apparatus 2 includes a power control section 97 and a battery 98. The power control section 97 is connected to the battery 98 and the processor 81. Further, although not shown in FIG. 6, the power control section 97 is connected to components of the main body apparatus 2 (specifically, components that receive power supplied from the battery 98, the left terminal 17, and the right terminal 21). Based on a command from the processor 81, the power control section 97 controls the supply of power from the battery 98 to the above components.

Further, the battery 98 is connected to the lower terminal 27. When an external charging device (e.g., the cradle) is connected to the lower terminal 27, and power is supplied to the main body apparatus 2 via the lower terminal 27, the battery 98 is charged with the supplied power.

FIG. 7 is a block diagram showing examples of the internal configurations of the main body apparatus 2, the left controller 3, and the right controller 4. It should be noted that the details of the internal configuration of the main body apparatus 2 are shown in FIG. 6 and therefore are omitted in FIG. 7.

The left controller 3 includes a communication control section 101, which communicates with the main body apparatus 2. As shown in FIG. 7, the communication control section 101 is connected to components including the terminal 42. In the exemplary embodiment, the communication control section 101 can communicate with the main body apparatus 2 through both wired communication via the terminal 42 and wireless communication not via the terminal 42. The communication control section 101 controls the method for communication performed by the left controller 3 with the main body apparatus 2. That is, when the left controller 3 is attached to the main body apparatus 2, the communication control section 101 communicates with the main body apparatus 2 via the terminal 42. Further, when the left controller 3 is detached from the main body apparatus 2, the communication control section 101 wirelessly communicates with the main body apparatus 2 (specifically, the controller communication section 83). The wireless communication between the communication control section 101 and the controller communication section 83 is performed in accordance with the Bluetooth (registered trademark) standard, for example.

Further, the left controller 3 includes a memory 102 such as a flash memory. The communication control section 101 includes, for example, a microcomputer (or a microprocessor) and executes firmware stored in the memory 102, thereby performing various processes.

The left controller 3 includes buttons 103 (specifically, the buttons 33 to 39, 43, 44, and 47). Further, the left controller 3 includes the analog stick (“stick” in FIG. 7) 32. Each of the buttons 103 and the analog stick 32 outputs information regarding an operation performed on itself to the communication control section 101 repeatedly at appropriate timing.

The communication control section 101 acquires information regarding an input (specifically, information regarding an operation or the detection result of the sensor) from each of the input sections (specifically, the buttons 103 and the analog stick 32). The communication control section 101 transmits operation data including the acquired information (or information obtained by performing predetermined processing on the acquired information) to the main body apparatus 2. It should be noted that the operation data is transmitted repeatedly, once every predetermined time. It should be noted that the interval at which the information regarding an input is transmitted from each of the input sections to the main body apparatus 2 may or may not be the same.

The above operation data is transmitted to the main body apparatus 2, whereby the main body apparatus 2 can obtain inputs provided to the left controller 3. That is, the main body apparatus 2 can determine operations on the buttons 103 and the analog stick 32 based on the operation data.

The left controller 3 includes a power supply section 108. In the exemplary embodiment, the power supply section 108 includes a battery and a power control circuit. Although not shown in FIG. 7, the power control circuit is connected to the battery and also connected to components of the left controller 3 (specifically, components that receive power supplied from the battery).

As shown in FIG. 7, the right controller 4 includes a communication control section 111, which communicates with the main body apparatus 2. Further, the right controller 4 includes a memory 112, which is connected to the communication control section 111. The communication control section 111 is connected to components including the terminal 64. The communication control section 111 and the memory 112 have functions similar to those of the communication control section 101 and the memory 102, respectively, of the left controller 3. Thus, the communication control section 111 can communicate with the main body apparatus 2 through both wired communication via the terminal 64 and wireless communication not via the terminal 64 (specifically, communication compliant with the Bluetooth (registered trademark) standard). The communication control section 111 controls the method for communication performed by the right controller 4 with the main body apparatus 2.

The right controller 4 includes input sections similar to the input sections of the left controller 3. Specifically, the right controller 4 includes buttons 113 and the analog stick 52. These input sections have functions similar to those of the input sections of the left controller 3 and operate similarly to the input sections of the left controller 3.

The right controller 4 includes a power supply section 118. The power supply section 118 has a function similar to that of the power supply section 108 of the left controller 3 and operates similarly to the power supply section 108.

[2. Outline of Process Performed on Game System]

Next, referring to FIG. 8 to FIG. 16, the process to be executed on the game system 1 will be outlined. In the present embodiment, the game system 1 controls the action of a player character object (hereinafter referred to simply as the “player character”), which is an object to be controlled by the user (referred to also as the player) of the game system 1, in the virtual space (referred to also as the game space). In response to an action by the player character, the game system 1 performs hit detection for that action. Specifically, in the present embodiment, the action is an attack action against an enemy character object (hereinafter referred to simply as the “enemy character”), and the game system 1 determines whether the attack action has hit the enemy character. Note that in other embodiments, the action is not limited to an attack action but may be any action (see “[4. Functions/effects and variations of present embodiment]” to be described below).

The hit detection area is used for the determination described above. That is, the game system 1 defines the hit detection area in the game space for an attack action, and determines that the attack action has hit an enemy character if at least a part of the enemy character is included in the hit detection area. On the other hand, if the enemy character is not included in the hit detection area, the game system 1 determines that the attack action has not hit the enemy character.

In the present embodiment, the player character is able to perform a plurality of kinds of attack actions. While there is no limitation on the attack actions, the method of defining the hit detection area will now be described for three kinds of attack actions: an attack action of swinging a sword (hereinafter referred to as the “sword action”), an attack action of swinging a hammer (hereinafter referred to as the “hammer action”), and an attack action of throwing a cutter (hereinafter referred to as the “cutter action”).

FIG. 8 is a view showing an example of the player character performing the sword action and the hit detection area that is defined during the action. When an attack action is performed, the game system 1 defines hit detection areas 202 (four hit detection areas 202 in FIG. 8) based on the position and the orientation of a player character 201 performing the attack action. Specifically, when the player character 201 performs the sword action, the game system 1 defines hit detection areas 202 at positions including the path of a sword object 203 as shown in FIG. 8.

Note that although the details will be described below, in the present embodiment, the game system 1 expands the hit detection areas. Hereinafter, the original hit detection areas to be expanded will be referred to as the “reference areas”. In the example shown in FIG. 8, four spherical areas 202 are defined as reference areas. Note that there is no limitation on the shape and the number of reference areas (referred to also as hit detection areas) defined for a single attack action. For example, areas 209 may be defined for different parts of the sword, e.g., the base, the middle, and the tip, as shown in FIG. 9, so that the elongate shape of the sword object 203 is covered by a plurality of (three in the example shown in FIG. 9) areas 209. Note that in the example shown in FIG. 9, three areas 209 are defined as hit detection areas so as to cover the area conforming to the shape of the sword object 203, and four sets of hit detection areas, each set including the three areas 209 (i.e., a total of 12 hit detection areas), are defined as reference areas so as to cover the area conforming to the swing path of the sword object 203. Although the details will be described below, in the present embodiment, the game system 1 defines the number and the shape of reference areas in accordance with the type of the attack action so that the reference areas are arranged in a positional relationship (referred to also as the arrangement pattern) in accordance with the type of the attack action.
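Per-action arrangement patterns like the ones described above can be pictured as a table mapping an action type to a list of sphere offsets and radii in the character's local frame. The sketch below is purely illustrative: the patent discloses no source code, and the dictionary name, offsets, and radii are invented values (orientation is omitted for brevity):

```python
# Hypothetical arrangement patterns: for each attack action, a list of
# (offset, radius) pairs describing spherical reference areas in the
# character's local coordinate frame. The numeric values are invented.
ARRANGEMENT_PATTERNS = {
    "sword": [((0.5, 1.0, 0.4), 0.5), ((0.9, 1.0, 0.2), 0.5),
              ((1.1, 1.0, -0.2), 0.5), ((0.9, 1.0, -0.5), 0.5)],
    "hammer": [((0.0, 0.2, 1.0), 0.9)],
}

def reference_areas(action_type, character_pos):
    """Instantiate the action's arrangement pattern at the character's
    position (character orientation is ignored in this simplified sketch)."""
    return [
        (tuple(p + o for p, o in zip(character_pos, offset)), radius)
        for offset, radius in ARRANGEMENT_PATTERNS[action_type]
    ]

areas = reference_areas("sword", (10.0, 0.0, 0.0))  # four spheres for the sword action
```

Selecting the pattern by action type keeps the number, size, and relative placement of reference areas a data-driven property of each attack rather than hard-coded logic.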

As described above, since the original hit detection areas to be expanded are spherical, the game system 1 can manage these areas with two parameters, i.e., the center position and the radius, and it is possible to easily perform hit detection.
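The center-plus-radius representation reduces the hit test to a single distance comparison. The following sketch is hypothetical (the patent discloses no code); the function name and the assumption that the target's collision volume is also a sphere are illustrative, not taken from the patent:

```python
import math

def spheres_overlap(center_a, radius_a, center_b, radius_b):
    """Two spheres overlap when the distance between their centers
    does not exceed the sum of their radii."""
    dx, dy, dz = (a - b for a, b in zip(center_a, center_b))
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= radius_a + radius_b

# A reference area of the sword action vs. an enemy's collision sphere:
# the centers are 0.8 apart and the radii sum to 1.0, so the attack hits.
print(spheres_overlap((1.0, 0.5, 0.0), 0.6, (1.8, 0.5, 0.0), 0.4))  # True
```

An attack with several reference areas hits as soon as any one of its spheres overlaps the target, e.g. `any(spheres_overlap(c, r, enemy_c, enemy_r) for c, r in areas)`.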

In the present embodiment, for the sword action, the game system 1 defines the four reference areas 202 simultaneously (i.e., in one frame). Over a period in which the sword action is performed, the game system 1 continuously defines the four reference areas 202. Note that the four reference areas 202 do not move during the sword action.

Note that when a plurality of reference areas are defined simultaneously, the game system 1 may define the reference areas so that one or more of the reference areas overlap with the other reference area or reference areas (see FIG. 8). Then, it is possible to reduce the possibility that another object arranged on the path of the attack action (e.g., the path of the sword) is not included in the hit detection areas, which would result in an erroneous determination that the attack action has not hit the other object.
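One simple way to obtain such overlapping coverage is to space the sphere centers evenly along the attack path at intervals smaller than the sphere diameter. The helper below is a hypothetical illustration under that assumption, not the patent's implementation (it also assumes a straight path and at least two areas):

```python
def place_reference_areas(path_start, path_end, count):
    """Place `count` sphere centers evenly along a straight attack path,
    both endpoints included (count must be at least 2). If the resulting
    spacing is smaller than the sphere diameter, consecutive spheres
    overlap and leave no gap along the path."""
    return [
        tuple(s + (i / (count - 1)) * (e - s) for s, e in zip(path_start, path_end))
        for i in range(count)
    ]

# Four reference areas of radius 0.5 along a 1.8-unit sword path:
# the spacing is 0.6, below the diameter of 1.0, so neighbors overlap.
centers = place_reference_areas((0.0, 1.0, 0.0), (1.8, 1.0, 0.0), count=4)
```

With this spacing rule, an object anywhere on the path is guaranteed to intersect at least one sphere, avoiding the erroneous "miss" described above.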

FIG. 10 is a view showing an example of the player character shown in FIG. 8 and the hit detection area as the game space is viewed from above. In the example shown in FIG. 10, a virtual camera 204 for generating the game image is arranged sideward of the player character 201. Note that FIG. 10 shows only one of the four defined reference areas 202 for the sake of simplicity of the drawing, and shows only some of the additional areas, to be described below, that are defined based on the reference area shown.

In the present embodiment, the game system 1 expands the hit detection area. Specifically, the game system 1 expands the hit detection area by setting additional areas 205 and 206 at positions that are shifted from the reference area 202 in the depth direction (the direction of the one-dot-chain line shown in FIG. 10) of the virtual camera 204. Note that the size and the shape of the additional areas 205 and 206 are the same as the size and the shape of the reference area 202. Such an expansion method can be said to be a method of expanding the hit detection area by moving the reference area 202 in the depth direction. The expanded hit detection area is an area that is included in at least one of the reference area 202 and the additional areas 205 and 206.

As described above, by expanding the hit detection area in the depth direction of the virtual camera 204, it is more likely to be determined that the attack action has hit an enemy character even if the position at which the attack action is performed and the position of the enemy character are shifted from each other in the depth direction. Thus, even if it is difficult for the user to accurately grasp the position of the enemy character in the depth direction, it is made easier for the user to perform the attack action to hit the enemy character. Therefore, it is possible to improve the controllability of the attack action.

Note that in the present embodiment, the game system1expands the hit detection area by adding the additional areas205and206having the same shape as the reference area202at positions that are shifted in the depth direction of the virtual camera (in other words, expands the hit detection area by moving the reference area202in the depth direction of the virtual camera). Then, since the game system1can easily define the additional areas205and206by the process of changing the position of the reference area202, it is possible to easily perform the expansion process.

Note that although not shown inFIG.10, the game system1expands each of the four reference areas202by defining additional areas for each of the four reference areas202. Note however that in other embodiments, where a plurality of reference areas are defined, the game system1does not need to expand all the reference areas but may expand at least one or more of the reference areas. For example, consider a case where a plurality of reference areas are arranged in a ring shape, and another reference area is further arranged at the center of the reference areas arranged in a ring shape. In this case, whether or not the reference area arranged at the center is expanded, there will be no significant change to the expanded hit detection area as a whole. Therefore, for example, in this case, there is no need to expand the reference area arranged at the center.

As shown inFIG.10, in the present embodiment, the additional areas205and206are defined so that the centers of the additional areas are located on the straight line that passes through the position of the virtual camera204and the center of the reference area202. Then, as viewed from the position of the virtual camera204, there is substantially no change to the (apparent) extent of the hit detection area even if expanded. Therefore, the game system1can expand the hit detection area without causing the user to feel awkward.
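The expansion described in the preceding paragraphs can be sketched in Python. This is an illustrative sketch, not the patent's implementation: the function name, the tuple-based vectors, and the choice of the camera-to-center direction as the depth direction for each area are assumptions made for illustration.

```python
import math

def expand_reference_area(camera_pos, center, offset):
    """Place additional-area centers on the straight line through the
    camera position and the reference-area center, shifted toward the
    camera (near side) and away from it (far side) by `offset` units."""
    # Unit vector from the camera to the reference-area center
    # (the "depth direction" for this particular area).
    d = [c - p for c, p in zip(center, camera_pos)]
    length = math.sqrt(sum(v * v for v in d))
    d = [v / length for v in d]
    near = [c - offset * v for c, v in zip(center, d)]  # shifted toward camera
    far = [c + offset * v for c, v in zip(center, d)]   # shifted away
    return near, far
```

Because both additional centers lie on the line through the camera and the reference-area center, the apparent extent of the hit detection area as seen from the camera is essentially unchanged, which matches the effect described above.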

In the present embodiment, the game system1defines the additional area205on the near side and the additional area206on the far side, relative to the reference area202, in the depth direction of the virtual camera204. Thus, the game system1expands the hit detection area by adding an additional area both on the near side and on the far side in the depth direction of the virtual camera204. Then, whether the position at which the attack action is performed is shifted on the near side or on the far side relative to the position of the enemy character, it is more likely to be determined that the attack action has hit the enemy character. Therefore, in either case, it is easier for the user to perform the attack action to hit the enemy character, and it is possible to perform the attack action to more reliably hit the enemy character.

Note that in other embodiments, the game system1may expand the hit detection area only on the near side or on the far side in the depth direction of the virtual camera relative to the reference area. For example, in a game in which the player character primarily proceeds toward the far side in the depth direction of the virtual camera, the enemy character is often present on the far side relative to the player character201. Therefore, in such a game, the game system1may define the additional area only on the far side in the depth direction relative to the reference area. Then, the game system1can reduce the number of additional areas and reduce the process load of the hit detection.

In the present embodiment, the additional areas205and206are defined at positions that are shifted, by a predetermined distance, from the reference area202(more specifically, the center of the reference area202) in the depth direction of the virtual camera204. The predetermined distance may be determined based on the size of the player character201. For example, the predetermined distance is determined to a half of the size of the player character201(e.g., the width of the player character201). Note that while the predetermined distance is determined to be an equal length on the near side and on the far side in the depth direction of the virtual camera204in the present embodiment, the predetermined distance may be determined to be different lengths on the near side and on the far side in other embodiments.

In the present embodiment, each additional area is defined so as to partially overlap with the reference area corresponding to the additional area (seeFIG.10). Then, there will be no gap between the reference area and an additional area defined based on the reference area, and it is possible to reduce the possibility that when an enemy character is present in this gap, the attack action is erroneously determined to have not hit the enemy character.

FIG.11, as isFIG.10, is a view showing another example of the player character shown inFIG.8and the hit detection area as the game space is viewed from above. As opposed toFIG.10,FIG.11shows an example where the virtual camera204is arranged rearward of the player character201.

Also in the case shown inFIG.11, as in the case shown inFIG.10, the game system1defines the additional areas205and206at positions that are shifted by the predetermined distance in the depth direction of the virtual camera204relative to the reference area202. Therefore, also in the case shown inFIG.11, as in the case shown inFIG.10, the hit detection area is expanded, thereby realizing similar advantageous effects to those of the case shown inFIG.10. Thus, in the present embodiment, the hit detection area is expanded in the depth direction of the virtual camera204irrespective of the position and the orientation of the player character201relative to the virtual camera204. This realizes an advantageous effect that it is easier to perform the attack action to hit the enemy character, irrespective of the position and the orientation of the player character201.

FIG.12is a view showing an example of the player character performing a hammer action and the hit detection area that is defined during the action. The state (a) shown inFIG.12is a state at a point in time when the player character201starts swinging the hammer, and the state (b) shown inFIG.12is a state at a point in time when some time has elapsed since the state (a).

Also with the hammer action, as with the sword action, the game system1defines the hit detection area based on the position and the orientation of the player character201performing the attack action. Here, when the hammer action is performed, the game system1defines a reference area211in the vicinity of the position of the head of the hammer object210as shown inFIG.12. Note that with the hammer action, as opposed to the sword action, the reference area211is defined so as to move in accordance with the movement of the hammer object210during the attack action. That is, the reference area211moves during the hammer action (seeFIG.12).

In the present embodiment, with the hammer action, there is one reference area that is defined simultaneously (specifically, defined in one frame). Note however that also with the hammer action, as with the sword action, a plurality of reference areas may be defined simultaneously.

In the present embodiment, the reference area211during the hammer action has a capsule shape (in other words, a columnar shape), as opposed to the sword action. The game system1defines the so-shaped reference area211based on a first unit area212and a second unit area213, which are spherical. Specifically, the game system1defines, as the reference area211, the first unit area212, the second unit area213, and a connecting area214that connects together the two unit areas212and213(seeFIG.12). Note that in the present embodiment, the connecting area214is the area swept by the first unit area212if the first unit area212were to move straight to the second unit area213. As described above, in the present embodiment, the game system1can easily define a non-spherical reference area211based on the spherical unit areas212and213.
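A capsule built from two spherical unit areas plus the swept connecting volume admits a simple containment test: a point is inside the capsule exactly when its distance to the segment between the two sphere centers is at most the radius. The following is a hedged Python sketch (the function name and the point representation are illustrative assumptions, not the patent's code):

```python
import math

def point_in_capsule(p, a, b, radius):
    """True when point `p` lies inside the capsule whose unit spheres are
    centered at `a` and `b` and whose radius is `radius`."""
    ab = [bb - aa for aa, bb in zip(a, b)]
    ap = [pp - aa for aa, pp in zip(a, p)]
    ab2 = sum(v * v for v in ab)
    # Parameter of the closest point on segment a-b, clamped to [0, 1]
    # so that the hemispherical end caps are handled correctly.
    t = 0.0 if ab2 == 0 else max(0.0, min(1.0, sum(x * y for x, y in zip(ap, ab)) / ab2))
    closest = [aa + t * v for aa, v in zip(a, ab)]
    dist2 = sum((pp - cc) ** 2 for pp, cc in zip(p, closest))
    return dist2 <= radius * radius
```

The clamp of `t` is what makes the two spherical unit areas and the straight connecting area behave as a single volume.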

Thus, in the present embodiment, the size and the shape of the reference area may differ depending on the kind of the attack action, and may be determined appropriately depending on the attack action.

FIG.13is a view showing an example of the player character shown inFIG.12and the hit detection area as the game space is viewed from above. The state (a) shown inFIG.13is a state at the same point in time as the state (a) shown inFIG.12, and the state (b) shown inFIG.13is a state at the same point in time as the state (b) shown inFIG.12. In the example shown inFIG.13, the virtual camera204is arranged sideward of the player character201.

Also with the hammer action as with the sword action, the game system1expands the hit detection area in the depth direction of the virtual camera204. Specifically, as shown inFIG.13, the game system1expands the hit detection area by defining the additional areas216and217at positions that are shifted from the reference area211in the depth direction of the virtual camera204. While the detailed method of defining the additional areas216and217will be described below, the game system1defines the additional areas216and217so that the centers of the additional areas216and217are located on the straight line that connects together the position of the virtual camera204and the center of the reference area211(seeFIG.13). The general shape of each of the additional areas216and217is a capsule shape as is the reference area211.

With the hammer action, the reference area211moves during the attack action as described above. Therefore, during the hammer action, the game system1defines the additional areas216and217based on the reference area211at the current time. That is, the additional areas216and217are defined on the near side and the far side in the depth direction of the virtual camera204relative to the reference area211at the current time (seeFIG.13).

FIG.14is a view showing an example of a method of defining additional areas shown inFIG.13. In the present embodiment, the game system1defines the additional areas for the hammer action as follows. First, the game system1defines additional unit areas221,222,224and225at positions that are shifted in the depth direction of the virtual camera204from the unit areas212and213of the reference area211. Specifically, the first additional unit area221is defined on the near side in the depth direction relative to the first unit area212, and the second additional unit area222is defined on the near side in the depth direction relative to the second unit area213. The third additional unit area224is defined on the far side in the depth direction relative to the first unit area212, and the fourth additional unit area225is defined on the far side in the depth direction relative to the second unit area213. Next, the game system1defines, as the additional area216, two additional unit areas221and222defined on the near side of the reference area211and an additional connecting area223that connects together the additional unit areas221and222. The game system1defines, as the additional area217, two additional unit areas224and225defined on the far side of the reference area211and an additional connecting area226that connects together the additional unit areas224and225. As described above, the additional areas216and217are defined based on the spherical additional unit areas221,222,224and225. Note that since the additional areas216and217are defined as described above, the size and the shape of each of the additional areas216and217are slightly different from the size and the shape of the reference area211during the hammer action.
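The per-unit-area expansion just described can be sketched by shifting each unit sphere's center along its own line through the camera; as the text notes, the resulting near-side and far-side capsules then differ slightly in size and shape from the reference capsule. An illustrative Python sketch under those assumptions (names and vector handling are not from the patent):

```python
import math

def expand_capsule(camera_pos, a, b, offset):
    """Shift each unit sphere of a capsule along its own line through the
    camera. Returns the endpoint pairs of the near-side and far-side
    additional capsules."""
    def shift(center, sign):
        d = [c - p for c, p in zip(center, camera_pos)]
        n = math.sqrt(sum(v * v for v in d))
        return [c + sign * offset * v / n for c, v in zip(center, d)]
    near = (shift(a, -1), shift(b, -1))  # additional area toward the camera
    far = (shift(a, +1), shift(b, +1))   # additional area away from the camera
    return near, far
```

Because each endpoint moves along its own camera ray rather than along a single shared direction, the two endpoints of an additional capsule are displaced by slightly different vectors, which is why the additional areas are not exact copies of the reference area.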

As described above, with the hammer action, the hit detection area to be expanded (i.e., the reference area211) is an area that includes the first unit area212and the second unit area213, which are arranged in a predetermined positional relationship, and includes the connecting area214that connects together the first unit area212and the second unit area213by a predetermined rule (seeFIG.12). The game system1expands the hit detection area by defining the first additional unit area221(or the third additional unit area224) at a position that is shifted by a predetermined amount in the depth direction of the virtual camera204relative to the first unit area212, defining the second additional unit area222(or the fourth additional unit area225) at a position that is shifted by a predetermined amount in the depth direction of the virtual camera204relative to the second unit area213, and adding, to the hit detection area, the additional area216including the first additional unit area221, the second additional unit area222, and the additional connecting area223that connects together the first additional unit area221and the second additional unit area222by the predetermined rule (seeFIG.14). Thus, even if the shape of the reference area is not a simple shape (e.g., a spherical shape), it is possible to easily define additional areas by defining the additional areas based on unit areas on which the reference area is based.

Note that the “predetermined rule” in the present embodiment is a rule that “the first unit area and the second unit area are connected straight”. Here, in other embodiments, the predetermined rule may be any rule. For example, in other embodiments, the predetermined rule may be a rule that “the first unit area and the second unit area are linked together along the path of the head of the hammer object during the hammer action”. By defining the additional connecting areas and the connecting area using the same rule, it is possible to easily define additional areas having the same or similar shape as the reference area, irrespective of the predetermined rule.

Note that while the size and the shape of each of the additional areas216and217are different from the size and the shape of the reference area211depending on the position in the depth direction in the example of the hammer action described above, additional areas whose size and/or shape are equal to those of the reference area211may be defined in other embodiments. That is, in other embodiments, irrespective of the shape of the reference area, the additional areas may have the same size and the same shape as the reference area, or may have the same shape as the reference area with the size thereof being adjusted.

Note that the distance from the reference area to an additional area (more specifically, the distance from the center of the reference area to the center of the additional area, i.e., the predetermined distance) in the hammer action is equal to that in the sword action. In the present embodiment, the game system1sets the predetermined distance to the same value irrespective of the kind of the attack action. Here, if the predetermined distance differs between different kinds of attack actions, the positional relationship (i.e., the positional relationship between the player character201and the enemy character) for the determination that the attack action has hit the enemy character will differ between different kinds of attack actions, thereby possibly making the user feel awkward. In contrast, in the present embodiment, it is possible to reduce such a possibility by using the same predetermined distance for different kinds of attack actions.

On the other hand, in the present embodiment, since the shape and the size of the reference area differ between different attack actions, if the same predetermined distance is used for different kinds of attack actions, a gap may be produced between the reference area and the additional area, thereby erroneously determining that the attack action has not hit the enemy character, depending on the attack action (more specifically, depending on the size of the reference area that is set for each kind of the attack action). Therefore, in the present embodiment, the game system1sets the predetermined distance so that the reference area partially overlaps with a portion of the additional area that is defined based on the reference area for any of the different kinds of attack actions. Thus, it is possible to prevent the erroneous determination described above.
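The overlap constraint described here can be made concrete for spherical areas: two equal spheres overlap exactly when the distance between their centers is less than twice the radius, so a single shared predetermined distance must satisfy this for the smallest reference-area radius used by any attack action. The radii below are hypothetical examples, not values from the patent:

```python
def areas_overlap(offset, radius):
    """True when two spheres of equal `radius` whose centers are `offset`
    apart intersect with nonzero overlap (no gap between them)."""
    return offset < 2 * radius

# The shared offset must satisfy the constraint for the smallest radius
# used by any attack action (hypothetical example radii).
radii = {"sword": 0.8, "hammer": 0.6, "cutter": 0.5}
max_safe_offset = 2 * min(radii.values())  # the offset must stay below this
```

Any predetermined distance below `max_safe_offset` leaves no gap between a reference area and its additional areas for every action, preventing the erroneous "miss" determination described above.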

FIG.15is a view showing an example of the player character performing a cutter action and the hit detection area that is defined during the action. The state (a) shown inFIG.15is a state at a point in time after the player character201has thrown a cutter object231, and the state (b) shown inFIG.15is a state at a point in time when some time has elapsed since the state (a).

During the cutter action, the cutter object231flies, spinning, forward of the player character201as shown inFIG.15, and then returns to the position of the player character201. The game system1defines a plurality of (four inFIG.15) reference areas232to235at the position of the cutter object231. That is, also in the cutter action, as in the other attack actions, the game system1defines reference areas based on the position and the orientation of the player character201performing the attack action (more specifically, based on the position and the orientation of the cutter object231based on the position and the orientation of the player character201). Note that during the cutter action, the game system1defines the four reference areas232to235simultaneously (i.e., in one frame). In the present embodiment, each reference area in the cutter action is spherical.

As described above, also in the cutter action, as in the hammer action, the reference areas are defined so as to move during the attack action. Note that the reference areas232to235may or may not be defined to revolve around the center of the cutter object231as the cutter object231spins.

Note that in the example shown inFIG.15, while the number of reference areas defined for the cutter object231is four for the sake of simplicity of the drawing, more reference areas may be defined in practice so that the reference areas overlap with each other.

In the present embodiment, if a predetermined condition is satisfied (e.g., if the player character201has been charged with power for a certain period of time) during the cutter action, the cutter object231is enlarged (the state (b) shown inFIG.15). In this case, the game system1enlarges the reference areas232to235as the cutter object231is enlarged. Specifically, the reference areas232to235are enlarged by the same ratio as the ratio by which the cutter object231is enlarged. Thus, in the present embodiment, since the size of the reference areas also changes in response to the change in size of the object of the attack action (herein, the cutter object231), the extent of the hit detection area can be set to an appropriate extent in accordance with the change in the object.
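The proportional enlargement of the reference areas can be sketched as a uniform scale of each area's radius and of its center offset from the object's center. The list-of-pairs representation below is an illustrative assumption about how the areas might be stored, not the patent's data layout:

```python
def scale_reference_areas(areas, ratio):
    """Scale each spherical reference area by `ratio`. `areas` is a list
    of (center_offset, radius) pairs, with center_offset expressed
    relative to the center of the attack object (e.g., the cutter)."""
    return [([ratio * v for v in offset], ratio * r) for offset, r in areas]
```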

FIG.16is a view showing an example of the cutter shown inFIG.15and the hit detection area as the game space is viewed from above. The state (a) shown inFIG.16is a state at the same point in time as the state (a) shown inFIG.15, and the state (b) shown inFIG.16is a state at the same point in time as the state (b) shown inFIG.15.

Also in the cutter action, as in the sword action and the hammer action, the game system1expands the hit detection area in the depth direction of the virtual camera204. Specifically, as shown inFIG.16, the game system1defines additional areas241to248at positions that are shifted from the reference areas232to235, respectively, on the near side and on the far side in the depth direction of the virtual camera204. Specifically, the game system1defines the additional areas241to248so that the centers of the additional areas241to248are located on straight lines connecting between the position of the virtual camera204and the centers of the reference areas232to235(seeFIG.16). Note that the size and the shape of the additional areas241to248are equal to the size and the shape of the reference areas232to235.

When the reference areas232to235are enlarged as in the state (b) shown inFIG.16, the additional areas241to248that are enlarged accordingly are defined. That is, during the cutter action, the game system1determines the size of the additional areas241to248based on the size of the reference areas232to235at the current time. Then, the additional areas241to248can be defined in an appropriate size in accordance with the change in size of the cutter object231.

Note that the distance from the reference area to the additional area (more specifically, the distance from the center of the reference area to the center of the additional area, i.e., the predetermined distance) in the cutter action is equal to that in the sword action and the hammer action. Thus, as the predetermined distance is kept unchanged between different kinds of attack actions, it is possible to reduce the possibility that the positional relationship for the determination that the attack action has hit the enemy character will differ between different attack actions, thereby making the user feel awkward.

While three kinds of attack actions, i.e., the sword action, the hammer action and the cutter action, have been described above, the game system1expands the hit detection area also for other attack actions performed by the player character201. Note however that the game system1does not need to expand the hit detection area for all the attack actions performed by the player character201. For example, for an attack action in which a large number of reference areas are defined, resulting in a large hit detection area, it is possible to easily perform the attack action so as to hit the enemy character without expanding the hit detection area, and there is little possibility that the user feels awkward because of the attack action missing the enemy character. Therefore, for such an attack action, the game system1does not need to define additional areas (i.e., does not need to expand the hit detection area). Thus, with the game system1, it is possible to reduce the number of additional areas and to reduce the process load of the hit detection.

In the present embodiment, the enemy character may also perform an attack action (referred to as an “enemy attack action”), and the game system1determines whether the enemy attack action has hit the player character201. The game system1uses the hit detection area also for this determination. Note however that in the present embodiment, the hit detection area is not expanded for the enemy attack action. That is, when an enemy attack action, by which an enemy character attacks the player character201based on a control of the enemy character, is performed, the game system1defines an enemy hit detection area used for determining whether the enemy attack action has hit the player character based on the position and the orientation of the enemy character in the virtual space, without expanding the enemy hit detection area in the depth direction of the virtual camera204. Thus, it is possible to reduce the amount of computation for the process of expanding the hit detection area and the hit detection process, thereby reducing the process load of the game system1. Note that for the enemy attack action, it is believed that there is little possibility that the user feels awkward about the controllability when the attack action does not hit the player character201even if the hit detection area is not expanded (because the enemy character is not an object controlled by the user).

If the hit detection area defined for the attack action by the player character201is in contact with the enemy character (i.e., if at least a part of the enemy character is included in the hit detection area), the game system1determines that the attack action by the player character201has hit the enemy character. In this case, the game system1executes a process of giving damage to the enemy character. Here, the process of giving damage includes, for example, the process of (a) decreasing the value of a parameter representing hit points if such a parameter is set for the enemy character, or (b) knocking down the enemy character that has been hit by the attack action (e.g., the process of making the enemy character fall down and remain still on the ground, or eliminating the enemy character from the game space).
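The contact test and the damage process described here can be sketched as follows, modeling each area of the expanded hit detection area and the enemy's collider as spheres. The dictionary-based enemy record and the fixed damage amount are hypothetical stand-ins, not the patent's data format:

```python
def attack_hits(hit_areas, enemy_center, enemy_radius):
    """True when any area of the expanded hit detection area (reference
    plus additional areas, each a (center, radius) sphere here) is in
    contact with the enemy's spherical collider."""
    for center, radius in hit_areas:
        d2 = sum((c - e) ** 2 for c, e in zip(center, enemy_center))
        if d2 <= (radius + enemy_radius) ** 2:
            return True
    return False

def apply_damage(enemy):
    """Sketch of process (a): decrease a hit-point parameter; when it
    reaches zero, process (b) knocks the enemy down."""
    enemy["hp"] -= 1
    if enemy["hp"] <= 0:
        enemy["state"] = "down"
```

Two spheres are in contact when the squared distance between their centers is at most the square of the sum of their radii, which avoids a square root per test.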

Note that when the hit detection area defined for the attack action by the enemy character is in contact with the player character201, the game system1determines that the attack action by the enemy character has hit the player character201. In such a case, the game system1executes the process of giving damage to the player character201.

In the present embodiment, the hit detection area is not displayed. Note however that the game system1may display a special effect image of the attack action in at least a part of the hit detection area so that the user can recognize the extent of the hit detection area based on the special effect image.

[3. Specific Example of Process Performed on Game System]

Next, referring toFIG.17andFIG.18, a specific example of an information process performed on the game system1will be described.

FIG.17is a chart showing an example of various data to be used in an information process performed on the game system1. Various data shown inFIG.17are stored in a storage medium (e.g., the flash memory84, the DRAM85, and/or a memory card in the slot23) that can be accessed by the main body apparatus2.

As shown inFIG.17, the game system1stores a game program. The game program is a game program for executing a game process of the present embodiment (specifically, the process shown inFIG.18). The game system1stores the player character data, the enemy character data, the camera data and the hit detection area data.

The player character data represents various information regarding the player character201. Specifically, the player character data includes data representing the position and the orientation of the player character201in the game space. In addition to these data, the player character data may also include data representing a parameter representing the hit points of the player character201.

The enemy character data represents various information regarding the enemy character. Specifically, the enemy character data includes data representing the position and the orientation of the enemy character in the game space. In addition to these data, the enemy character data may also include data representing a parameter representing the hit points of the enemy character.

The camera data includes data representing the position and the orientation of the virtual camera204in the game space. In addition to these data, the camera data may include data representing the angle of view of the virtual camera, etc.

The hit detection area data represents the extent of the hit detection area. In the present embodiment, the hit detection area data includes reference area data and additional area data.

The reference area data represents the extent of the reference area described above. The reference area data may be any data with which the extent of the reference area can be identified. For example, if the reference area is spherical, the reference area data may be data representing the position of the center of the sphere and the radius thereof. If the reference area has a capsule shape as described above, the reference area data may be data representing the position of the center of each of the two spherical unit areas and the radius thereof. If a plurality of reference areas are defined, the reference area data represents the extent of each of the reference areas.

The additional area data represents the extent of the additional area described above. The additional area data may be any data with which the extent of the additional area can be identified. For example, the additional area data, as is the reference area data, may be data representing the position of the center of the spherical additional area or additional unit area and the radius thereof. If a plurality of additional areas are defined, the additional area data represents the extent of each of the additional areas.
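One possible in-memory layout for the reference area data and additional area data described above, sketched with Python dataclasses. The type names and the container layout are assumptions for illustration, not the patent's data format:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class SphereArea:
    """Spherical reference or additional area: center position and radius."""
    center: Vec3
    radius: float

@dataclass
class CapsuleArea:
    """Capsule-shaped area: centers of its two spherical unit areas and
    their common radius."""
    a: Vec3
    b: Vec3
    radius: float

@dataclass
class HitDetectionAreaData:
    """Container mirroring the reference/additional split in the text;
    each list may hold SphereArea or CapsuleArea entries."""
    reference: List[object] = field(default_factory=list)
    additional: List[object] = field(default_factory=list)
```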

FIG.18is a flow chart showing an example of a flow of a game process to be executed by the game system1. The game process shown inFIG.18is started, for example, in response to the player giving an instruction to start the game while the game program described above is being executed.

Note that in the present embodiment, it is assumed that the processor81of the main body apparatus2executes the processes of the steps shown inFIG.18by executing the game program stored in the game system1. Note however that in other embodiments, some of the processes of the steps may be executed by another processor (e.g., a dedicated circuit) different from the processor81. When the game system1can communicate with another information processing apparatus (e.g., a server), some of the processes of the steps shown inFIG.18may be executed on the other information processing apparatus (i.e., the game system1may include the other information processing apparatus). The processes of the steps shown inFIG.18are merely illustrative, and the order of steps to be performed may be switched around or other processes may be executed in addition to (or instead of) the processes of the steps, as long as similar results are obtained.

The processor81executes the processes of the steps shown inFIG.18using a memory (e.g., the DRAM85). That is, the processor81stores information (in other words, data) obtained in each process step in the memory, and when the information is used in a subsequent process step, the information is read out from the memory and used.

In step S1shown inFIG.18, the processor81obtains operation data representing an operation input by the user. That is, the processor81obtains, at appropriate points in time, operation data that is received from each controller via the controller communication section83and/or the terminals17and21, and stores the operation data in the memory. The process of step S2is executed, following step S1.

In step S2, the processor81controls the virtual camera204in the game space. There is no limitation on the specific method for controlling the virtual camera204. For example, the processor81controls the position and the orientation of the virtual camera204based on the operation data obtained in step S1. The processor81may control the virtual camera204based on the position of the player character201in addition to (or instead of) controlling the virtual camera204based on an operation input by the user. Note that in step S2, the processor81updates the camera data stored in the storage medium so as to represent the position and the orientation after the control. The process of step S3is executed, following step S2.

In step S3, the processor81controls the actions of objects (i.e., the player character201and enemy characters) in the game space. Specifically, the processor81determines a move instruction and an action instruction by the user based on the operation data obtained in step S1. The processor81controls the player character201to move in the game space based on a move instruction, and controls the player character201to perform an attack action in response to an action instruction. The processor81controls the enemy characters to move in the game space or perform attack actions based on an algorithm predetermined in the game program. The processor81updates the content of the player character data and the enemy character data stored in the storage medium so as to represent the state after being controlled. The process of step S4is executed, following step S3.

In step S4, the processor81determines whether an attack action by the player character201or an enemy character is being performed. This determination is performed based on the player character data and the enemy character data stored in the storage medium. If the determination result from step S4is affirmative, the process of step S5is executed. On the other hand, if the determination result from step S4is negative, the process of step S10is executed, skipping the process of steps S5to S9.

In step S5, the processor 81 defines the hit detection area for the attack action being currently performed. The hit detection area defined in step S5 is an un-expanded hit detection area (i.e., the reference area). Specifically, the processor 81 defines reference areas in the game space according to the method described in “[2. Outline of process performed on game system]” above. Note that in the present embodiment, the game program includes data in which each attack action is associated with an arrangement pattern of reference areas (e.g., the pattern in which four reference areas are arranged along the path of the sword for the sword action; see FIG. 8). Based on the arrangement pattern, the processor 81 defines reference areas in an arrangement in accordance with the kind of the attack action. The processor 81 updates the content of the reference area data stored in the storage medium so as to represent the defined reference areas. The process of step S6 is executed, following step S5.
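By way of illustration only, an arrangement pattern of this kind could be generated procedurally. The Python sketch below is an assumption for illustration, not the patent's actual implementation: the name `sword_reference_areas`, the `(center, radius)` sphere representation, and the parameters `reach` and `count` are all hypothetical. It places four reference spheres along the sword's path for a given swing angle:

```python
import math

def sword_reference_areas(origin, swing_angle, reach=2.0, radius=0.5, count=4):
    # Place `count` reference spheres evenly from the character origin
    # out to the sword tip, rotated to the current swing angle in the
    # horizontal (x-z) plane.
    areas = []
    for i in range(1, count + 1):
        dist = reach * i / count
        center = (origin[0] + dist * math.cos(swing_angle),
                  origin[1],
                  origin[2] + dist * math.sin(swing_angle))
        areas.append((center, radius))
    return areas
```

Calling such a function once per frame with an updated swing angle would yield the kind of per-frame arrangement that step S5 stores in the reference area data.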

In step S6, the processor 81 determines whether the attack action being currently performed is an attack action by the player character 201. This determination is performed based on the player character data and the enemy character data stored in the storage medium. If the determination result from step S6 is affirmative, the process of step S7 is executed. On the other hand, if the determination result from step S6 is negative (i.e., if an enemy character is performing an attack action), the process of step S8 is executed, skipping the process of step S7 (i.e., the process of expanding the hit detection area). That is, in the present embodiment, the hit detection area is expanded for an attack action by the player character 201, and the hit detection area is not expanded for an attack action by an enemy character.

In step S7, the processor 81 expands the hit detection area defined in step S5. That is, the processor 81 defines additional areas based on the reference areas defined in step S5. Specifically, the processor 81 defines additional areas in the game space in accordance with the method described in “[2. Outline of process performed on game system]” above. The processor 81 updates the content of the additional area data stored in the storage medium so as to represent the defined additional areas. The process of step S8 is executed, following step S7.
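The expansion of step S7 can be sketched as follows, assuming, purely as an illustration rather than the patent's actual implementation, that each area is a sphere stored as a `(center, radius)` tuple and that `camera_forward` is the unit vector of the virtual camera's depth direction:

```python
def expand_hit_area(reference_areas, camera_forward, shift):
    # Keep every reference sphere and add, for each one, a copy shifted
    # toward the camera and a copy shifted away from it along the
    # camera's depth (forward) axis.
    expanded = list(reference_areas)
    for (cx, cy, cz), r in reference_areas:
        for s in (-shift, shift):
            center = (cx + s * camera_forward[0],
                      cy + s * camera_forward[1],
                      cz + s * camera_forward[2])
            expanded.append((center, r))
    return expanded
```

The returned list (reference areas plus additional areas) corresponds to what the reference area data and the additional area data together represent after step S7.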

Here, while the game is played, the process loop of steps S1 to S11 shown in FIG. 18 is repeatedly executed at a rate of once per predetermined amount of time (specifically, one frame time). Therefore, the process of steps S5 and S7 for defining the hit detection area is started in response to an attack action by the player character 201 being started (i.e., in response to the determination result from step S4 becoming affirmative), and is repeatedly executed during the attack action (i.e., while the determination result from step S4 is affirmative). Therefore, during the attack action, the processor 81 updates the hit detection area (more specifically, the reference areas and the additional areas) in accordance with passage of time.

As described above, in the present embodiment, the game system 1 starts the control of an attack action by the player character 201 in response to an action instruction by the user (step S3), and defines the hit detection area (more specifically, reference areas and additional areas) based on the position and the orientation of the player character 201 in the game space (steps S5 and S7). For a predetermined period of time after the start of an attack action (i.e., a period of time in which the determination result from step S4 is affirmative), the game system 1 continues to control the attack action by the player character 201 (step S3) and updates the hit detection area in accordance with passage of time (steps S5 and S7). The process of updating the hit detection area is performed by updating the un-expanded hit detection area (i.e., the reference areas) in accordance with passage of time based on the pattern associated with the attack action (step S5), and expanding the updated hit detection area in the depth direction of the virtual camera 204 (step S7). Thus, in the present embodiment, the hit detection area is defined continuously during the attack action, thus dynamically expanding the hit detection area. Then, even if the un-expanded hit detection area (i.e., the reference areas) changes dynamically, the expansion of the hit detection area can be done precisely in response to such changes.

Note that the process of “updating the un-expanded hit detection area in accordance with passage of time” may be a process of changing at least one of the position, the size, and the shape of the hit detection area in accordance with passage of time, or may be a process in which these elements do not change as a result.

In step S8, the processor 81 determines whether the attack action being performed by the player character 201 or an enemy character has hit another object. That is, the processor 81 determines whether the other object is included in the hit detection area defined for the attack action. Note that if the attack action is an action by the player character 201, the hit detection area used for the determination is a hit detection area that is composed of the reference areas defined in step S5 and the additional areas defined in step S7. On the other hand, if the attack action is an action by an enemy character, the hit detection area used for the determination is the reference areas defined in step S5. The determination of whether the attack action by the player character 201 has hit the other object is performed with reference to the enemy character data and the hit detection area data stored in the storage medium. The determination of whether the attack action by an enemy character has hit the player character 201 is performed with reference to the player character data and the hit detection area data stored in the storage medium. If the determination result from step S8 is affirmative, the process of step S9 is executed. On the other hand, if the determination result from step S8 is negative, the process of step S10 is executed, skipping the process of step S9.
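A minimal sketch of the containment determination of step S8, under the same illustrative assumption that both the hit detection area and the target object are represented by spheres (the function name and parameters are hypothetical, not from the patent):

```python
def attack_hits(hit_area, target_center, target_radius):
    # The attack hits if any sphere of the (possibly expanded) hit
    # detection area overlaps the target object's bounding sphere,
    # i.e., the distance between centers is at most the sum of radii.
    tx, ty, tz = target_center
    for (cx, cy, cz), r in hit_area:
        d2 = (cx - tx) ** 2 + (cy - ty) ** 2 + (cz - tz) ** 2
        if d2 <= (r + target_radius) ** 2:
            return True
    return False
```

Passing the reference areas alone would model the enemy-attack case of step S8, while passing the reference areas plus additional areas would model the player-attack case.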

In step S9, the processor 81 executes the process of inflicting damage on the other object (i.e., the enemy character or the player character 201), which has been hit by the attack action. For example, the processor 81 may decrease the hit points of the other object or eliminate the other object from the game space. Then, the processor 81 updates the content of the player character data or the enemy character data stored in the storage medium so as to reflect the result of the process. The process of step S10 is executed, following step S9.

In step S10, the processor 81 generates, and displays on the display 12, a game image representing the game space in which the process results of steps S2, S3, and S9 have been reflected. Specifically, the processor 81 generates a game image representing the game space as viewed from the position of the virtual camera 204, controlled in step S2, and in the direction of the virtual camera 204, wherein the game image represents how the characters act in accordance with the control by the processes of steps S3 and S9. The generated game image is displayed on the display 12. Note that when the process loop including a series of steps S1 to S11 is executed, the process of step S10 is repeatedly executed at a rate of once per the predetermined amount of time. Thus, a video is displayed, showing how the characters act in the game space. Note that while the game system 1 displays an image on the display 12 in the present embodiment, an image may be displayed on another display device (e.g., a monitor connected to the main body apparatus 2) different from the display 12. The process of step S11 is executed, following step S10.

In step S11, the processor 81 determines whether or not to end the game. For example, the processor 81 determines whether an instruction to end the game has been given by the user. If the determination result from step S11 is negative, the process of step S1 is executed again. Thereafter, the series of processes of steps S1 to S11 is repeatedly executed until it is determined in step S11 to end the game. On the other hand, if the determination result from step S11 is affirmative, the processor 81 ends the game process shown in FIG. 18.

[4. Functions/Effects and Variations of Present Embodiment]

As described above, in the embodiment described above, the game program is configured to cause a computer of an information processing apparatus (e.g., the main body apparatus 2) to perform the following processes: controlling a virtual camera in a virtual space (step S2); controlling movement of a player character in the virtual space in response to a move instruction based on an operation input by a user (step S3); controlling an action by the player character in the virtual space in response to an action instruction based on the operation input (step S3); when the player character performs the action, defining, in the virtual space, a hit detection area used for determining whether the action has hit another object other than the player character at a position that is determined based on position and orientation of the player character in the virtual space (step S5), and expanding the hit detection area in a depth direction of the virtual camera (step S7); and if the expanded hit detection area is in contact with the other object, executing a process based on the action against the other object (step S9).

Thus, since the hit detection area is expanded in the depth direction of the virtual camera, even if the position of an object that performs an action and the position of another object are shifted from each other in the depth direction, it is easier for the action to hit the other object. Thus, it is possible to improve the controllability of the action.

The “action” for which a hit detection area is defined is an attack action by the player character against an enemy character in the embodiment described above, and it is possible to improve the controllability of the attack action. Here, in other embodiments, the “action” is not limited to an attack action but may be any kind of action. For example, the game system 1 may define a hit detection area and expand the defined hit detection area in the depth direction of the virtual camera for an action by the player character for obtaining an item placed in the game space, and/or an action for destroying an object placed in the game space.

In the embodiment described above, the hit detection area is defined at the position of a weapon object such as a sword or a hammer to be held by the player character 201. Note however that in other embodiments, the player character 201 does not need to own a weapon object, and the hit detection area may be defined at the position of the player character 201 itself.

The “process based on an action against another object” may be any process that is performed against the other object based on the action. In the embodiment described above, the “process based on an action against another object” is the process of giving damage to the other object. In other embodiments, the “process based on an action against another object” may be a process in which an item is obtained by the character that has performed the action in a case where the other object is an item, for example, or may be a process in which the other object is destroyed or deformed in a case where the other object is an object such as a building or an obstruction placed on the ground.

“Expanding a hit detection area” encompasses any method for expanding the extent of the hit detection area. While the method for expanding the hit detection area in the embodiment described above is a method in which additional areas added to the reference areas are also used as the hit detection area, the method for expanding the hit detection area is not limited thereto. For example, the hit detection area may be expanded by deforming the hit detection area.

The “expanded hit detection area” refers to an area that includes the original hit detection area and an area that is added by the expansion. Specifically, where the hit detection area is expanded by adding additional areas to the reference areas as in the embodiment described above, the “expanded hit detection area” refers to an area that includes the original hit detection area (i.e., the reference areas) and the additional areas added thereto. Where the hit detection area is expanded by deforming the hit detection area as in the variation to be described below (see FIG. 19), the “expanded hit detection area” refers to the entirety of the deformed hit detection area.

FIG. 19 is a view showing an example of a hit detection area according to the variation of the embodiment. FIG. 19 shows the hit detection area to be defined when the player character 201 performs a sword action. In the variation shown in FIG. 19, the un-expanded hit detection area 251 is cubic. For example, the cube shape is arranged with one side being parallel to the depth direction of the virtual camera 204.

In the variation shown in FIG. 19, the game system 1 deforms the hit detection area 251 to form an expanded hit detection area 252. That is, the game system 1 expands the hit detection area by deforming the hit detection area 251 so as to enlarge the hit detection area 251 in the depth direction of the virtual camera 204. Thus, it is possible to expand the hit detection area as in the embodiment described above.

Note that while the example shown in FIG. 19 is directed to a case where a cubic hit detection area is deformed, the game system 1 may deform a spherical hit detection area. For example, the game system 1 may deform a spherical hit detection area into an ellipsoid so as to expand the hit detection area in the depth direction of the virtual camera 204.
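As a hedged sketch of this variation, a point-containment test for a sphere stretched into an ellipsoid along the camera's depth axis might look as follows; the decomposition into components parallel and perpendicular to a unit `camera_forward` vector, and all names and parameters, are illustrative assumptions rather than details from the patent:

```python
def in_depth_stretched_sphere(point, center, radius, camera_forward, stretch):
    # Treat the sphere as deformed into an ellipsoid whose semi-axis
    # along the camera depth direction is `stretch * radius`, while the
    # two perpendicular semi-axes keep the original radius.
    dx = point[0] - center[0]
    dy = point[1] - center[1]
    dz = point[2] - center[2]
    # Component of the offset along the (unit) camera forward vector,
    # and the squared perpendicular remainder.
    par = dx * camera_forward[0] + dy * camera_forward[1] + dz * camera_forward[2]
    perp2 = dx * dx + dy * dy + dz * dz - par * par
    return (par / (stretch * radius)) ** 2 + perp2 / radius ** 2 <= 1.0
```

With `stretch = 1` this reduces to an ordinary sphere test, so the un-expanded and deformed checks could share one code path.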

Note that in other embodiments, the information processing system (i.e., the game system 1) does not need to include some of the components of the embodiment described above and does not need to execute some of the processes that are executed in the embodiment described above. For example, in order to realize a specific one of the advantageous effects of the embodiment described above, the information processing system may include a component or components for realizing the specific advantageous effect and execute a process or processes for realizing the specific advantageous effect, and the information processing system does not need to include other components and does not need to execute other processes.

The embodiment described above can be used in, for example, a game system or a game program, with the aim of making it easier to perform an action to hit an object.

While certain example systems, methods, devices and apparatuses have been described herein, it is to be understood that the appended claims are not to be limited to the systems, methods, devices and apparatuses disclosed, but on the contrary, are intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims

  1. A non-transitory computer-readable storage medium having stored therein a game program, wherein when executed the game program causes a processor of an information processing apparatus to: control a virtual camera in a virtual space; control movement of a player character object in the virtual space in response to a move instruction based on an operation input by a user; control an action by the player character object in the virtual space in response to an action instruction based on the operation input; when the player character object performs the action, define a hit detection area used for determining whether the action has hit another object other than the player character object at a position that is determined based on position and orientation of the player character object in the virtual space, and expand the hit detection area in a depth direction of the virtual camera in the virtual space; and if the expanded hit detection area is in contact with the other object, perform a process based on the action against the other object.
  2. The storage medium according to claim 1, wherein: the hit detection area is a plurality of areas of a predetermined shape that are arranged in a predetermined positional relationship; and the game program when executed further causes the processor to expand the hit detection area by adding, to at least one of the plurality of areas, an area of the same shape as the at least one area at a position that is shifted by a predetermined amount in the depth direction of the virtual camera.
  3. The storage medium according to claim 2, wherein the game program when executed further causes the processor to expand the hit detection area by adding, to at least one of the plurality of areas, an area on a near side thereof in the depth direction of the virtual camera and an area on a far side thereof in the depth direction of the virtual camera.
  4. The storage medium according to claim 1, wherein: the hit detection area is a plurality of areas of a predetermined shape that are arranged in a predetermined positional relationship; and the game program when executed further causes the processor to expand the hit detection area by moving at least one of the plurality of areas in the depth direction of the virtual camera.
  5. The storage medium according to claim 1, wherein the game program when executed further causes the processor to expand the hit detection area by deforming the hit detection area so as to enlarge the hit detection area in the depth direction of the virtual camera.
  6. The storage medium according to claim 1, wherein the hit detection area has a spherical shape.
  7. The storage medium according to claim 1, wherein the hit detection area has a shape that includes a first unit area and a second unit area that are arranged in a predetermined positional relationship, and a connecting area that connects together the first unit area and the second unit area by a predetermined rule.
  8. The storage medium according to claim 1, wherein: the game program when executed further causes the processor to: in response to the action instruction, start a control of the action by the player character object, and define the hit detection area based on position and orientation of the player character object in the virtual space; and for a predetermined period of time after the start of the action, continue the control of the action by the player character object, and update the hit detection area in accordance with passage of time; and updating the hit detection area in accordance with passage of time is done by updating the hit detection area in accordance with passage of time based on a pattern that is associated with the action, and expanding the updated hit detection area in the depth direction of the virtual camera.
  9. The storage medium according to claim 1, wherein: the other object is an enemy character object; the predetermined action is an attack action; and the process based on the action is a process of giving a damage to the enemy character object.
  10. The storage medium according to claim 9, wherein the game program when executed further causes the processor to further: control the enemy character object in the virtual space; if an enemy attack action is performed in which the enemy character object attacks the player character object based on the control of the enemy character object, define an enemy hit detection area used for determining whether the enemy attack action has hit the player character object based on position and orientation of the enemy character object in the virtual space, without expanding the enemy hit detection area in the depth direction of the virtual camera; and if the enemy hit detection area is in contact with the player character object, perform a process of giving a damage to the player character object.
  11. An information processing apparatus comprising a processor and a memory coupled thereto, wherein: the processor is configured to: control a virtual camera in a virtual space; control movement of a player character object in the virtual space in response to a move instruction based on an operation input by a user; control an action by the player character object in the virtual space in response to an action instruction based on the operation input; when the player character object performs the action, define, in the virtual space, a hit detection area used for determining whether the action has hit another object other than the player character object at a position that is determined based on position and orientation of the player character object in the virtual space, and expand the hit detection area in a depth direction of the virtual camera in the virtual space; and if the expanded hit detection area is in contact with the other object, perform a process based on the action against the other object.
  12. The information processing apparatus according to claim 11, wherein: the hit detection area is a plurality of areas of a predetermined shape that are arranged in a predetermined positional relationship; and the processor is configured to expand the hit detection area by adding, to at least one of the plurality of areas, an area of the same shape as the at least one area at a position that is shifted by a predetermined amount in the depth direction of the virtual camera.
  13. The information processing apparatus according to claim 12, wherein the processor is configured to expand the hit detection area by adding, to at least one of the plurality of areas, an area on a near side thereof in the depth direction of the virtual camera and an area on a far side thereof in the depth direction of the virtual camera.
  14. The information processing apparatus according to claim 11, wherein: the hit detection area is a plurality of areas of a predetermined shape that are arranged in a predetermined positional relationship; and the processor is configured to expand the hit detection area by moving at least one of the plurality of areas in the depth direction of the virtual camera.
  15. The information processing apparatus according to claim 11, wherein the processor is configured to expand the hit detection area by deforming the hit detection area so as to enlarge the hit detection area in the depth direction of the virtual camera.
  16. The information processing apparatus according to claim 11, wherein the hit detection area has a spherical shape.
  17. The information processing apparatus according to claim 11, wherein the hit detection area has a shape that includes a first unit area and a second unit area that are arranged in a predetermined positional relationship, and a connecting area that connects together the first unit area and the second unit area by a predetermined rule.
  18. The information processing apparatus according to claim 11, wherein: the processor is configured to: in response to the action instruction, start a control of the action by the player character object, and define the hit detection area based on position and orientation of the player character object in the virtual space; and for a predetermined period of time after the start of the action, continue the control of the action by the player character object, and update the hit detection area in accordance with passage of time; and updating the hit detection area in accordance with passage of time is done by updating the hit detection area in accordance with passage of time based on a pattern that is associated with the action, and expanding the updated hit detection area in the depth direction of the virtual camera.
  19. The information processing apparatus according to claim 11, wherein: the other object is an enemy character object; the predetermined action is an attack action; and the process based on the action is a process of giving a damage to the enemy character object.
  20. The information processing apparatus according to claim 19, wherein the processor is further configured to: control the enemy character object in the virtual space; if an enemy attack action is performed in which the enemy character object attacks the player character object based on the control of the enemy character object, define an enemy hit detection area used for determining whether the enemy attack action has hit the player character object based on position and orientation of the enemy character object in the virtual space, without expanding the enemy hit detection area in the depth direction of the virtual camera; and if the enemy hit detection area is in contact with the player character object, perform a process of giving a damage to the player character object.
  21. An information processing system comprising a processor and a storage medium having stored therein a game program, wherein: the processor is configured to execute the game program to: control a virtual camera in a virtual space; control movement of a player character object in the virtual space in response to a move instruction based on an operation input by a user; control an action by the player character object in the virtual space in response to an action instruction based on the operation input; when the player character object performs the action, define, in the virtual space, a hit detection area used for determining whether the action has hit another object other than the player character object at a position that is determined based on position and orientation of the player character object in the virtual space, and expand the hit detection area in a depth direction of the virtual camera in the virtual space; and if the expanded hit detection area is in contact with the other object, perform a process based on the action against the other object.
  22. The information processing system according to claim 21, wherein: the hit detection area is a plurality of areas of a predetermined shape that are arranged in a predetermined positional relationship; and the processor is configured to expand the hit detection area by adding, to at least one of the plurality of areas, an area of the same shape as the at least one area at a position that is shifted by a predetermined amount in the depth direction of the virtual camera.
  23. The information processing system according to claim 22, wherein the processor is configured to expand the hit detection area by adding, to at least one of the plurality of areas, an area on a near side thereof in the depth direction of the virtual camera and an area on a far side thereof in the depth direction of the virtual camera.
  24. The information processing system according to claim 21, wherein: the hit detection area is a plurality of areas of a predetermined shape that are arranged in a predetermined positional relationship; and the processor is configured to expand the hit detection area by moving at least one of the plurality of areas in the depth direction of the virtual camera.
  25. The information processing system according to claim 21, wherein the processor is configured to expand the hit detection area by deforming the hit detection area so as to enlarge the hit detection area in the depth direction of the virtual camera.
  26. The information processing system according to claim 21, wherein the hit detection area has a spherical shape.
  27. The information processing system according to claim 21, wherein the hit detection area has a shape that includes a first unit area and a second unit area that are arranged in a predetermined positional relationship, and a connecting area that connects together the first unit area and the second unit area by a predetermined rule.
  28. The information processing system according to claim 21, wherein: the processor is configured to: in response to the action instruction, start a control of the action by the player character object, and define the hit detection area based on position and orientation of the player character object in the virtual space; and for a predetermined period of time after the start of the action, continue the control of the action by the player character object, and update the hit detection area in accordance with passage of time; and updating the hit detection area in accordance with passage of time is done by updating the hit detection area in accordance with passage of time based on a pattern that is associated with the action, and expanding the updated hit detection area in the depth direction of the virtual camera.
  29. The information processing system according to claim 21, wherein: the other object is an enemy character object; the predetermined action is an attack action; and the process based on the action is a process of giving a damage to the enemy character object.
  30. The information processing system according to claim 29, wherein: the processor is further configured to: control the enemy character object in the virtual space; if an enemy attack action is performed in which the enemy character object attacks the player character object based on the control of the enemy character object, define an enemy hit detection area used for determining whether the enemy attack action has hit the player character object based on position and orientation of the enemy character object in the virtual space, without expanding the enemy hit detection area in the depth direction of the virtual camera; and if the enemy hit detection area is in contact with the player character object, perform a process of giving a damage to the player character object.
  31. A game processing method to be executed by an information processing system, the method comprising: controlling a virtual camera in a virtual space; controlling movement of a player character object in the virtual space in response to a move instruction based on an operation input by a user; controlling an action by the player character object in the virtual space in response to an action instruction based on the operation input; when the player character object performs the action, defining, in the virtual space, a hit detection area used for determining whether the action has hit another object other than the player character object at a position that is determined based on position and orientation of the player character object in the virtual space, and expanding the hit detection area in a depth direction of the virtual camera in the virtual space; and if the expanded hit detection area is in contact with the other object, performing a process based on the action against the other object.
  31. The game processing method according to claim 31, wherein: the hit detection area is a plurality of areas of a predetermined shape that are arranged in a predetermined positional relationship;and further comprising expanding the hit detection area by adding, to at least one of the plurality of areas, an area of the same shape as the at least one area at a position that is shifted by a predetermined amount in the depth direction of the virtual camera.
  32. The game processing method according to claim 32, further comprising expanding the hit detection area by adding, to at least one of the plurality of areas, an area on a near side thereof in the depth direction of the virtual camera and an area on a far side thereof in the depth direction of the virtual camera.
  33. The game processing method according to claim 31, wherein: the hit detection area is a plurality of areas of a predetermined shape that are arranged in a predetermined positional relationship;and further comprising expanding the hit detection area by moving at least one of the plurality of areas in the depth direction of the virtual camera.
  34. The game processing method according to claim 31, further comprising expanding the hit detection area by deforming the hit detection area so as to enlarge the hit detection area in the depth direction of the virtual camera.
  35. The game processing method according to claim 30, wherein the hit detection area has a spherical shape.
  36. The game processing method according to claim 30, wherein the hit detection area has a shape that includes a first unit area and a second unit area that are arranged in a predetermined positional relationship, and a connecting area that connects together the first unit area and the second unit area according to a predetermined rule.
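The shape of claim 36 reads like a capsule: two spherical unit areas whose centers are bridged by a connecting area. A common rule for the connecting area, assumed here rather than stated in the claim, is every point within the radius of the segment between the two centers, which gives a point-to-segment distance test:

```python
def in_capsule(a, b, radius, point):
    """Hit test against a capsule: two unit spheres centered at `a` and `b`
    plus the connecting area swept along the segment between them."""
    ab = tuple(y - x for x, y in zip(a, b))
    ap = tuple(p - x for x, p in zip(a, point))
    ab2 = sum(x * x for x in ab)
    # Parameter of the closest point on segment a-b, clamped to [0, 1].
    t = 0.0 if ab2 == 0 else max(0.0, min(1.0, sum(x * y for x, y in zip(ap, ab)) / ab2))
    closest = tuple(x + t * y for x, y in zip(a, ab))
    d2 = sum((p - c) ** 2 for p, c in zip(point, closest))
    return d2 <= radius ** 2
```

A capsule oriented along the camera's depth axis also doubles as one realization of the depth-expanded area of claim 30.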
  37. The game processing method according to claim 30, further comprising:
in response to the action instruction, starting a control of the action by the player character object, and defining the hit detection area based on position and orientation of the player character object in the virtual space; and
for a predetermined period of time after the start of the action, continuing the control of the action by the player character object, and updating the hit detection area in accordance with passage of time,
wherein the updating of the hit detection area comprises updating the hit detection area in accordance with passage of time based on a pattern that is associated with the action, and expanding the updated hit detection area in the depth direction of the virtual camera.
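The per-frame update of claim 37 can be sketched with an assumed pattern table: each action maps frame indices to character-local reach distances and radii, and each frame the entries are placed relative to the character's current position and facing before the depth expansion is applied. `SWING_PATTERN` and `hit_area_for_frame` are hypothetical names, not from the patent.

```python
# Hypothetical pattern for a sword swing: frame -> list of (forward_reach, radius).
SWING_PATTERN = {
    0: [(1.0, 0.5)],              # wind-up: short reach
    1: [(1.5, 0.6), (2.2, 0.4)],  # full extension: two spheres along the blade
    2: [(1.0, 0.5)],              # follow-through
}

def hit_area_for_frame(pattern, frame, position, facing):
    """Place each (reach, radius) entry `reach` units ahead of the character
    along its unit `facing` vector; returns a list of (center, radius) spheres.
    Frames outside the pattern yield an empty area (no active hitbox)."""
    return [
        (tuple(p + f * reach for p, f in zip(position, facing)), radius)
        for reach, radius in pattern.get(frame, [])
    ]
```

Driving the area from a per-action table keeps the hitbox in sync with the animation without per-frame authoring code, which matches the claim's "pattern that is associated with the action".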
  38. The game processing method according to claim 30, wherein:
the other object is an enemy character object;
the action is an attack action; and
the process based on the action is a process of giving damage to the enemy character object.
  39. The game processing method according to claim 38, further comprising:
controlling the enemy character object in the virtual space;
if an enemy attack action is performed in which the enemy character object attacks the player character object based on the control of the enemy character object, defining an enemy hit detection area used for determining whether the enemy attack action has hit the player character object based on position and orientation of the enemy character object in the virtual space, without expanding the enemy hit detection area in the depth direction of the virtual camera; and
if the enemy hit detection area is in contact with the player character object, performing a process of giving damage to the player character object.
