U.S. Pat. No. 12,023,577

GAME PROCESSING SYSTEM, METHOD OF PROCESSING GAME, AND STORAGE MEDIUM STORING PROGRAM FOR PROCESSING GAME

Assignee: GREE, INC.

Issue Date: March 30, 2023

Illustrative Figure

Abstract

A game processing system according to an aspect is a game processing system for processing a game having a first mode and a second mode, the system comprising: a storage storing a game parameter relating to the game; and one or more computer processors. The one or more computer processors execute computer-readable instructions to: provide, in the first mode, an interaction with a virtual character in accordance with first detection information obtained by an information processing device not mounted on a head of a player; provide, in the second mode, an interaction with the virtual character in accordance with second detection information obtained by a head mounted display mounted on the head of the player; and update the game parameter in accordance with progress of the game in the first mode and the second mode.

Description


DESCRIPTION OF THE EMBODIMENTS

Various embodiments of the disclosure will be described hereinafter with reference to the accompanying drawings. Throughout the drawings, the same or similar components are denoted by the same reference numerals.

With reference to FIGS. 1 to 3, a game processing system according to an embodiment of the disclosure will be described. FIG. 1 is a block diagram of a game processing system 1 according to an embodiment, FIG. 2 schematically illustrates a head mounted display 10 (hereinafter referred to as “HMD 10”) used in the game processing system 1, and FIG. 3 schematically illustrates the HMD 10 worn by a player 5.

The game processing system 1 according to an embodiment realizes various games by executing game processing programs according to an embodiment. Using the game processing system 1, the player is able to play various games. For example, the game processing system 1 can realize a game in which a virtual character and the player interact in a virtual space. The game realized by the game processing system 1 has a first mode and a second mode. The game realized by the game processing system 1 includes a switch process to switch to the second mode. The same game contents are used in the first mode and the second mode. Game contents are electronic data that are used in a game and can be acquired, owned, used, managed, exchanged, integrated, reinforced, sold, abandoned, or donated in the game by users. The game contents include any content such as a card, an item, a virtual currency, a ticket, a character, an avatar, level information, status information, parameter information (health, attack, and the like), and statistic information (skills, abilities, magic, jobs, and the like). However, the way the game contents are used may not be limited to those described in this specification. An example of the first mode is a chat mode, and an example of the second mode is a VR mode. The first mode and the second mode will be described later.

The game processing system 1 includes the HMD 10 and a server 50. The HMD 10 and the server 50 are communicably interconnected over a network 40.

As shown in FIG. 2, the HMD 10 includes an attachment 11 that is to be fitted on a human head and an information processing device 20 attached to the attachment 11.

The attachment 11 includes a goggle-like casing 11a having an opening 11d formed therein, a first belt 11b and a second belt 11c attached to the casing 11a, and a cover 11e. The cover 11e is attached to the casing 11a such that it is openable and closable. The opening 11d is formed in a front portion of the casing 11a. FIG. 2 shows the state where the cover 11e is open. When the HMD 10 is used, the cover 11e is closed so as to close the opening 11d.

The information processing device 20 is detachably provided on an inner side of the cover 11e of the HMD 10. The information processing device 20 includes a display 24. The information processing device 20 is attached to the cover 11e such that the display 24 faces the inside of the casing 11a when the cover 11e is closed.

The information processing device 20 is attached to the attachment 11 when the HMD 10 is used. When the HMD 10 is not used, the information processing device 20 is detached from the attachment 11.

In the illustrated embodiment, the information processing device 20 is a smartphone. In addition to the smartphone, the information processing device 20 may be a mobile phone, a tablet, a personal computer, an electronic book reader, a wearable computer, a game console, or any other information processing device. The information processing device 20 detached from the HMD 10 is operated by the player 5 in the first mode as described later. In addition to games, the information processing device 20 detached from the HMD 10 may be used for telephone calls and Internet access depending on its originally intended use.

The information processing device 20 includes the display 24 as described above. In the illustrated embodiment, when the information processing device 20 is mounted on the attachment 11, the display 24 serves as an apparatus for displaying an image in the HMD 10. Accordingly, when the HMD 10 is used to play a game, the display 24 displays a virtual space and a virtual character(s) of the game, and other images related to the game.

The shape of the attachment 11 is not limited to the illustrated goggle type. The attachment 11 may include a structure of any shape that moves following the movement of the head of the player who wears the HMD, and the structure can place the display 24 in front of the wearer's eye(s) while the HMD is worn. For example, the attachment 11 may have an eyeglasses-like shape, a hat-like shape, or a helmet-like shape. In order to enhance the player's sense of immersion, the HMD 10 may be configured such that the display 24 covers both eyes of the player when the attachment 11 is attached to the head of the player.

When the HMD 10 is used, the HMD 10 is mounted on the head of the player 5 via the attachment 11, as shown in FIG. 3. The information processing device 20 is mounted on the attachment 11 attached to the head of the player 5.

The information processing device 20 will be further described referring again to FIG. 1. As illustrated, in one embodiment, the information processing device 20 includes a computer processor 21, a memory unit 22, a communication I/F 23, a display 24, a sensor unit 25, a sound collector 26, and a storage 27.

The computer processor 21 is a computing device which loads various programs realizing an operating system and game logics from the storage 27 or other storage into the memory unit 22 and executes instructions included in the loaded programs. The computer processor 21 is, for example, a CPU, an MPU, a DSP, a GPU, any other computing device, or a combination thereof. The processor 21 may be realized by means of an integrated circuit such as an ASIC, a PLD, an FPGA, an MCU, or the like. Although the computer processor 21 is illustrated as a single component in FIG. 1, the computer processor 21 may be a collection of a plurality of physically separate computer processors. In this specification, a program, or instructions included in the program, that are described as being executed by the computer processor 21 may be executed by a single computer processor or executed by a plurality of computer processors distributively. Further, a program, or instructions included in the program, executed by the computer processor 21 may be executed by a plurality of virtual computer processors.

The memory unit 22 is used to store instructions that may be executed by the computer processor 21 and other various data. At least a part of the game processing program in the embodiment is loaded into the memory unit 22 at appropriate timings in accordance with the progress of the game. The memory unit 22 is, for example, a main storage device (main memory) that the computer processor 21 is able to access at high speed. The memory unit 22 may be, for example, a RAM such as a DRAM or an SRAM.

The communication I/F 23 may be implemented as hardware, firmware, or communication software such as a TCP/IP driver or a PPP driver, or a combination thereof. The information processing device 20 is able to transmit and receive data to and from other devices via the communication I/F 23.

The display 24 includes a display panel 24a and a touch-screen panel 24b. For example, the touch-screen panel 24b is laminated on an upper or lower surface of the display panel 24a. The display panel 24a is a liquid crystal panel, an organic EL panel, an inorganic EL panel, or any other display panel capable of displaying an image. The touch-screen panel 24b is configured to detect touch interactions (touch operations) performed by the player. The touch-screen panel 24b can detect various touch operations such as tapping, double tapping, and dragging performed by the player. The touch-screen panel 24b may include a capacitive proximity sensor and may be capable of detecting a non-contact operation performed by the player.

The sensor unit 25 includes one or more sensors. The sensor unit 25 includes, for example, at least one selected from the group consisting of a gyro sensor, an acceleration sensor, and a geomagnetic sensor. The sensor unit 25 may include an eye tracking sensor that directly detects the player's eye movements. The eye tracking sensor is, for example, an eye-gaze tracking sensor that emits a near-infrared light beam into the iris and detects its reflected light. The position and the direction of the head of the player 5 wearing the HMD 10 are specified based on detection information obtained by the sensor unit 25 as described later.

The sound collector 26 is capable of collecting sound and voice. The sound collector 26 is, for example, a microphone. Sound and voice of the player 5 are detected based on audio information collected by the sound collector 26.

The storage 27 is an external storage device accessed by the computer processor 21. The storage 27 is, for example, a magnetic disk, an optical disk, a semiconductor memory, or various other storage devices capable of storing data. Various programs such as a game processing program are stored in the storage 27. The storage 27 may also store various data used in a game(s). At least some of the programs and various data that can be stored in the storage 27 may be stored in a storage that is physically separated from the information processing device 20.

In the illustrated embodiment, the storage 27 stores image data 28a, chat data 28b, scenario data 28c, parameter data 28d, game progress data 28e, and various other data necessary for progress of the game.

The image data 28a includes data for drawing a background in a virtual space where a game is executed, data for drawing a virtual character, and data for drawing an object other than the virtual character used in the game. The image data 28a may include information about the position of an object in the virtual space.

The chat data 28b includes data for drawing an icon of a virtual character, data for drawing an icon of the player 5, data for specifying a plurality of messages from a virtual character, data representing options of a response message to the plurality of messages from the virtual character, and any other data used in the chat. The plurality of messages from the virtual character may be defined in a tree structure in which nodes corresponding to each message are interconnected by arcs. In the tree structure, for example, more than one arc extends from a start message which is the root node existing at the top, and each arc is connected to a node situated at a lower level. Arcs also extend from the lower nodes and are connected to nodes at a further lower level. The nodes at the lower levels each correspond to a possible message from a virtual character that may be displayed after the start message. The chat data 28b may include a mode switch condition that is a condition for starting a mode switch from the chat mode to the VR mode. The mode switch condition may include, for example, that the elapsed time since the game was started in the chat mode is equal to or longer than a predetermined length of time, that a chat has progressed to a terminal node of messages having the tree structure, and any other conditions. The chat data 28b may include data indicating a message that is to be displayed when the chat mode is resumed after selection of a switch object is not completed.
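The tree structure described above can be sketched as a simple linked data structure. The following is an illustrative sketch only, not taken from the patent: the class and message names are hypothetical, each node holds one message from the virtual character, and its arcs map a player response option to the next node.

```python
# Hypothetical sketch of the message tree in the chat data: nodes are
# virtual-character messages, arcs map player responses to follow-ups.

class MessageNode:
    def __init__(self, message, responses=None):
        self.message = message            # message shown from the virtual character
        self.responses = responses or {}  # player response text -> next MessageNode

    def is_terminal(self):
        # A terminal node has no further arcs; reaching one may satisfy
        # the mode switch condition described above.
        return not self.responses

# Build a tiny tree: the root is the start message.
leaf = MessageNode("See you in the VR mode!")
root = MessageNode(
    "Hello! How are you today?",
    {"I'm fine": MessageNode("Great! Want to meet?", {"Sure": leaf}),
     "Not great": MessageNode("Oh no. Let's talk later.")},
)

# Walk one path by picking responses, as the chat mode would.
node = root
for choice in ("I'm fine", "Sure"):
    node = node.responses[choice]
print(node.is_terminal())  # True: the chat reached a terminal node
```

A real implementation would load such nodes from the stored chat data 28b rather than building them inline; the dictionary-of-arcs representation is just one convenient encoding of the node-and-arc structure the text describes.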

The scenario data 28c includes data defining a scenario experienced by the player 5 in the second mode (e.g., the VR mode) of the game. When there are two or more scenarios to be experienced by the player 5 in the game, the scenario data 28c may be defined for each of the scenarios.

With reference to FIG. 4, scenario data used in the game processing system according to an embodiment will be described. FIG. 4 schematically shows the scenario data 28c used in the game processing system 1. The scenario data 28c is a data set for defining a scenario to be executed in the VR mode. In the illustrated embodiment, the scenario data 28c includes first scenario data 28c1 to fifth scenario data 28c5 corresponding to a first scenario to a fifth scenario, respectively. There may be six or more scenarios or four or fewer scenarios in the VR mode. The data structure of the first scenario data 28c1 corresponding to the first scenario will now be described. The description of the first scenario data 28c1 below also applies to the second scenario data 28c2 to the fifth scenario data 28c5.

In the illustrated embodiment, the first scenario data 28c1 includes opening scene data A1, basic scene data B1 to B3, additional scene data C1 to C3, and ending scene data D1 to D2. In one embodiment, the opening scene data A1, the basic scene data B1 to B3, the additional scene data C1 to C3, and the ending scene data D1 to D2 may each include data concerning actions of a virtual character in a corresponding scene, data concerning moving images used in the scene, data concerning actions that the player can do responsive to the virtual character in the scene, and any other data necessary for executing the scene.

In the illustrated embodiment, the opening scene data A1 includes data concerning an action of the virtual character in an opening scene executed after the first scenario has started, data concerning a moving image used in the opening scene, data concerning actions that the player can do responsive to the virtual character in the opening scene, and any other data necessary for executing the opening scene.

The basic scene data B1 to B3 include data relating to actions of the virtual character in basic scenes executed after the opening scene, data relating to a moving image used in the basic scene, data relating to actions that the player can make to the virtual character in the basic scene, and any other data necessary for executing the basic scene. The basic scene data B1 to B3 may each include information indicating conditions for switching to an additional scene to determine whether switching to the additional scene is possible or not. The condition for switching to the additional scene may be determined for each basic scene. The condition for switching to the additional scene is, for example, that the player gazes at a predetermined object for a predetermined period of time or longer, that the player does not look fixedly in a predetermined direction, or that the player does a prescribed action. Other than the above-mentioned ones, the conditions for switching to the additional scene can be appropriately determined depending on the story of a scenario, a type of an object appearing in the virtual space, and other factors.

The additional scene data C1 to C3 include data concerning actions of the virtual character in an additional scene, data concerning a moving image used in the additional scene, data concerning actions that the player can do responsive to the virtual character in the additional scene, and any other data necessary for executing the additional scene. It is determined whether the condition for switching to the additional scene is satisfied during or after the basic scene is executed, and then the additional scene is executed in accordance with the determination result.

The ending scene data D1 to D2 may include data necessary for executing an ending scene before the second mode is terminated. The ending scene data D1 to D2 include data concerning actions of the virtual character in the ending scene, data concerning a moving image used in the ending scene, data concerning actions that the player can do responsive to the virtual character in the ending scene, and any other data necessary for executing the ending scene.

The parameter data 28d includes a game parameter relating to a game realized in the game processing system 1. The game parameter may be a player character parameter relating to a player character. The player character parameter may include a parameter representing a virtual character's favorability to a user character. Here, the virtual character is a non-player character. The game parameter may be updated as the game progresses. The parameter relating to the player character is not limited to the parameter representing the virtual character's favorability to the user character. The parameter relating to the player character is appropriately determined according to the type, nature, or worldview of the game, or other elements.

The game progress data 28e includes data used for managing the progress of the game. The game progress data 28e may be updated as the game progresses. The game progress data 28e may include, for example, data related to points acquired by the player in the game, and any other various types of data that may vary depending on the progress of the game.

The components and functions of the information processing device 20 shown in FIG. 1 are examples. The information processing device 20 applicable to the invention may include various components that are not shown. For example, the information processing device 20 may be provided with a speaker for outputting sound effects of the game and the sound and voice of the virtual character.

Next, functions of the HMD 10 will be described. In the illustrated embodiment, various functions of the HMD 10 are realized by the computer processor 21 of the information processing device 20 executing computer-readable instructions. The instructions executed by the computer processor 21 include instructions included in the game processing program according to an embodiment.

When the game processing program according to the embodiment is executed by the computer processor 21, the game having the first mode and the second mode different from the first mode is realized in the game processing system 1. The game realized in the game processing system 1 may further have a mode other than the first mode and the second mode.

In the first mode of the game realized in the game processing system 1, processing relating to the game is performed based on first detection information detected by the information processing device 20 when the HMD 10 is not mounted on the player 5, that is, when the information processing device 20 is detached from the attachment 11. The first detection information may include information concerning a touch operation of the player 5 detected via the touch-screen panel 24b of the information processing device 20, information concerning voice of the player detected by the sound collector 26, and any other detection information that can be detected in the information processing device 20 when the HMD 10 is not attached to the player 5. In the first mode, the player 5 is able to perform operations relating to the game using the information processing device 20 that is removed from the attachment 11. Since the first mode is designed to play the game when the HMD 10 is not mounted on the player 5, it is possible to display a non-stereoscopic image on the display 24.

In the second mode of the game realized in the game processing system 1, the game is played and progresses using second detection information detected by the HMD 10 attached to the head of the player 5. The second detection information is, for example, detection information obtained by the sensor unit 25. Based on the second detection information, head tracking information for determining the orientation of the head of the player 5 is calculated. The head tracking information may be information indicating the position of the head of the player 5 in addition to the orientation of the head of the player 5. A process for progressing the game in the second mode may be performed based on, for example, the head tracking information calculated based on the second detection information. The process for progressing the game in the second mode may be performed using any other information in addition to the head tracking information. In preparation for playing the game in the second mode, the player 5 attaches the information processing device 20 to the attachment 11, and places the attachment 11 with the information processing device 20 on his/her head. As described above, the second mode is designed such that the game is played while the HMD 10 is worn by the player 5, so in one embodiment, a stereoscopic image that is stereoscopically viewed by the player 5 is displayed on the display 24 in the second mode. The stereoscopic image is displayed on the display 24 by, for example, a parallax barrier method. In the parallax barrier method, a left eye image and a right eye image are displayed on the display 24. The stereoscopic image is a set of images including the left eye image and the right eye image configured to be stereoscopically viewed when displayed on the display utilizing the parallax of the left and right eyes.
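As one hedged illustration of how head tracking information might be derived from the second detection information, the sketch below integrates angular rates from a hypothetical gyro sensor into a yaw/pitch estimate and converts that estimate into a gaze direction vector. The function names, axis conventions, and sampling rate are assumptions for illustration, not the patent's implementation.

```python
import math

# Hypothetical sketch: estimate head orientation by integrating
# gyro-sensor angular velocities, then derive a gaze direction vector.

def integrate_orientation(gyro_samples, dt):
    """gyro_samples: iterable of (yaw_rate, pitch_rate) in rad/s; dt: seconds per sample."""
    yaw = pitch = 0.0
    for yaw_rate, pitch_rate in gyro_samples:
        yaw += yaw_rate * dt
        pitch += pitch_rate * dt
    return yaw, pitch

def gaze_direction(yaw, pitch):
    # Unit vector the player faces, under a y-up, z-forward convention.
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

# 100 samples at 100 Hz of a steady 0.5 rad/s turn of the head.
yaw, pitch = integrate_orientation([(0.5, 0.0)] * 100, dt=0.01)
print(round(yaw, 2))  # about 0.5 rad of yaw after one second
```

A production head tracker would fuse gyro, accelerometer, and geomagnetic readings (as the sensor unit 25 suggests) and correct drift; plain integration is only the simplest recoverable step of that pipeline.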

The first mode is, for example, a chat mode. The chat mode provides a function that allows the player to chat with a virtual character via a text message. In the first mode, the player can experience interaction with a virtual character by chatting with the virtual character. Processes performed in the chat mode will be described later in detail. Here, the interaction means, in a broad sense, that the virtual character reacts to an action made by the player. The interaction with the virtual character includes an interaction performed as communication with the virtual character, such as conversation with the virtual character. In this specification, an interaction performed as communication with a virtual character may also be referred to as a communicative interaction. In addition to the communicative interaction, the interaction with a virtual character may include a battle against the virtual character, a cooperative play to play the game in cooperation with the virtual character, and other interactions with a virtual character. In this specification, an interaction performed as a battle against a virtual character may be referred to as a battle interaction. In this specification, an interaction performed as a cooperative play with a virtual character may be referred to as a cooperative interaction.

The second mode is, for example, the VR mode. The VR mode provides a function that allows the player to perform various interactions with a virtual character, which is a non-player character appearing in the virtual space displayed on the display of the information processing device 20. Processes performed in the VR mode will be described later in detail. The VR mode is an example of the second mode, and the second mode may include any game mode in which a process for progressing the game is performed using the head tracking information.

In one embodiment, a game having the first mode and the second mode that is realized by the game processing system 1 may be a game in which a player performs interactions with a virtual character other than the communicative interaction. In the game realized by the game processing system 1, the communicative interaction may not be performed. The game according to one embodiment is played in a two-dimensional space in the first mode and played in a three-dimensional space in the second mode. The game according to one embodiment is played in a three-dimensional space in the first mode, whereas in the second mode, the game is played in a three-dimensional space displayed in a different manner than the three-dimensional space of the first mode (or in a three-dimensional space configured in a different manner than the three-dimensional space of the first mode). In one embodiment, the game realized by the game processing system 1 may use a game content(s) common to the first mode and the second mode. A content parameter associated with the game content may be carried over between the first mode and the second mode. For example, when a value of the content parameter of a specific game content is changed while playing the game in the first mode and thereafter the game switches to the second mode, the specific game content with the changed content parameter is used in the second mode. In one embodiment, a value of a content parameter of a specific game content may be changed when the game is switched from the first mode to the second mode and/or when the game is switched from the second mode to the first mode. The change of the content parameter of the game content may be either advantageous or disadvantageous for the player 5 in progressing the game. In one embodiment, a game play result in the first mode may be reflected in the game played in the second mode, and a play result in the second mode may be reflected in the game played in the first mode. For example, experience points of the player 5 acquired in the first mode may be carried over to the second mode.
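The carry-over of a content parameter across the mode switch can be illustrated with a minimal sketch. All names here (GameContent, Game, the +1 adjustment applied on switching) are hypothetical assumptions chosen for illustration: the same content object is used in both modes, so a change made in the first mode remains visible after the switch to the second mode.

```python
# Hypothetical sketch of a game content whose parameter is shared by
# the first (chat) mode and the second (VR) mode.

class GameContent:
    def __init__(self, name, parameter):
        self.name = name
        self.parameter = parameter  # e.g. experience points or favorability

class Game:
    def __init__(self):
        self.mode = "chat"                        # first mode
        self.content = GameContent("avatar", 10)  # content common to both modes

    def play_first_mode(self):
        # Progress in the first mode changes the content parameter.
        self.content.parameter += 5

    def switch_to_vr(self):
        self.mode = "vr"
        # The switch itself may also adjust the parameter, which can be
        # advantageous or disadvantageous for the player.
        self.content.parameter += 1

game = Game()
game.play_first_mode()
game.switch_to_vr()
print(game.mode, game.content.parameter)  # vr 16
```

The key design point mirrored from the text is that the parameter lives on the shared content object, not on a per-mode copy, so no explicit transfer step is needed when the mode changes.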

As described above, the first mode and the second mode of the game realized in the game processing system 1 are distinguished from each other. That is, the first mode is different from the second mode. In one embodiment, when the game implemented in the game processing system 1 is played in the first mode, the HMD 10 is not attached to the head of the player 5, whereas when the game is played in the second mode, the HMD 10 is attached to the head of the player 5. In the first mode, the game is processed based on the first detection information obtained by the information processing device 20 that is not attached to the head of the player 5, whereas in the second mode, the game is processed based on the second detection information obtained by the HMD 10 that is attached to the head of the player 5.

In one embodiment, a stereoscopic image is displayed on the display 24 in the second mode, whereas in the first mode, a non-stereoscopic image is displayed on the display 24 as described above. At least an image of a virtual character among the images used in the game is displayed as a stereoscopic image in the second mode, whereas in the first mode, the image of the virtual character is displayed as a non-stereoscopic image.

In one embodiment, a process of progressing the game in the first mode is performed without using the head tracking information calculated based on the second detection information, whereas in the second mode, a process of progressing the game is performed based on the head tracking information.

In one embodiment, the process of progressing the game in the first mode is performed in accordance with a touch operation detected via the touch-screen panel 24b, whereas in the second mode, the process is not performed in accordance with the touch operation on the touch-screen panel 24b.

In one embodiment, in the case where an interaction with a virtual character is provided in a game implemented in the game processing system 1, the interaction with the virtual character is provided based on the first detection information obtained by the information processing device 20 that is not attached to the head of the player 5 in the first mode. In the second mode, the interaction with the virtual character is provided based on the second detection information obtained by the HMD 10 attached to the head of the player 5.

After the game is started, it is possible to switch between the first mode and the second mode, which are distinguished as described above. With reference to FIG. 5, the outline of the mode switch of the game processed by the game processing system 1 will be described. As shown, when the game is started in step S1, the chat mode, which is the first mode, is started in step S2. In this chat mode, upon start of a switching process for switching to the VR mode, which is the second mode, the process shifts to step S3 and the switching process to the second mode is carried out. When the switching process is completed, the VR mode, which is the second mode, is started in step S4. When the VR mode is terminated or interrupted, a return process to the chat mode is performed.
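The flow of FIG. 5 can be sketched as a small state machine. This is a hypothetical illustration, not the patent's code; the state names mirror steps S1 to S4 and the return process described above.

```python
# Hypothetical state machine mirroring FIG. 5:
# start (S1) -> chat mode (S2) -> switching (S3) -> VR mode (S4),
# with a return process back to the chat mode when the VR mode ends.

class ModeController:
    def __init__(self):
        self.state = "started"        # step S1

    def start_chat_mode(self):
        self.state = "chat"           # step S2, the first mode

    def begin_switch(self):
        if self.state == "chat":
            self.state = "switching"  # step S3, the switching process

    def complete_switch(self):
        if self.state == "switching":
            self.state = "vr"         # step S4, the second mode

    def end_vr(self):
        if self.state == "vr":
            self.state = "chat"       # return process to the chat mode

ctrl = ModeController()
ctrl.start_chat_mode()
ctrl.begin_switch()
ctrl.complete_switch()
print(ctrl.state)  # vr
ctrl.end_vr()
print(ctrl.state)  # chat
```

Guarding each transition on the current state keeps invalid jumps (e.g. entering the VR mode without completing the switching process) from occurring, which matches the strictly ordered flow of the figure.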

Functions realized by the computer processor 21 will now be described more specifically. The computer processor 21 functions as a chat mode execution unit 21a, a switch processing unit 21b, a VR mode execution unit 21c, and a parameter management unit 21d by executing computer-readable instructions included in the game processing program. At least some of the functions that can be realized by the computer processor 21 may be realized by a computer processor other than the computer processor 21 of the game system 1. For example, at least some of the functions realized by the computer processor 21 may be realized by a computer processor mounted on the server 50.

The chat mode execution unit 21a performs processing for providing the game in the chat mode as the first mode by executing a game processing program according to an embodiment. The chat mode execution unit 21a implements a function that allows the player 5 (or a character of the player 5) to chat with a virtual character that is the non-player character. After the player 5 logs in to the chat mode, the chat mode execution unit 21a displays a message from the virtual character and a message input or selected by the player 5 on the display 24, and enables the player 5 to chat with the virtual character. Following the message from the virtual character, the chat mode execution unit 21a may display on the display 24 several response options for the player 5 to respond to the message. One option is selected from the response options according to an operation of the player 5, and a message corresponding to the selected option is displayed on the display 24 as a message from the player 5. The player 5 can select a desired one from the displayed response options by touching the touch-screen panel 24b. The message from the virtual character and the response options for the player to respond to the message can be specified by referring to the chat data 28b in accordance with the game processing program. The message from the virtual character may be displayed together with an image of the virtual character, and the message from the player 5 may be displayed together with an image of the player 5. The message from the virtual character may include a message for prompting the player to switch to the VR mode, which is the second mode, a message for allowing the player 5 to select an option(s) for setting of the VR mode, and any other message associated with the VR mode.

In the chat mode, a switch start object for starting the switch to the VR mode as the second mode is displayed in accordance with the progress of the chat. Display of the switch start object is performed in accordance with the game processing program when the mode switch condition is satisfied. The chat mode execution unit 21a is capable of detecting that the switch start object has been selected based on a detection signal from the touch-screen panel 24b or any other user interface. When it is detected that the switch start object has been selected, the process for switching to the VR mode, which is the second mode, is started.

The switch processing unit 21b performs the process for switching from the first mode to the second mode. The switch process may include displaying guidance on the display 24 to prompt the player to attach the information processing device 20 to the attachment 11, and displaying the switch object on the display 24 such that it is selectable by the player's gazing. After displaying the switch object, the switch processing unit 21b receives a detection signal from the sensor unit 25 or another sensor and determines, based on the detection signal, whether the switch object has been selected by gazing. For this determination, the switch processing unit 21b calculates the position and orientation of the HMD 10 based on the detection signal from the sensor unit 25, and specifies the position (point) at which the player 5 gazes based on the calculated position and orientation of the HMD 10. The gazing point can be specified by various known methods. For example, the gazing point may be specified based on the detection signal of an eye tracking sensor. For example, the switch processing unit 21b measures the duration of time (gazing duration) during which the gazing point stays on the switch object, and when the gazing duration reaches a predetermined length, the switch processing unit 21b determines that the selection of the switch object has been completed. After the switch processing unit 21b determines that the selection of the switch object has been completed, it starts the VR mode, which is the second mode.
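The gazing-duration determination described above can be sketched as follows. This is an illustrative Python sketch, not code from the patent; the class name, the two-second dwell threshold, and the assumption that the dwell must be continuous are all hypothetical.

```python
DWELL_SECONDS = 2.0  # assumed threshold; the patent leaves the length unspecified

class GazeSelector:
    """Accumulates the time the gazing point stays on an object and reports
    when the accumulated gazing duration reaches the required length."""

    def __init__(self, dwell_seconds=DWELL_SECONDS):
        self.dwell_seconds = dwell_seconds
        self.gazing_time = 0.0

    def update(self, gaze_on_object: bool, dt: float) -> bool:
        """Advance by dt seconds; return True once selection completes."""
        if gaze_on_object:
            self.gazing_time += dt
        else:
            self.gazing_time = 0.0  # assumption: the dwell must be continuous
        return self.gazing_time >= self.dwell_seconds
```

A frame loop would call `update()` once per frame, passing whether the gazing point currently overlaps the switch object and the elapsed frame time.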

The VR mode execution unit 21c displays an image of the virtual character in the virtual space in accordance with the game processing program and realizes interactions between the virtual character and the player 5. The VR mode execution unit 21c serves as a scene selection unit 21c1, a detection unit 21c2, an action determination unit 21c3, and an image generation unit 21c4 when the game processing program according to one embodiment is executed by the processor 21.

The scene selection unit 21c1 selects a game scene to be executed based on the scenario data 28c and other information as required. After login to the VR mode, the scene selection unit 21c1 selects the opening scene. After the opening scene is executed, the scene selection unit 21c1 selects the next scene to be executed based on the detection information provided by the HMD 10, information stored in the storage 27 (for example, the parameter data 28d), and any other information as necessary.

The detection unit 21c2 calculates head tracking information for determining the orientation and/or the position of the HMD 10 based on the detection signal from the sensor unit 25 of the HMD 10. The head tracking information is information indicating the orientation and/or the position of the HMD 10 and is determined based on the detection information from the sensor unit 25. More specifically, the head tracking information is obtained such that the position and orientation of the HMD 10 mounted on the head are calculated as a position in a three-dimensional orthogonal coordinate system and an angle around each axis. The three-dimensional orthogonal coordinate system is, for example, an orthogonal coordinate system composed of a roll axis along the front-rear direction, a yaw axis along the vertical direction, and a pitch axis along the left-right direction. The detection unit 21c2 generates a virtual space depending on the determined position and/or orientation of the HMD 10 and outputs image information for depicting the virtual space to the display 24. For example, the detection unit 21c2 can calculate the gazing direction or the gazing point based on the position and/or the orientation of the HMD 10.
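As a sketch of how a gazing direction might be derived from the roll/yaw/pitch convention above (Python; the exact axis convention and the function name are assumptions, not taken from the patent):

```python
import math

def gaze_direction(yaw: float, pitch: float) -> tuple:
    """Unit forward vector of the HMD from yaw (rotation around the vertical
    axis) and pitch (rotation around the left-right axis), both in radians.
    Roll does not move the forward vector, so it is omitted here."""
    x = math.cos(pitch) * math.sin(yaw)   # left-right component
    y = math.sin(pitch)                   # vertical component
    z = math.cos(pitch) * math.cos(yaw)   # front-rear component
    return (x, y, z)
```

With yaw and pitch both zero, the vector points straight ahead along the front-rear (roll) axis; the gazing point can then be found by casting this ray into the virtual space.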

The action determination unit 21c3 determines an action of the player 5 using the HMD 10 based on the head tracking information. Moreover, the action determination unit 21c3 may specify an action (reaction) of the virtual character in the virtual space according to the determined action of the player 5. The actions of the player 5 wearing the HMD 10 include nodding, shaking his/her head, any other actions accompanied by motions of the head, and eye movements. The action of the player 5 that causes the virtual character to make a reaction may include the player's eye movement. Such eye movement can be detected by the eye tracking sensor included in the sensor unit 25.
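A minimal sketch of distinguishing a nod from a head shake using sampled head angles follows (Python; the threshold value, function name, and range-based heuristic are illustrative assumptions):

```python
def classify_answer(pitch_samples, yaw_samples, threshold=0.2):
    """Classify a head motion from sampled angles (radians): 'yes' for
    nodding (dominant pitch swing), 'no' for shaking (dominant yaw swing),
    None when neither swing exceeds the threshold."""
    pitch_range = max(pitch_samples) - min(pitch_samples) if pitch_samples else 0.0
    yaw_range = max(yaw_samples) - min(yaw_samples) if yaw_samples else 0.0
    if pitch_range < threshold and yaw_range < threshold:
        return None  # no answer detected within the sampling window
    return "yes" if pitch_range >= yaw_range else "no"
```

The None case corresponds to the situation described later in which no head motion is detected for a predetermined duration after the virtual character's question.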

The image generation unit 21c4 generates image information for a depicting area in the virtual space. The depicting area is determined by an angle of view or the like around the gazing point or gazing direction determined by the detection unit 21c2. Based on the image information generated in this way, the image for the depicting area of the virtual space is displayed on the display 24. For the generation of the image information, for example, the image data 28a, the scenario data 28c, the parameter data 28d, the game progress data 28e, and other data stored in the storage 27 may be used. When a virtual character appears in the depicting area, the image generation unit 21c4 generates image information of the virtual space including the virtual character performing the action specified by the action determination unit 21c3, and the image generated in this manner is rendered on the display 24 as a stereoscopic image. In this way, an image of the virtual space including the image of the virtual character performing the action specified in response to the action of the player 5 is displayed on the display 24. The image generation unit 21c4 may also generate audio information corresponding to the action of the virtual character and the depicting area of the virtual space. The audio information is output to the speaker of the HMD 10.

As described above, a motion image in which a virtual character moves in the virtual space is displayed on the display 24 by the VR mode execution unit 21c, and voice and sound corresponding to the movement of the virtual character are output from the speaker. For example, when the virtual character speaks to the player 5, a motion image in which the head and mouth of the virtual character move is displayed on the display 24, and voice corresponding to the words which the virtual character speaks is output from the speaker. In one embodiment, the player 5 interacts with the same virtual character in the VR mode and the chat mode. For example, a virtual character appearing in the VR mode has the same name as a virtual character appearing in the chat mode, and they have a common appearance such that they can be recognized as the same character. It should be noted that the image of the virtual character is displayed as a stereoscopic image in the VR mode, whereas the image of the virtual character is displayed as a non-stereoscopic image in the chat mode, so that the images of the virtual character are different between the VR mode and the chat mode. However, such a difference in representation format does not affect the identity of the virtual character.

The parameter management unit 21d updates the parameter data 28d as the game progresses. When the game is provided in the first mode, the parameter management unit 21d updates the parameter data 28d according to the progress of the game in the first mode, and when the game is provided in the second mode, the parameter management unit 21d updates the parameter data 28d according to the progress of the game in the second mode.

In a case where the parameter data 28d includes a parameter indicating the virtual character's favorability to the user character, the parameter management unit 21d may increase or decrease the value of the parameter depending on an action of the player 5 in the first mode and the second mode. When the first mode is the chat mode, the amount (for example, increase amount or decrease amount) or rate (increase rate or decrease rate) of change of the favorability may be defined for each option of the message presented to the player 5 in the chat mode. In this case, the parameter management unit 21d specifies the change amount or change rate of the favorability depending on the message option selected by the player 5, and updates the parameter data 28d based on the specified change amount or change rate of the favorability. In the VR mode, the change amount or the change rate of the favorability may be determined for each type of action that the player 5 makes. The parameter management unit 21d specifies the change amount or change rate of the favorability depending on the action of the player 5 and updates the parameter data 28d based on the specified change amount or change rate of the favorability.
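The per-option change amount or change rate could be tabulated as in the following Python sketch; the option names and numeric values are invented for illustration and do not appear in the patent.

```python
# Each chat option maps to either an absolute change amount or a change rate
# applied to the favorability parameter (all entries are hypothetical).
OPTION_EFFECTS = {
    "kind_reply": {"amount": +5},
    "rude_reply": {"amount": -10},
    "flattery":   {"rate": 1.1},   # multiplicative: +10 %
}

def apply_option(favorability: float, option: str) -> float:
    """Return the favorability after applying the selected option's effect."""
    effect = OPTION_EFFECTS[option]
    if "amount" in effect:
        return favorability + effect["amount"]
    return favorability * effect["rate"]
```

The parameter management unit 21d would then persist the returned value back into the parameter data 28d.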

When the second mode is the VR mode, inappropriate actions may be defined among the actions that the player 5 takes in the VR mode. The inappropriate actions may include the player moving his/her head with an acceleration equal to or greater than a predetermined value, glancing at a prohibited area in the virtual space, and repeating the same action more than a predetermined number of times. In this specification, among the actions that the player 5 performs, actions other than the inappropriate actions may also be referred to as normal actions. When an inappropriate action is performed by the player 5, the parameter management unit 21d updates the parameter data 28d so as to decrease the favorability. In contrast, when a normal action is made by the player 5, the parameter management unit 21d updates the parameter data 28d so as to increase or decrease the favorability depending on the normal action.
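The three inappropriate-action criteria above can be combined into a single check, as in this sketch (Python; the threshold constants and function signature are assumptions):

```python
ACCEL_LIMIT = 3.0   # assumed threshold for head acceleration (units arbitrary)
MAX_REPEATS = 5     # assumed limit on repeating the same action

def is_inappropriate(action: str, recent_actions: list,
                     head_accel: float, in_prohibited_area: bool) -> bool:
    """True when the player's behavior matches one of the inappropriate
    actions: moving the head too fast, looking at a prohibited area, or
    repeating the same action too many times."""
    if head_accel >= ACCEL_LIMIT:
        return True
    if in_prohibited_area:
        return True
    if recent_actions.count(action) >= MAX_REPEATS:
        return True
    return False
```

The parameter management unit 21d would decrease the favorability whenever this check returns True, and otherwise apply the per-action change defined for normal actions.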

When the game is being processed in the VR mode, the VR mode may end abnormally without going through the ending scene prepared in the VR mode. For example, when the power supply to the HMD 10 is lost or when a function of the system program force-quits the game processing program, the VR mode may be terminated abnormally. When the VR mode is abnormally terminated, the parameter management unit 21d updates the parameter data 28d so as to decrease the favorability.

A value of a game parameter included in the parameter data 28d may be changed when the game is switched from the first mode to the second mode and/or when the game is switched from the second mode to the first mode, and the parameter data 28d may be updated so as to reflect the change. The change of the game parameter during the mode switch may be either advantageous or disadvantageous for the player 5 in progressing the game. When the parameter data 28d includes a plurality of game parameters, only a part of the plurality of game parameters may be changed when the game is switched from the first mode to the second mode and/or when the game is switched from the second mode to the first mode. When the parameter data 28d includes a plurality of game parameters, a change amount or a change rate may be set for each of the plurality of game parameters, and each of the game parameters may be changed based on the change amount or the change rate set therefor when the game is switched from the first mode to the second mode and/or when the game is switched from the second mode to the first mode.

The parameter data 28d may include a common parameter that can be used in both the first mode and the second mode. The player character parameter may be a common parameter. A value of a common parameter may be either changed or not changed when the game is switched from the first mode to the second mode and/or when the game is switched from the second mode to the first mode. When a value of a common parameter is changed, the parameter data 28d is updated so as to reflect the change. The change of the common parameter during the mode switch may be either advantageous or disadvantageous for the player 5 in progressing the game.

The parameter data 28d may include a first mode parameter that is used in the first mode but not in the second mode and a second mode parameter that is used in the second mode but not in the first mode. Both the first mode parameter and the second mode parameter are included in the game parameters. The first mode parameter and the second mode parameter may be associated with each other. For example, when the first mode parameter is changed in the first mode, the second mode parameter associated with the first mode parameter may be changed in accordance with the change of the first mode parameter. Conversely, when the second mode parameter is changed in the second mode, the first mode parameter associated with the second mode parameter may be changed in accordance with the change of the second mode parameter. When the first mode parameter and the second mode parameter are changed, the parameter data 28d may be updated so as to reflect the change. In one embodiment, one example of the first mode parameter is the virtual character's favorability to the user character used in the first mode (a first mode favorability), and one example of the second mode parameter is the virtual character's favorability to the user character used in the second mode (a second mode favorability). For example, when the first mode favorability rises in the first mode, the second mode favorability rises accordingly. In another embodiment, one example of the first mode parameter is the first mode favorability, and one example of the second mode parameter is a game parameter used in the second mode other than the favorability (for example, the attack power of the player character). In this way, the first mode parameter and the second mode parameter may be game parameters of the same type or game parameters of different types.
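One way to realize this association between a first mode parameter and a second mode parameter is sketched below (Python; the class name and the 0.5 propagation ratio are assumptions, since the patent only requires that a change in one parameter cause a change in its associated parameter):

```python
class LinkedParameters:
    """A first mode parameter (e.g. first mode favorability) linked to a
    second mode parameter, so that a change in one mode is reflected in
    the other."""

    def __init__(self, first_favorability=0.0, second_param=0.0, ratio=0.5):
        self.first_favorability = first_favorability
        self.second_param = second_param
        self.ratio = ratio  # assumed strength of the cross-mode link

    def change_in_first_mode(self, delta):
        self.first_favorability += delta
        self.second_param += delta * self.ratio  # propagate to second mode

    def change_in_second_mode(self, delta):
        self.second_param += delta
        self.first_favorability += delta * self.ratio  # propagate back
```

With a ratio of 1.0 and favorability on both sides, this models the first embodiment (both favorabilities rise together); with a different second mode parameter such as attack power, it models the cross-type linkage.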

When a termination condition is satisfied, the VR mode execution unit 21c performs a termination process to terminate the VR mode. The termination condition may include, for example, that a predetermined duration of time (for example, one minute) has elapsed from the start of the VR mode, that an operation for termination has been detected, that the last event included in the scenario being executed in the VR mode has ended, and any other conditions. The termination process performed when the termination condition is satisfied may include displaying, on the display 24, guidance for prompting the player to remove the information processing device 20 from the attachment 11, and displaying a login screen for logging in to the chat mode, which is the first mode.

Next, with reference to FIG. 6 and FIGS. 7a to 7d, a chat process in the chat mode will be described. FIG. 6 is a flowchart showing the flow of the process in the chat mode in one embodiment, and FIGS. 7a to 7d show examples of a display image in the chat mode. It is assumed that the HMD 10 is not mounted on the head of the player 5 and the information processing device 20 is detached from the attachment 11 at the start of the chat mode.

As described above, in the game processing system 1, the game is started in the chat mode. In step S11, a login screen for logging in to the chat mode is displayed on the display 24 of the information processing device 20. When the login process is performed by the player 5, the chat process proceeds to step S12.

In step S12, the player 5 and the virtual character exchange messages, thereby performing a chat with each other. More specifically, after login to the chat mode, a chat display image for displaying the chat performed between the player 5 and the virtual character is generated, and the chat display image is displayed on the display 24. FIG. 7a shows a chat display image 70, which is an example of the chat display image displayed on the display 24. The chat display image 70 has a chat display area 71 including an icon 73 corresponding to the player 5, an icon 74 corresponding to the virtual character, a message 75a from the player 5, and messages 76a and 76b from the virtual character. In addition, the chat display image 70 has a menu area 72 arranged at the top of the chat display area 71. The virtual character's messages 76a and 76b are specified based on the chat data 28b and other data stored in the storage 27 in accordance with the game processing program. For example, messages of the virtual character are displayed sequentially, starting from the message in the root node, with reference to the chat data 28b defined in the form of a tree structure. At a branch point of nodes, a node is selected depending on a branch condition, and a message corresponding to the selected node is displayed.
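The tree-structured chat data and its branch selection could look like the following sketch (Python; the node layout, field names, and messages are invented for illustration, not taken from the chat data 28b):

```python
# Minimal tree-structured chat data: each node carries the virtual
# character's message and maps each player option to the next node.
CHAT_TREE = {
    "root": {"message": "Hi! Did you sleep well?",
             "options": {"Yes": "good", "No": "bad"}},
    "good": {"message": "Great, let's play!", "options": {}},
    "bad":  {"message": "Oh no, take care.",  "options": {}},
}

def next_node(tree: dict, node_id: str, selected_option: str) -> str:
    """Follow the branch corresponding to the option selected by the player."""
    return tree[node_id]["options"][selected_option]
```

Traversal starts at the root node, displays its message, and descends one branch per selected option until a node with no options (a leaf) is reached.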

In step S12, a parameter set for the player 5 may be increased or decreased depending on the option selected by the player 5. As described above, the increase or decrease of the parameter is determined based on the change amount (for example, increase amount or decrease amount) or the change rate (increase rate or decrease rate) of the parameter determined for each option of the message. When the change amount or the change rate of the parameter is specified, the parameter data 28d is updated based on the specified change amount or change rate of the parameter.

The chat display image 70 in FIG. 7a includes a display showing options 77 of the message for the player 5 to select at the lower part of the chat display area 71. The player 5 is able to select one from among the displayed options 77. The selection is performed, for example, by touching with a finger the area of the display 24 where the desired option is displayed. Once the selection is made, a message 75b corresponding to the selected option is newly displayed in the chat display area 71 as a message from the player 5, as shown in FIG. 7b.

In the course of the chat process, messages 76c and 76d for prompting the player to select settings of the VR mode, which is the second mode, may be displayed as messages from the virtual character, as shown in FIG. 7c. The message 76c is a message for prompting selection of a scenario in the VR mode, and the message 76d is a message for prompting selection of an item used by the virtual character in the VR mode (for example, clothes worn by the virtual character in the virtual space in the VR mode). In other words, the settings of the second mode include the scenario in the VR mode and the item used by the virtual character in the VR mode. After the message 76c prompting the selection of the scenario is displayed, choices of the scenario are displayed on the display 24. For example, first to fifth scenarios are displayed as the choices. The player 5 is able to select one scenario that he/she likes from among these choices. After the message 76d prompting the player 5 to select an item is displayed, choices of the item are displayed on the display 24. The player 5 can select one item that he/she likes from among the choices. In the course of the chat, a message for prompting the player to switch to the second mode may be displayed as a message from the virtual character.

In step S13, it is determined whether the mode switch condition for switching from the first mode to the second mode is satisfied. An example of the mode switch condition is that a predetermined duration of time (for example, one minute) has elapsed from the start of the chat mode. The elapsed time from the start of the chat mode is measured using, for example, a system clock. The process returns to step S12 and the chat is continued until it is determined that the mode switch condition is satisfied. When it is determined that the mode switch condition is satisfied, the chat process proceeds to step S14.

In step S14, the switch start object 78 is displayed on the display 24, and the chat process proceeds to step S15. As shown in FIG. 7d, the switch start object 78 is displayed as, for example, an object having a circular profile in the chat display area 71. When it is confirmed in step S15 that the switch start object 78 has been selected, the chat process proceeds to step S16. In contrast, when the selection of the switch start object 78 is not confirmed even after a predetermined duration of time (for example, 10 seconds) has elapsed since the switch start object 78 was displayed, it is determined that the switch start object has not been selected, and the chat process is terminated. Alternatively, when it is determined that the switch start object has not been selected, the process may return to step S12 and a process for resuming the chat may be performed. Whether the switch start object 78 has been selected may be determined based on an operation performed on the touch-screen panel 24b. For example, when an operation (for example, a tap operation) touching the touch-screen panel 24b at a position overlapping the display area of the switch start object 78 is detected via a detection signal of the touch-screen panel 24b, it is determined that the switch start object 78 has been selected. Alternatively, the selection of the switch start object 78 may be made by a non-contact operation.

In step S16, a switch process to switch to the VR mode, which is the second mode, is started. When the switch process is started, the chat process is terminated.

The above chat process is executed by, for example, the chat mode execution unit 21a. The chat mode execution unit 21a is capable of executing the above-described chat process alone or in cooperation with other functions as appropriate.

The chat process may be performed using, as required, data stored in a storage other than the storage 27, detection information obtained by various sensors, and any other data, in addition to the data stored in the storage 27.

The mode switch process from the chat mode as the first mode to the VR mode as the second mode will now be described with reference to FIG. 8 and FIGS. 9a and 9b. FIG. 8 is a flowchart of the mode switch process according to an embodiment.

When the switch process is started, guidance prompting the player to wear the HMD 10 is displayed on the display 24 of the information processing device 20 in step S31. A guidance display image 80, which is an example of the guidance display image including the guidance, is shown in FIG. 9a. The guidance display image 80 may include guidance prompting the player to attach the information processing device 20 to the attachment 11, guidance prompting the player to wear the HMD 10 on the head, and any other guidance necessary for starting the VR mode. The guidance display image 80 shown in FIG. 9a includes a guidance 81 prompting the player to attach the information processing device 20 to the attachment 11. This guidance may include instructions for attaching the information processing device 20 to the attachment 11. When a predetermined time has elapsed after the guidance is displayed, the mode switch process proceeds to step S32.

In step S32, the switch object is displayed on the display 24. An example of the switch process image including the switch object is shown in FIG. 9b. The switch process image 85 shown in FIG. 9b includes a switch object 86. The switch process image 85 may be displayed as a stereoscopic image. In one embodiment, the images displayed on the display 24 are switched from non-stereoscopic images to stereoscopic images at the start of the mode switch process or at a predetermined timing after the start of the mode switch process. For example, the images displayed in the chat mode are non-stereoscopic, and the images displayed after the switch process has started are stereoscopic.

Next, in step S33, it is determined whether the selection of the switch object 86 is completed. The switch object 86 is selected, for example, by the player 5 gazing at the switch object 86 for a predetermined duration of time. Therefore, in order to select the switch object 86 in this state, the player 5 is required to wear the HMD 10. In one embodiment, whether the selection of the switch object 86 is completed is determined based on whether the gazing point calculated based on the detection signal from the HMD 10 is situated on the switch object 86 for a predetermined amount of time or more. The gazing point may be calculated based on the information obtained by the sensor unit 25 provided in the HMD 10, as described above. The determination can be made by measuring, using the system clock, the gazing time for which the gazing point stays on the switch object 86 and determining whether the measured gazing time has reached the predetermined amount of time. When the gazing time reaches the predetermined amount of time, it is determined that the selection of the switch object 86 is completed, and the mode switch process proceeds to step S34. When completion of the selection of the switch object 86 is not detected even after a predetermined amount of time has elapsed since the switch object 86 was displayed, it is determined that the selection of the switch object 86 has not been completed, and the mode switch process proceeds to step S35.

In step S34, the VR mode, which is the second mode, is initiated. Processes performed in the VR mode will be described later in detail.

When the switch to the VR mode has not been selected, a process for returning to the chat mode is started in step S35. When the returning process to the chat mode is completed, the chat mode is resumed. In the resumed chat mode, a message from the virtual character based on the fact that the switch to the VR mode was not performed may be displayed.

The mode switch process described above is executed by, for example, the switch processing unit 21b. The switch processing unit 21b is capable of executing the mode switch process alone or in cooperation with other functions as needed.

Next, a VR process in the VR mode will be described with reference to FIGS. 10 to 12. FIG. 10 is a flowchart showing the flow of the VR process in the VR mode in one embodiment, FIG. 11 shows an example of a display image in the VR mode, and FIG. 12 schematically illustrates the flow of actions of the virtual character in a scene being executed in the VR mode. As described above, in order to select the switch object 86 in the mode switch process, the player 5 wears the HMD 10. It is assumed that the HMD 10 is mounted on the head of the player 5 at the start of the VR mode. In the following, the VR process will be described on the premise that the first scenario is selected. In the VR process, the parameter data 28d is referred to when necessary. In one embodiment, the value of each game parameter included in the parameter data 28d as of the start of the VR process is equal to the value of the corresponding game parameter included in the parameter data 28d as of the end of the chat process. That is, in one embodiment, the parameter data 28d updated in the chat process is used without modification in the VR process.

When the VR mode is started, the opening scene is executed in step S41.

More specifically, image information of the virtual space corresponding to the opening scene is generated, and an image according to the image information is output to the display 24. In the opening scene, a predetermined action of the virtual character may be performed in the virtual space.

FIG. 11 shows an example of a VR image of the opening scene displayed on the display 24. The VR image 90 shown in FIG. 11 includes a virtual character image 91 representing the virtual character, images 92a to 92c representing objects constituting the background, and other images of various objects. The VR image 90 is an image corresponding to a predetermined range of the virtual space for the first scenario. The range of the virtual space corresponding to the VR image 90 may be specified as a range determined by an angle of view around the gazing point in the virtual space calculated based on the orientation of the HMD 10. The image data 28a is used to generate the image information. The VR image 90 and other images displayed in the VR mode are displayed as stereoscopic images.

For example, the virtual character represented by the virtual character image 91 is able to take an action specified by the opening scene data A1 in the virtual space. The actions that the virtual character performs may include talking to the player 5 (toward the position of the player 5 in the virtual space), traveling in the virtual space, picking up an object placed in the virtual space, and various other actions in the virtual space.

A series of actions performed by the virtual character in the opening scene will be described with reference to FIG. 12. FIG. 12 schematically illustrates the flow of the actions taken by the virtual character in a scene executed in the VR mode, here the opening scene. As shown in FIG. 12, the virtual character first performs an action corresponding to an action A11 in the opening scene. The action A11 corresponds to an action that the virtual character can perform in the opening scene. For example, the action A11 is an action of the virtual character talking to a character in the virtual space.

The action A11 is defined as an action that may branch. More specifically, the action A11 branches to an action A21, an action A22, and an action A23. For example, for the action A11, a first switch condition for switching to the action A21, a second switch condition for switching to the action A22, and a third switch condition for switching to the action A23 are set. When the first switch condition is satisfied, the action of the virtual character is switched to the action A21. When the second switch condition is satisfied, the action of the virtual character is switched to the action A22. When the third switch condition is satisfied, the action of the virtual character is switched to the action A23. At least one of the first, second, and third switch conditions may be that the virtual character's favorability to the user character is equal to or greater than a predetermined threshold value.

Various conditions may be set as the first switch condition, the second switch condition, and the third switch condition. For example, the first switch condition, the second switch condition, and the third switch condition may include that the action prescribed as the action A11 has ended, that the player 5 performs a predetermined action (for example, gazing at a predetermined part of the virtual character) toward the virtual character, and that the player 5 answers a question from the virtual character. The first switch condition, the second switch condition, and the third switch condition are different from each other. The player 5 can answer the question from the virtual character by nodding, shaking his/her head, or other head motions. More specifically, when a question is asked by the virtual character and the player makes an action by moving his/her head, the movement of the head is detected by the sensor unit 25 of the HMD 10 and the answer of the player 5 is identified based on the detected movement of the head. For example, when a nodding motion is detected, it may be determined that the player 5 has given a positive answer, whereas when a motion of shaking his/her head is detected, it may be determined that the player 5 has given a negative answer. Further, when a motion of the head of the player 5 is not detected for a predetermined duration of time after the question is asked by the virtual character, it may be determined that the player 5 has not answered the question from the virtual character. The action of the player 5 may be determined by a function of the action determination unit 21c3.

The action A21 further branches to an action A31 and an action A32. When a predetermined switch condition is satisfied in the action A21, switching from the action A21 to the action A31 or the action A32 is performed depending on the satisfied switch condition. Only the action A33 is associated with the action A22 in the lower layer; that is, the action A22 does not branch. In this case, when the action A22 ends, the action A33 is started. No further action is defined below the action A23. The structure of the actions shown in FIG. 12 is an example, and the types and number of actions included in each scene and the structure of the actions may be set appropriately according to the story to be realized in the scene.
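The branching structure of FIG. 12 can be modeled as a graph whose edges carry switch conditions, as in this Python sketch. The concrete conditions attached to each edge are invented for illustration; the patent only requires that the first, second, and third switch conditions differ from one another.

```python
# Branching action structure of the opening scene. Each entry maps an
# action to its candidate next actions, paired with a switch condition
# evaluated against the current game state (conditions are hypothetical).
ACTION_GRAPH = {
    "A11": [("A21", lambda s: s["favorability"] >= 70),
            ("A22", lambda s: s["answered"] == "yes"),
            ("A23", lambda s: True)],            # fallback branch
    "A21": [("A31", lambda s: s["favorability"] >= 90),
            ("A32", lambda s: True)],
    "A22": [("A33", lambda s: True)],            # A22 does not branch
}

def next_action(current: str, state: dict):
    """Return the first action whose switch condition holds, or None for
    terminal actions (A23, A31, A32, A33) that end the scene."""
    for target, condition in ACTION_GRAPH.get(current, []):
        if condition(state):
            return target
    return None
```

Evaluating the conditions in order gives each branch a priority, which is one simple way to guarantee that exactly one of the mutually different switch conditions takes effect.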

The switch from one action to another action may be performed seamlessly so that the player5does not perceive the action switch.

While the actions A11, A21to A23, and A31to A33are executed, the player5can take actions responsive to the virtual character, for example by gazing at a predetermined part of the virtual character. In response to an action of the player5taken toward the virtual character, the virtual character can make a reaction determined based on the start scene data A1. The virtual character may be controlled to make a reaction in accordance with the favorability. For example, for the same action of the player5, a friendly reaction may be made when the favorability is high, and an unfriendly reaction may be made when the favorability is low.

When the action A31, the action A32, the action A33in the terminal layer, or the action A23ends, the opening scene is ended and the VR process proceeds to step S42.

In step S42, the basic scene to be executed is selected from among a plurality of possible basic scenes. This selection is performed from among a plurality of possible basic scenes prepared for the first scenario, for example, according to the parameter data28d. In the first scenario, three basic scenes corresponding to the basic scene data B1to B3illustrated inFIG.4are provided. In one embodiment, the basic scene to be executed is selected from among these three basic scenes based on the virtual character's favorability to the player5, which is stored as the parameter data28d. For example, when the favorability is equal to or higher than a first threshold value, the basic scene corresponding to the basic scene data B1is selected. When the favorability is lower than a second threshold value, the basic scene corresponding to the basic scene data B3is selected. When the favorability is equal to or higher than the second threshold value and lower than the first threshold value, the basic scene corresponding to the basic scene data B2is selected. Selection of the basic scene may be performed by any other method. For example, the basic scene may be randomly selected. The basic scene may also be randomly selected after weighting the possible basic scenes so that the probability that a given basic scene (for example, the first basic scene corresponding to the basic scene data B1) is selected becomes higher as the favorability becomes higher. The method of selecting the basic scene is not limited to the methods explicitly described in this specification. Once the basic scene is selected, the VR process proceeds to step S43.
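The threshold-based selection and the weighted random alternative described above can be sketched as follows, assuming an illustrative favorability scale of 0 to 100; the threshold values and function names are assumptions, not taken from the patent.

```python
import random

# Illustrative thresholds on an assumed 0-100 favorability scale.
FIRST_THRESHOLD = 70
SECOND_THRESHOLD = 30

def select_basic_scene(favorability):
    """Select one of the three basic scenes (B1 to B3) from the
    virtual character's favorability to the player, stored as the
    parameter data."""
    if favorability >= FIRST_THRESHOLD:
        return "B1"
    if favorability < SECOND_THRESHOLD:
        return "B3"
    return "B2"

def select_basic_scene_weighted(favorability, rng=random.random):
    """Alternative: random selection weighted so that B1 becomes more
    likely as the favorability rises (favorability assumed in 0-100)."""
    p_b1 = favorability / 100.0
    return "B1" if rng() < p_b1 else random.choice(["B2", "B3"])
```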

In step S43, the basic scene selected in step S42is executed. More specifically, image information of the virtual space corresponding to the basic scene is generated, and an image corresponding to the image information is output to the display24. In the basic scene, the virtual character may take a prescribed action in the virtual space. As with the opening scene, the action which the virtual character takes in the basic scene may be defined as a series of branchable actions. A series of branchable actions in the basic scene may have a data structure similar to the data structure shown inFIG.12. When the action situated in the bottom layer in the series of actions is completed, the basic scene ends and the VR process proceeds to step S44.

In step S44, it is determined whether the condition for switching to the additional scene set for the basic scene executed in step S43is satisfied. In this determination, the condition for switching to the additional scene included in the basic scene data (for example, the basic scene data B1) corresponding to the basic scene executed in step S43is retrieved from the storage27, and it is determined whether the condition was satisfied in the basic scene executed in step S43. For example, when the switch condition is that the player gazes at an object representing a door in the virtual space for three seconds or longer, the duration of time in which the gazing point of the player5in the virtual space is located on the object representing the door is counted, and it is determined that the switch condition is satisfied when the counted time is three seconds or longer. The gazing point of the player5in the virtual space is calculated on the basis of the orientation of the HMD10, which is in turn calculated based on the detection information from the sensor unit25. The condition for switching to the additional scene may or may not be explicitly presented to the player5at the time of execution of the basic scene. For example, in the basic scene, the virtual character may speak to the player5a line corresponding to the condition for switching to the additional scene.
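The dwell-time check described above (the gazing point resting on the door object for three seconds or longer) can be sketched frame by frame. The frame-based sampling, names, and parameters are all assumptions for illustration.

```python
# Hypothetical sketch of the gaze dwell-time switch condition: count how
# long the player's gazing point stays on the target object, resetting
# whenever the gaze moves away.

def gaze_switch_satisfied(gaze_targets, frame_dt, required_seconds=3.0,
                          target="door"):
    """gaze_targets: per-frame object under the player's gazing point,
    as derived from the HMD orientation. frame_dt: seconds per frame.
    Returns True once the gaze has rested on `target` for
    `required_seconds` of consecutive frames."""
    dwell = 0.0
    for obj in gaze_targets:
        if obj == target:
            dwell += frame_dt
            if dwell >= required_seconds:
                return True
        else:
            dwell = 0.0  # looking away resets the count
    return False
```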

When it is determined that the switch condition is not satisfied, the VR process proceeds to step S45, whereas when it is determined that the switch condition is satisfied, the VR process proceeds to step S46.

In step S45, a first ending scene is executed. More specifically, image information of the virtual space corresponding to the first ending scene is generated, and an image corresponding to the image information is output to the display24.

In step S46, an additional scene is executed. More specifically, image information of the virtual space corresponding to the additional scene is generated, and an image corresponding to the image information is output to the display24. Similarly to the opening scene, the action that the virtual character takes in the additional scene may be defined as a series of branchable actions. A series of branchable actions in the additional scene may have a data structure similar to the data structure shown inFIG.12. When the action situated in the bottom layer in the additional scene is completed, the additional scene ends and the VR process proceeds to step S47.

In step S47, a second ending scene is executed. More specifically, image information of the virtual space corresponding to the second ending scene is generated, and an image corresponding to the image information is output to the display24. The image displayed in the second ending scene may be the same as or different from the image displayed in the first ending scene.

When movement of the head of the player5is detected by the sensor unit25of the HMD10while the VR image90is displayed on the display24, an action of the player5is specified based on the detected movement of the head. Then, an action (reaction) of the virtual character responsive to the specified action of the player5is determined. Subsequently, image information of the virtual character image91performing the determined action is generated. The image information generated in this way is output to the display24. For example, when a nodding motion of the player5is detected, image information of the virtual character image91reacting to the nodding motion of the player5is generated, and the image information is displayed on the display24. In this manner, in the VR mode, interaction between the player5and the virtual character is realized using the stereoscopic image of the virtual character.
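The pipeline described above (detected head movement, specified player action, determined character reaction, generated image information) can be sketched as a single step. The motion labels and the reaction table are purely illustrative assumptions.

```python
# Hypothetical sketch of one frame of the VR-mode interaction pipeline.

def render_reaction_frame(detected_motion):
    """Map a detected head motion to a player action, determine the
    virtual character's reaction, and return what the rendering step
    would need. Unknown motions fall back to an idle reaction."""
    action = {"pitch_oscillation": "nod",
              "yaw_oscillation": "head_shake"}.get(detected_motion, "none")
    reaction = {"nod": "smile", "head_shake": "tilt_head"}.get(action, "idle")
    # In the real system this step would generate image information of
    # the virtual character image performing `reaction` and output it
    # to the display; here we just return the decision.
    return {"player_action": action, "character_reaction": reaction}
```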

In the above-described steps S41, S43, and S45to S47, the parameter(s) set for the player5may be increased or decreased according to the action(s) of the player5. As described above, the change amount (for example, increase amount or decrease amount) or the change rate (increase rate or decrease rate) of the parameter may be determined for each type of action of the player5. When the change amount or the change rate of the parameter is specified, the parameter data28dis updated based on the specified change amount or change rate.
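The per-action parameter update can be sketched with a hypothetical change-amount table; the table values and the 0-100 clamp are assumptions for illustration, not from the patent.

```python
# Hypothetical change amounts defined per action type (assumed values).
CHANGE_BY_ACTION = {
    "nod": +2,               # positive answers raise the favorability
    "head_shake": -1,        # negative answers lower it
    "gaze_at_character": +1,
}

def update_favorability(current, actions, lo=0, hi=100):
    """Apply the change amount defined for each action type and clamp
    the result into an assumed 0-100 parameter range, as the parameter
    data would be updated at the end of each scene."""
    for a in actions:
        current += CHANGE_BY_ACTION.get(a, 0)
    return max(lo, min(hi, current))
```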

In step S48, the termination process of the VR mode is performed. The termination process may include displaying, on the display24, guidance prompting the player to remove the information processing device20from the attachment11, and displaying a login screen to log in to the chat mode, which is the first mode.

The above VR process is executed by the VR mode execution unit21c. The VR mode execution unit21cis capable of executing the VR process alone or in cooperation with other functions as needed.

The game processing system according to another embodiment will be described with reference toFIG.13.FIG.13is a block diagram illustrating a game processing system101according to another embodiment. The game processing system101includes the information processing device20and the HMD110. The game processing system101is different from the game processing system1in that the VR mode can be provided without attaching the information processing device20to the HMD110. Hereinafter, the game processing system101will be described focusing on the differences from the game processing system1.

The HMD110, the information processing device20, and the server50are communicably interconnected over the network40. The HMD110and the information processing device20may be connected so as to communicate with each other according to a short-range wireless system such as Bluetooth (registered trademark) without using the network40. The HMD110is different from the HMD10of the game processing system1in that the VR mode can be provided even if the information processing device20is not mounted.

The HMD110includes a computer processor121, a memory unit122, a communication I/F123, a display124, a sensor unit125, a sound collector126, and a storage127. The computer processor121, the memory unit122, the communication I/F123, the display124, the sensor unit125, the sound collector126, and the storage127are configured similarly to the computer processor21, the memory22, the communication I/F23, the display24, the sensor unit25, the sound collector26, and the storage27of the information processing device20, respectively. However, the display124may not have a touch-screen panel.

The functions of the chat mode execution unit21a, the switch processing unit21b, the VR mode execution unit21c, and the parameter management unit21dare distributed between the information processing device20and the HMD110. Specifically, the function of the chat mode execution unit21ais realized in the information processing device20, and the function of the VR mode execution unit21cis realized in the HMD110. A part of the function of the switch processing unit21bis realized by the information processing device20, and the rest is realized in the HMD110. The function of the parameter management unit21dis realized by both the information processing device20and the HMD110.

The image data28a, the chat data28b, the scenario data28c, the parameter data28d, and the game progress data28eare stored in one or both of the storage27and the storage127. Alternatively, these data may be stored in a storage other than the storage27and the storage127.

When starting the game in the game processing system101, the player5uses the information processing device20to start the chat mode. The process for executing the chat mode is performed by the chat mode execution unit21aof the information processing device20.

When the mode switch condition is satisfied in the chat mode and the switch start object is selected, the mode switch process for switching to the VR mode is started. The mode switch process is performed by the switch processing unit21b. In the mode switch process, the guidance corresponding to the above-described step S31is displayed in the information processing device20. For example, guidance prompting the player to put on the HMD110is displayed on the display24of the information processing device20. Displaying the switch object86corresponding to step S32, and the subsequent processes, are executed in the HMD110.

When the VR mode is started, the processing of the VR mode is performed by the VR mode execution unit21cof the HMD110.

According to the above embodiments, the following advantageous effects can be obtained. With the game processing system1,101, it is possible to play a game in the first mode (for example, the chat mode) without using the HMD10. The game parameters such as the favorability updated in the first mode are used also in the second mode (for example, the VR mode). This makes it possible to play the game in the first mode when the HMD10is not usable and to play the game in the second mode when the HMD10is usable.

With the game processing system1,101, the game parameters stored in the storage27are updated in accordance with progress of the game, and different scenes are executed in accordance with the updated game parameters.

Embodiments of the present invention are not limited to the above embodiments, and various modifications are possible within the spirit of the invention. For example, some or all of the functions executed by the computer processor21and the computer processor121may be realized by a computer processor not shown in the above-mentioned embodiments without departing from the scope of the invention. For example, the game processing system1and the game processing system101may include a game machine that executes at least a part of the game processing program. Some of the functions realized by processing of the computer processor21or the computer processor121may instead be realized by processing performed by the game machine.

Embodiments of the disclosure may include various devices and electronic components other than those described above. For example, in addition to the information processing device20and the HMD10,110, the game processing system1and the game processing system101may be provided with a control device for receiving operations of the player5. The game processing system1and the game processing system101may detect operations of the player5via the control device and perform the processing of the game in accordance with the detected operations.

In the procedures described herein, particularly those described with a flowchart, some of the constituent steps may be omitted, steps not explicitly included may be added, and/or the order of the steps may be changed. A procedure subjected to such omission, addition, or reordering is also included in the scope of the present invention unless it diverges from the purport of the present invention.

Claims

  1. A game processing system for processing a game in a virtual space, the game having a first mode and a second mode, the system comprising: one or more processors;a storage accessed by the one or more processors and storing at least one first mode parameter used in the first mode and at least one second mode parameter used in the second mode;and an information processing device including at least a part of the one or more processors, wherein the one or more processors execute computer-readable instructions to: perform, in the first mode, processing related to the game in accordance with first detection information related to an action of a player obtained by the information processing device not mounted on a head of the player;perform, in the second mode, processing related to the game in accordance with second detection information related to an action of the player obtained by a head mounted display mounted on the head of the player;update the at least one first mode parameter in accordance with progress of the game in the first mode;and update the at least one second mode parameter when the at least one first mode parameter is updated, and wherein the at least one second mode parameter is further updated in accordance with progress of the game in the second mode.
  2. The game processing system of claim 1, wherein the virtual space contains a virtual character that is a non-player character.
  3. The game processing system of claim 2, wherein a same common item is used in the first mode and the second mode, and wherein in the second mode, the common item is used by the virtual character.
  4. The game processing system of claim 2, wherein the game is adapted to provide interaction with the virtual character.
  5. The game processing system of claim 2, wherein the at least one first mode parameter and/or the at least one second mode parameter is associated with interaction with the virtual character in the game.
  6. The game processing system of claim 1, wherein the at least one first mode parameter is a same type of parameter as the at least one second mode parameter.
  7. The game processing system of claim 1, wherein the at least one first mode parameter is a different type of parameter than the at least one second mode parameter.
  8. The game processing system of claim 1, wherein the one or more processors update the at least one first mode parameter when the at least one second mode parameter is updated.
  9. The game processing system of claim 1, wherein the at least one second mode parameter is updated to rise when the at least one first mode parameter has risen in the first mode.
  10. The game processing system of claim 1, wherein the at least one first mode parameter is updated to rise when the at least one second mode parameter has risen in the second mode.
  11. The game processing system of claim 1, wherein at least one of the at least one first mode parameter and the at least one second mode parameter is changed in at least one of a case where the game is switched from the first mode to the second mode and a case where the game is switched from the second mode to the first mode.
  12. The game processing system of claim 1, wherein a same common item is used in the first mode and the second mode.
  13. The game processing system of claim 12, wherein the common item has an item parameter set thereto, and wherein in a case where the game is switched from the first mode to the second mode, the common item having the item parameter changed in the first mode is used in the second mode.
  14. The game processing system of claim 12, wherein the common item has an item parameter set thereto, and wherein the item parameter is changed in at least one of a case where the game is switched from the first mode to the second mode and a case where the game is switched from the second mode to the first mode.
  15. The game processing system of claim 1, wherein the game progresses in accordance with the at least one first mode parameter and/or the at least one second mode parameter.
  16. The game processing system of claim 1, wherein a mode switching process is performed for switching either (i) from the first mode to the second mode responsive to determining that a first switch condition is satisfied in the first mode or (ii) from the second mode to the first mode responsive to determining that a second switch condition is satisfied in the second mode.
  17. The game processing system of claim 16, wherein a switch process image is displayed on the information processing device upon the mode switching process being initiated.
  18. The game processing system of claim 1, wherein the at least one first mode parameter includes a plurality of first mode parameters, and wherein the at least one second mode parameter is updated when at least one of the plurality of first mode parameters is updated.
  19. A method of processing a game in a virtual space, the game having a first mode and a second mode, the game being processed by one or more computer processors executing computer-readable instructions, the method comprising: performing, in the first mode, processing related to the game in accordance with first detection information related to an action of a player obtained by an information processing device not mounted on a head of the player;performing, in the second mode, processing related to the game in accordance with second detection information related to an action of the player obtained by a head mounted display mounted on the head of the player;updating at least one first mode parameter used in the first mode in accordance with progress of the game in the first mode;updating at least one second mode parameter used in the second mode when the at least one first mode parameter is updated;and further updating the at least one second mode parameter in accordance with progress of the game in the second mode.
  20. A non-transitory computer-readable storage medium storing a game program for processing a game in a virtual space, the game having a first mode and a second mode, the game program causing one or more computer processors to: perform, in the first mode, processing related to the game in accordance with first detection information related to an action of a player obtained by an information processing device not mounted on a head of the player;perform, in the second mode, processing related to the game in accordance with second detection information related to an action of the player obtained by a head mounted display mounted on the head of the player;update at least one first mode parameter used in the first mode in accordance with progress of the game in the first mode;update at least one second mode parameter used in the second mode when the at least one first mode parameter is updated;and further update the at least one second mode parameter in accordance with progress of the game in the second mode.
