U.S. Pat. No. 12,263,405

DISPLAY VIDEOGAME CHARACTER AND OBJECT MODIFICATIONS

Assignee: COLOPL, INC.

Issue Date: February 28, 2022

Illustrative Figure

Abstract

First display data is received to enable display of a video of a virtual space including a first character, which moves in cooperation with a behavior of a performer. A video is displayed in which the first character behaves in the virtual space on the basis of the first display data. The first character is associated with a prescribed first object, and a display mode of the first object is changed in accordance with a predetermined rule.

Description

DESCRIPTION OF EMBODIMENTS

A system according to the present disclosure is a system for providing a game to a plurality of users. The system will be described below with reference to the drawings. The present invention is not limited to these illustrations but is indicated by the scope of the claims, and it is intended that the present invention includes all modifications within the meaning and scope equivalent to the scope of the claims. In the following description, the same components are denoted by the same reference numerals in the description of the drawings, and will not be repeatedly described.

FIG. 1 is a diagram showing an overview of a system 1 according to the present embodiment. The system 1 includes a plurality of user terminals 100 (computers), a server 200, a game play terminal 300 (an external device, a second external device), and a transmission terminal 400 (an external device, a first external device). In FIG. 1, user terminals 100A to 100C, that is, three user terminals 100, are shown as an example of the plurality of user terminals 100, but the number of user terminals 100 is not limited to the shown example. In the present embodiment, the user terminals 100A to 100C are referred to as "user terminals 100" when they need not be distinguished from one another. The user terminal 100, the game play terminal 300, and the transmission terminal 400 are connected to the server 200 via a network 2. The network 2 is configured by the Internet and various mobile communication systems constructed by wireless base stations. Examples of the mobile communication system include so-called 3G and 4G mobile communication systems, LTE (Long Term Evolution), and a wireless network (for example, Wi-Fi (registered trademark)) that can be connected to the Internet through a predetermined access point.

(Overview of Game)

In the present embodiment, as an example of a game provided by the system 1 (hereinafter referred to as the "main game"), a game mainly played by the user of the game play terminal 300 will be described. Hereinafter, the user of the game play terminal 300 is called a "player". As an example, the player (performer) operates one or more characters appearing in the main game to carry on the game. In the main game, the user of the user terminal 100 plays a role of supporting the progress of the game by the player. Details of the main game will be described below. The game provided by the system 1 may be any game in which a plurality of users participate, and no limitation to this example is intended.

(Game Play Terminal 300)

The game play terminal 300 controls the progress of the game in response to operations input by the player. Further, the game play terminal 300 sequentially transmits information generated by the player's game play (hereinafter, game progress information) to the server 200 in real time.

(Server 200)

The server 200 sends the game progress information (second data) received in real time from the game play terminal 300 to the user terminal 100. In addition, the server 200 mediates the sending and reception of various types of information between the user terminal 100, the game play terminal 300, and the transmission terminal 400.

(Transmission Terminal 400)

The transmission terminal 400 generates behavior instruction data (first data) in response to operations input by the user of the transmission terminal 400, and transmits the behavior instruction data to the user terminal 100 via the server 200. The behavior instruction data is data for reproducing a moving image on the user terminal 100, and specifically, is data for producing the behaviors of the characters appearing in the moving image.

In the present embodiment, as an example, the user of the transmission terminal 400 is a player of the main game. Further, as an example, the moving image reproduced on the user terminal 100 based on the behavior instruction data is a moving image in which the characters operated by the player in the game behave. A "behavior" is to move at least a part of a character's body, and also includes speech. Therefore, the behavior instruction data according to the present embodiment includes, for example, sound data for controlling the character to speak and motion data for moving the character's body.

As an example, the behavior instruction data is sent to the user terminal 100 after the main game is over. Details of the behavior instruction data and the moving image reproduced based on the behavior instruction data will be described below.
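By way of illustration only, the behavior instruction data described above can be pictured as a simple container pairing sound data with motion data. The following is a minimal sketch; the patent does not prescribe a format, and all field names here are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class BehaviorInstructionData:
        """Hypothetical shape of the first data: sound plus motion for one character."""
        character_id: str        # which character the data animates (assumed field)
        sound_data: bytes        # recorded speech for controlling the character to speak
        motion_data: list[dict]  # time-stamped body poses for moving the character's body

    # In this sketch, the transmission terminal 400 would send such a record to the
    # user terminal 100 via the server 200 after the main game is over.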

(User Terminal 100)

The user terminal 100 receives the game progress information in real time, and generates and displays a game screen using the information. In other words, the user terminal 100 reproduces the game screen of the game being played by the player in real-time rendering. Thereby, the user of the user terminal 100 can visually recognize the same game screen as the one that the player visually recognizes while playing the game, at substantially the same timing as the player.

In addition, the user terminal 100 generates information for supporting the progress of the game by the player in response to the operation input by the user, and sends the information to the game play terminal 300 via the server 200. Details of the information will be described below.

Further, the user terminal 100 receives the behavior instruction data from the transmission terminal 400, and generates and reproduces a moving image (video) using the behavior instruction data. In other words, the user terminal 100 reproduces the behavior instruction data by rendering.

FIG. 2 is a diagram showing a hardware configuration of the user terminal 100. FIG. 3 is a view showing a hardware configuration of the server 200. FIG. 4 is a diagram showing a hardware configuration of the game play terminal 300. FIG. 5 is a diagram showing a hardware configuration of the transmission terminal 400.

(User Terminal 100)

In the present embodiment, the user terminal 100 is described, as an example, as being implemented as a smartphone, but the user terminal 100 is not limited to a smartphone. For example, the user terminal 100 may be implemented as a feature phone, a tablet computer, a laptop computer (a so-called notebook computer), or a desktop computer. Further, the user terminal 100 may be a game device suitable for game play.

As shown in FIG. 2, the user terminal 100 includes a processor 10, a memory 11, a storage 12, a communication interface (IF) 13, an input/output IF 14, a touch screen 15 (display unit), a camera 17, and a ranging sensor 18. These components of the user terminal 100 are electrically connected to one another via a communication bus. The user terminal 100 may include an input/output IF 14 that can be connected to a display (display unit) configured separately from a main body of the user terminal 100, instead of or in addition to the touch screen 15.

Further, as shown in FIG. 2, the user terminal 100 may be configured to communicate with one or more controllers 1020. The controller 1020 establishes communication with the user terminal 100 in accordance with a communication standard such as Bluetooth (registered trademark). The controller 1020 may include one or more buttons, and sends an output value based on the user's input operation on the buttons to the user terminal 100. In addition, the controller 1020 may include various sensors such as an acceleration sensor and an angular velocity sensor, and sends the output values of the various sensors to the user terminal 100.

Instead of or in addition to the user terminal 100 including the camera 17 and the ranging sensor 18, the controller 1020 may include the camera 17 and the ranging sensor 18.

It is desirable that the user terminal 100 allow a user who uses the controller 1020 to input user identification information, such as the user's name or login ID, via the controller 1020 at the start of a game, for example. Thereby, the user terminal 100 can associate the controller 1020 with the user, and can specify, on the basis of the sending source (controller 1020) of a received output value, to which user the output value belongs.
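A minimal sketch of this association, under the assumption that each controller 1020 reports a stable identifier; all names below are illustrative rather than taken from the patent.

    # Map each controller 1020's identifier to the user who logged in through it,
    # so that later output values can be attributed to the correct user.
    controller_to_user: dict[str, str] = {}

    def register_login(controller_id: str, login_id: str) -> None:
        # Called once at the start of the game, when the user inputs a login ID.
        controller_to_user[controller_id] = login_id

    def attribute_output(controller_id: str, output_value: int) -> tuple[str, int]:
        # The sending source (the controller) identifies which user the value belongs to.
        return controller_to_user[controller_id], output_value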

When the user terminal 100 communicates with a plurality of controllers 1020, each user grips one of the controllers 1020, so that multiplay can be implemented with one user terminal 100, without communication with another device such as the server 200 via the network 2. In addition, when the user terminals 100 communicate with one another in accordance with a wireless standard such as a wireless LAN (Local Area Network) standard (that is, communicate with one another without using the server 200), multiplay can be implemented locally with a plurality of user terminals 100. When the above-described multiplay is implemented locally with one user terminal 100, the user terminal 100 may further have at least a part of the various functions (to be described below) provided in the server 200. Further, when the above-described multiplay is implemented locally with a plurality of user terminals 100, the plurality of user terminals 100 may have the various functions (to be described below) provided in the server 200 in a distributed manner.

Even when the above-described multiplay is implemented locally, the user terminal 100 may communicate with the server 200. For example, the user terminal 100 may send information indicating a play result, such as a record or a win/loss in a certain game, in association with user identification information, to the server 200.

Further, the controller 1020 may be configured to be detachable from the user terminal 100. In this case, a coupling portion for the controller 1020 may be provided on at least one surface of a housing of the user terminal 100. When the user terminal 100 is coupled to the controller 1020 by a cable via the coupling portion, the user terminal 100 and the controller 1020 send and receive signals via the cable.

As shown in FIG. 2, the user terminal 100 may be connected to a storage medium 1030, such as an external memory card, via the input/output IF 14. Thereby, the user terminal 100 can read programs and data recorded on the storage medium 1030. The program recorded on the storage medium 1030 is a game program, for example.

The user terminal 100 may store a game program acquired by communicating with an external device such as the server 200 in the memory 11 of the user terminal 100, or may store a game program acquired by reading from the storage medium 1030 in the memory 11.

As described above, the user terminal 100 includes the communication IF 13, the input/output IF 14, the touch screen 15, the camera 17, and the ranging sensor 18 as examples of mechanisms for inputting information to the user terminal 100. Each of the components described above as an input mechanism can be regarded as an operation unit configured to receive a user's input operation.

For example, when the operation unit is configured by at least one of the camera 17 and the ranging sensor 18, the operation unit detects an object 1010 in the vicinity of the user terminal 100, and specifies an input operation from the detection result of the object. As an example, a user's hand as the object 1010 or a marker having a predetermined shape is detected, and an input operation is specified based on the color, shape, movement, or type of the object 1010 obtained as a detection result. More specifically, when a user's hand is detected from an image captured by the camera 17, the user terminal 100 specifies a gesture (a series of movements of the user's hand) detected based on the captured image as the user's input operation. The captured image may be a still image or a moving image.
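As a purely illustrative sketch of the last step, the following classifies a horizontal swipe gesture from a series of detected hand positions. Hand detection itself is assumed to be done elsewhere, and the threshold is an arbitrary choice, not a value from the patent.

    def classify_gesture(positions: list[tuple[float, float]]) -> str:
        # positions: (x, y) pixel coordinates of the detected hand, one per frame.
        if len(positions) < 2:
            return "none"
        dx = positions[-1][0] - positions[0][0]
        dy = positions[-1][1] - positions[0][1]
        # Treat a mostly-horizontal displacement beyond a threshold as a swipe
        # (100 px is an assumed threshold for illustration only).
        if abs(dx) > 100 and abs(dx) > abs(dy):
            return "swipe_right" if dx > 0 else "swipe_left"
        return "none"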

Alternatively, when the operation unit is configured by the touch screen 15, the user terminal 100 specifies and receives the user's operation performed on an input unit 151 of the touch screen 15 as the user's input operation. Alternatively, when the operation unit is configured by the communication IF 13, the user terminal 100 specifies and receives a signal (for example, an output value) sent from the controller 1020 as the user's input operation. Alternatively, when the operation unit is configured by the input/output IF 14, a signal output from an input device (not shown) different from the controller 1020 and connected to the input/output IF 14 is specified and received as the user's input operation.

(Server 200)

The server 200 may be a general-purpose computer such as a workstation or a personal computer, as an example. The server 200 includes a processor 20, a memory 21, a storage 22, a communication IF 23, and an input/output IF 24. These components in the server 200 are electrically connected to one another via a communication bus.

(Game Play Terminal 300)

The game play terminal 300 may be a general-purpose computer such as a personal computer, as an example. The game play terminal 300 includes a processor 30, a memory 31, a storage 32, a communication IF 33, and an input/output IF 34. These components in the game play terminal 300 are electrically connected to one another via a communication bus.

As shown in FIG. 4, the game play terminal 300 according to the present embodiment is included in an HMD (Head Mounted Display) set 1000, as an example. In other words, it can be said that the HMD set 1000 is included in the system 1, and also that the player plays the game using the HMD set 1000. A device for the player to play the game is not limited to the HMD set 1000. As an example, the device may be any device that allows the player to experience the game virtually. The device may be implemented as a smartphone, a feature phone, a tablet computer, a laptop computer (a so-called notebook computer), or a desktop computer. Further, the device may be a game device suitable for game play.

The HMD set 1000 includes not only the game play terminal 300 but also an HMD 500, an HMD sensor 510, a motion sensor 520, a display 530, and a controller 540. The HMD 500 includes a monitor 51, a gaze sensor 52, a first camera 53, a second camera 54, a microphone 55, and a speaker 56. The controller 540 may include the motion sensor 520.

The HMD 500 may be mounted on the head of the player to provide a virtual space to the player during operation. More specifically, the HMD 500 displays a right-eye image and a left-eye image on the monitor 51. When each eye of the player visually recognizes the corresponding image, the player may recognize the images as a three-dimensional image based on the parallax of both eyes. The HMD 500 may be either a so-called head-mounted display including a monitor, or a head-mounted device capable of mounting a terminal having a monitor, such as a smartphone.

The monitor 51 is implemented as, for example, a non-transmissive display device. In an aspect, the monitor 51 is arranged on a main body of the HMD 500 so as to be located in front of both eyes of the player. Therefore, when the player visually recognizes the three-dimensional image displayed on the monitor 51, the player can be immersed in the virtual space. In an aspect, the virtual space includes, for example, a background, player-operable objects, and player-selectable menu images. In an aspect, the monitor 51 may be implemented as a liquid crystal monitor or an organic EL (Electro Luminescence) monitor included in a so-called smartphone or another information display terminal.

In another aspect, the monitor 51 can be implemented as a transmissive display device. In this case, the HMD 500 may be an open type such as a glasses type, instead of a closed type that covers the player's eyes as shown in FIG. 1. The transmissive monitor 51 may be temporarily configured as a non-transmissive display device by adjustment of its transmittance. The monitor 51 may include a configuration in which a part of the image constituting the virtual space and the real space are displayed at the same time. For example, the monitor 51 may display an image of the real space captured by a camera mounted on the HMD 500, or may make the real space visually recognizable by setting the transmittance of a part of the monitor to be high.

In an aspect, the monitor 51 may include a sub-monitor for displaying a right-eye image and a sub-monitor for displaying a left-eye image. In another aspect, the monitor 51 may be configured to integrally display the right-eye image and the left-eye image. In this case, the monitor 51 includes a high-speed shutter. The high-speed shutter operates to enable alternate display of the right-eye image and the left-eye image so that only one of the eyes can recognize each image.

In an aspect, the HMD 500 includes a plurality of light sources (not shown). Each of the light sources is implemented by, for example, an LED (Light Emitting Diode) configured to emit infrared rays. The HMD sensor 510 has a position tracking function for detecting the movement of the HMD 500. More specifically, the HMD sensor 510 reads the plurality of infrared rays emitted by the HMD 500 and detects the position and inclination of the HMD 500 in the real space.

In another aspect, the HMD sensor 510 may be implemented by a camera. In this case, the HMD sensor 510 can detect the position and the inclination of the HMD 500 by executing image analysis processing using image information of the HMD 500 output from the camera.

In another aspect, the HMD 500 may include a sensor (not shown) as a position detector instead of, or in addition to, the HMD sensor 510. The HMD 500 can use the sensor to detect the position and the inclination of the HMD 500 itself. For example, when the sensor is an angular velocity sensor, a geomagnetic sensor, or an acceleration sensor, the HMD 500 can use any of those sensors instead of the HMD sensor 510 to detect its position and inclination. As an example, when the sensor provided in the HMD 500 is an angular velocity sensor, the angular velocity sensor detects an angular velocity around each of three axes of the HMD 500 in the real space over time. The HMD 500 calculates a temporal change of the angle around each of the three axes of the HMD 500 based on each of the angular velocities, and further calculates the inclination of the HMD 500 based on the temporal changes of the angles.
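A minimal sketch of the calculation just described: integrating the angular velocity around each axis over elapsed time yields the temporal change of each angle and hence the inclination. The function and sample values are illustrative assumptions, not taken from the patent.

    def update_inclination(angles: list[float],
                           angular_velocity: list[float],
                           dt: float) -> list[float]:
        # angles and angular_velocity each hold one entry per axis (x, y, z);
        # the change of each angle over the interval is velocity * elapsed time.
        return [a + w * dt for a, w in zip(angles, angular_velocity)]

    # Example: starting level, apply one 60 Hz frame of gyro readings (rad/s, assumed).
    inclination = [0.0, 0.0, 0.0]
    inclination = update_inclination(inclination, [0.1, 0.0, -0.05], dt=1 / 60)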

The gaze sensor 52 detects the directions in which the lines of sight of the right eye and the left eye of the player are directed. In other words, the gaze sensor 52 detects the lines of sight of the player. The direction of a line of sight is detected by, for example, a known eye tracking function. The gaze sensor 52 is implemented by a sensor having the eye tracking function. In an aspect, the gaze sensor 52 preferably includes a right-eye sensor and a left-eye sensor. The gaze sensor 52 may be, for example, a sensor configured to irradiate the right eye and the left eye of the player with infrared light and to receive light reflected from the cornea and the iris, thereby detecting a rotational angle of each eyeball. The gaze sensor 52 can detect the line of sight of the player based on each of the detected rotational angles.

The first camera 53 captures a lower part of the player's face. More specifically, the first camera 53 captures the nose and mouth of the player. The second camera 54 captures the eyes and eyebrows of the player. The housing of the HMD 500 on the player side is defined as the inside of the HMD 500, and the housing of the HMD 500 on the side opposite to the player is defined as the outside of the HMD 500. In an aspect, the first camera 53 can be located outside the HMD 500, and the second camera 54 can be located inside the HMD 500. The images generated by the first camera 53 and the second camera 54 are input to the game play terminal 300. In another aspect, the first camera 53 and the second camera 54 may be implemented as one camera, and the player's face may be captured by that one camera.

The microphone 55 converts the speech of the player into a sound signal (electric signal) and outputs the sound signal to the game play terminal 300. The speaker 56 converts a sound signal into a sound and outputs the sound to the player. In another aspect, the HMD 500 may include earphones instead of the speaker 56.

The controller 540 is connected to the game play terminal 300 in a wired or wireless manner. The controller 540 receives, as an input, a command from the player to the game play terminal 300. In an aspect, the controller 540 is configured to be capable of being gripped by the player. In another aspect, the controller 540 is configured to be wearable on a part of the player's body or clothing. In yet another aspect, the controller 540 may be configured to output at least one of vibration, sound, and light in accordance with a signal sent from the game play terminal 300. In yet another aspect, the controller 540 receives, from the player, an operation for controlling the position and movement of an object arranged in the virtual space.

In an aspect, the controller 540 includes a plurality of light sources. Each of the light sources is implemented by, for example, an LED that emits infrared rays. The HMD sensor 510 has a position tracking function. In this case, the HMD sensor 510 reads the plurality of infrared rays emitted by the controller 540, and detects the position and inclination of the controller 540 in the real space. In another aspect, the HMD sensor 510 may be implemented by a camera. In this case, the HMD sensor 510 can detect the position and the inclination of the controller 540 by executing image analysis processing using image information of the controller 540 output from the camera.

The motion sensor 520 is, in an aspect, attached to the player's hand and detects the movement of the player's hand. For example, the motion sensor 520 detects a rotation speed of the hand and the number of rotations of the hand. The detected signal is sent to the game play terminal 300. The motion sensor 520 is provided in the controller 540, for example. In an aspect, the motion sensor 520 is provided in, for example, the controller 540 configured to be capable of being gripped by the player. In another aspect, for safety in the real space, the controller 540 is a glove-type controller that is mounted on the player's hand so as not to easily fly away. In yet another aspect, a sensor not mounted on the player may detect the movement of the player's hand. For example, a signal from a camera capturing the player may be input to the game play terminal 300 as a signal representing a behavior of the player. The motion sensor 520 and the game play terminal 300 are connected to each other wirelessly, for example. In the case of a wireless connection, the communication mode is not particularly limited, and Bluetooth or another known communication method may be used, for example.

The display 530 displays the same image as the image displayed on the monitor 51. Thereby, users other than the player wearing the HMD 500 can also view the same image as the player. The image displayed on the display 530 does not have to be a three-dimensional image, and may be the right-eye image or the left-eye image. Examples of the display 530 include a liquid crystal display and an organic EL monitor.

The game play terminal300produces the behavior of a character to be operated by the player, on the basis of various types of information acquired from the respective units of the HMD500, the controller540, and the motion sensor520, and controls the progress of the game. The “behavior” herein includes moving respective parts of the body, changing postures, changing facial expressions, moving, speaking, touching and moving the object arranged in the virtual space, and using weapons and tools gripped by the character. In other words, in the main game, as the respective parts of the player's body move, respective parts of the character's body also move in the same manner as the player. In the main game, the character speaks the contents of the speech of the player. In other words, in the main game, the character is an avatar object that behaves as a player's alter ego. As an example, at least some of the character's behaviors may be executed in response to an input to the controller540from the player.
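As a rough sketch of the mirroring just described, each tracked body part of the player can simply drive the matching part of the avatar object. The data shapes and names are assumptions made for illustration.

    def apply_player_motion(avatar: dict, tracked_parts: dict) -> None:
        # tracked_parts maps a body-part name ("head", "left_hand", "waist", ...)
        # to its measured pose; the avatar's corresponding part is set to the
        # same pose, so the character moves in the same manner as the player.
        for part, pose in tracked_parts.items():
            avatar[part] = pose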

In the present embodiment, the motion sensors 520 are attached to both hands, both legs, the waist, and the head of the player. The motion sensors 520 attached to both hands of the player may be provided in the controller 540 as described above. In addition, the motion sensor 520 attached to the head of the player may be provided in the HMD 500. The motion sensors 520 may further be attached to both elbows and both knees of the player. As the number of motion sensors 520 attached to the player increases, the movement of the player can be more accurately reflected in the character. Further, the player may wear a suit to which one or more motion sensors 520 are attached, instead of attaching the motion sensors 520 to the respective parts of the body. In other words, the motion capturing method is not limited to the example of using the motion sensors 520.

(Transmission Terminal 400)

The transmission terminal 400 may be a mobile terminal such as a smartphone, a PDA (Personal Digital Assistant), or a tablet computer. Further, the transmission terminal 400 may be a so-called stationary terminal such as a desktop computer.

As shown in FIG. 5, the transmission terminal 400 includes a processor 40, a memory 41, a storage 42, a communication IF 43, an input/output IF 44, and a touch screen 45. The transmission terminal 400 may include an input/output IF 44 connectable to a display (display unit) configured separately from the main body of the transmission terminal 400, instead of or in addition to the touch screen 45.

The controller 1021 may include one or more physical input mechanisms such as buttons, levers, sticks, and wheels. The controller 1021 sends, to the transmission terminal 400, an output value based on an input operation input to those input mechanisms by the operator of the transmission terminal 400 (the player in the present embodiment). Further, the controller 1021 may include various sensors such as an acceleration sensor and an angular velocity sensor, and may send the output values of the various sensors to the transmission terminal 400. The above-described output values are received by the transmission terminal 400 via the communication IF 43.

The transmission terminal 400 may include a camera and a ranging sensor (not shown). Instead of or in addition to the transmission terminal 400 including them, the controller 1021 may include the camera and the ranging sensor.

As described above, the transmission terminal 400 includes the communication IF 43, the input/output IF 44, and the touch screen 45 as examples of mechanisms that input information to the transmission terminal 400. Each of the above-described components as an input mechanism can be regarded as an operation unit configured to receive the user's input operation.

When the operation unit is configured by the touch screen 45, the transmission terminal 400 specifies and receives a user's operation performed on an input unit 451 of the touch screen 45 as the user's input operation. Alternatively, when the operation unit is configured by the communication IF 43, the transmission terminal 400 specifies and receives a signal (for example, an output value) sent from the controller 1021 as the user's input operation. Alternatively, when the operation unit is configured by the input/output IF 44, the transmission terminal 400 specifies and receives a signal output from an input device (not shown) connected to the input/output IF 44 as the user's input operation.

The processors 10, 20, 30, and 40 control the operations of the user terminal 100, the server 200, the game play terminal 300, and the transmission terminal 400, respectively. Each of the processors 10, 20, 30, and 40 includes a CPU (Central Processing Unit), an MPU (Micro Processing Unit), and a GPU (Graphics Processing Unit). Each of the processors 10, 20, 30, and 40 reads a program from the corresponding one of the storages 12, 22, 32, and 42, which will be described below. Then, each of the processors 10, 20, 30, and 40 expands the read program into the corresponding one of the memories 11, 21, 31, and 41, which will be described below, and executes the expanded program.

Each of the memories 11, 21, 31, and 41 is a main storage device. Each of the memories 11, 21, 31, and 41 is configured by storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory). The memory 11 temporarily stores a program and various types of data read by the processor 10 from the storage 12 to be described below, thereby giving a work area to the processor 10. The memory 11 also temporarily stores various types of data generated while the processor 10 is operating in accordance with the program. The memory 21 temporarily stores a program and various types of data read by the processor 20 from the storage 22 to be described below, thereby giving a work area to the processor 20. The memory 21 also temporarily stores various types of data generated while the processor 20 is operating in accordance with the program. The memory 31 temporarily stores a program and various types of data read by the processor 30 from the storage 32 to be described below, thereby giving a work area to the processor 30. The memory 31 also temporarily stores various types of data generated while the processor 30 is operating in accordance with the program. The memory 41 temporarily stores a program and various types of data read by the processor 40 from the storage 42 to be described below, thereby giving a work area to the processor 40. The memory 41 also temporarily stores various types of data generated while the processor 40 is operating in accordance with the program.

In the present embodiment, the programs to be executed by the processors 10 and 30 may be game programs of the main game. In the present embodiment, the program executed by the processor 40 may be a transmission program for implementing the transmission of behavior instruction data. In addition, the processor 10 may further execute a viewing program for implementing the reproduction of a moving image.

In the present embodiment, the program to be executed by the processor 20 may be at least one of the game program, the transmission program, and the viewing program. The processor 20 executes at least one of the game program, the transmission program, and the viewing program in response to a request from at least one of the user terminal 100, the game play terminal 300, and the transmission terminal 400. The transmission program and the viewing program may be executed in parallel.

In other words, the game program may be a program for implementing the game by cooperation of the user terminal 100, the server 200, and the game play terminal 300. The transmission program may be a program for implementing the transmission of the behavior instruction data by cooperation of the server 200 and the transmission terminal 400. The viewing program may be a program for implementing the reproduction of the moving image by cooperation of the user terminal 100 and the server 200.

Each of the storages 12, 22, 32, and 42 is an auxiliary storage device. Each of the storages 12, 22, 32, and 42 is configured by a storage device such as a flash memory or an HDD (Hard Disk Drive). Each of the storages 12 and 32 stores various types of data regarding the game, for example. The storage 42 stores various types of data regarding the transmission of the behavior instruction data. Further, the storage 12 stores various types of data regarding the reproduction of the moving image. The storage 22 may store at least some of the various types of data regarding each of the game, the transmission of the behavior instruction data, and the reproduction of the moving image.

Each of the communication IFs 13, 23, 33, and 43 controls the sending and reception of various types of data in the user terminal 100, the server 200, the game play terminal 300, and the transmission terminal 400, respectively. Each of the communication IFs 13, 23, 33, and 43 controls, for example, communication via a wireless LAN (Local Area Network), Internet communication via a wired LAN, a wireless LAN, or a mobile phone network, and communication using short-range wireless communication.

Each of the input/output IFs 14, 24, 34, and 44 is an interface through which the user terminal 100, the server 200, the game play terminal 300, or the transmission terminal 400 receives data input and outputs data. Each of the input/output IFs 14, 24, 34, and 44 may perform input/output of data via a USB (Universal Serial Bus) or the like. Each of the input/output IFs 14, 24, 34, and 44 may include a physical button, a camera, a microphone, a speaker, a mouse, a keyboard, a display, a stick, and a lever. Further, each of the input/output IFs 14, 24, 34, and 44 may include a connection portion for sending data to and receiving data from a peripheral device.

The touch screen 15 is an electronic component in which the input unit 151 and the display unit 152 (display) are combined. The touch screen 45 is an electronic component in which the input unit 451 and the display unit 452 are combined. Each of the input units 151 and 451 is, for example, a touch-sensitive device, and is configured by a touch pad, for example. Each of the display units 152 and 452 is configured by a liquid crystal display or an organic EL (Electro-Luminescence) display, for example.

Each of the input units 151 and 451 has a function of detecting a position where a user's operation (mainly, a physical contact operation such as a touch operation, a slide operation, a swipe operation, or a tap operation) is input to an input surface, and sending information indicating the position as an input signal. Each of the input units 151 and 451 includes a touch sensor (not shown). The touch sensor may adopt any method, such as a capacitive touch method or a resistive-film touch method.

Although not shown, the user terminal 100 and the transmission terminal 400 may each include one or more sensors configured to specify the holding posture of the user terminal 100 and the holding posture of the transmission terminal 400, respectively. The sensor may be, for example, an acceleration sensor or an angular velocity sensor.

When each of the user terminal 100 and the transmission terminal 400 includes such a sensor, the processors 10 and 40 can specify the holding postures of the user terminal 100 and the transmission terminal 400 from the outputs of the sensors, respectively, and can perform processing depending on the holding postures. For example, the processors 10 and 40 may perform a vertical screen display, in which a vertically long image is displayed on the display units 152 and 452, when the user terminal 100 and the transmission terminal 400 are held vertically, respectively. On the other hand, when the user terminal 100 and the transmission terminal 400 are held horizontally, a horizontally long image may be displayed on the display units as a horizontal screen display. In this way, the processors 10 and 40 may be able to switch between the vertical screen display and the horizontal screen display depending on the holding postures of the user terminal 100 and the transmission terminal 400, respectively.
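One possible way to realize this switching, sketched with an acceleration sensor: compare the gravity components along the device's axes. The axis convention is an assumption for illustration.

    def select_screen_mode(accel_x: float, accel_y: float) -> str:
        # When the device is held vertically, gravity dominates the y axis of
        # the sensor; held horizontally, it dominates the x axis (assumed axes).
        return "vertical" if abs(accel_y) >= abs(accel_x) else "horizontal"

    # Example: gravity mostly along y -> display a vertically long image.
    mode = select_screen_mode(accel_x=0.3, accel_y=9.7)  # -> "vertical"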

FIG. 6 is a block diagram showing functional configurations of the user terminal 100, the server 200, and the HMD set 1000 included in the system 1. FIG. 7 is a block diagram showing a functional configuration of the transmission terminal 400 shown in FIG. 6.

The user terminal 100 has a function as an input device that receives a user's input operation, and a function as an output device that outputs images and sounds of the game. The user terminal 100 functions as a control unit 110 and a storage unit 120 by cooperation of the processor 10, the memory 11, the storage 12, the communication IF 13, the input/output IF 14, and the touch screen 15.

The server 200 has a function of mediating the sending and reception of various types of information between the user terminal 100, the HMD set 1000, and the transmission terminal 400. The server 200 functions as a control unit 210 and a storage unit 220 by cooperation of the processor 20, the memory 21, the storage 22, the communication IF 23, and the input/output IF 24.

The HMD set 1000 (the game play terminal 300) has a function as an input device that receives the player's input operations, a function as an output device that outputs images and sounds of the game, and a function of sending game progress information to the user terminal 100 via the server 200 in real time. The HMD set 1000 functions as a control unit 310 and a storage unit 320 by cooperation of the processor 30, the memory 31, the storage 32, the communication IF 33, and the input/output IF 34 of the game play terminal 300 with the HMD 500, the HMD sensor 510, the motion sensor 520, and the controller 540.

The transmission terminal 400 has a function of generating behavior instruction data and sending the behavior instruction data to the user terminal 100 via the server 200. The transmission terminal 400 functions as a control unit 410 and a storage unit 420 by cooperation of the processor 40, the memory 41, the storage 42, the communication IF 43, the input/output IF 44, and the touch screen 45.

(Data Stored in Storage Unit of Each Device)

The storage unit 120 stores a game program 131 (a program), game information 132, and user information 133. The storage unit 220 stores a game program 231, game information 232, user information 233, and a user list 234. The storage unit 320 stores a game program 331, game information 332, and user information 333. The storage unit 420 stores a user list 421, a motion list 422, and a transmission program 423 (a program, a second program).

The game programs 131, 231, and 331 are game programs to be executed by the user terminal 100, the server 200, and the HMD set 1000, respectively. The respective devices operate in cooperation based on the game programs 131, 231, and 331, and thus the main game is implemented. The game programs 131 and 331 may be stored in the storage unit 220 and downloaded to the user terminal 100 and the HMD set 1000, respectively. In the present embodiment, the user terminal 100 performs rendering on the data received from the transmission terminal 400 in accordance with the game program 131, and reproduces a moving image. In other words, the game program 131 is also a program for reproducing a moving image using the moving image instruction data transmitted from the transmission terminal 400. The program for reproducing the moving image may be different from the game program 131. In this case, the storage unit 120 stores the program for reproducing the moving image separately from the game program 131.

The game information 132, 232, and 332 is data that the user terminal 100, the server 200, and the HMD set 1000 refer to when executing their game programs, respectively. Each of the user information 133, 233, and 333 is data regarding a user's account of the user terminal 100. The game information 232 is the game information 132 of each of the user terminals 100 and the game information 332 of the HMD set 1000. The user information 233 is the user information 133 of each of the user terminals 100 and the player's user information included in the user information 333. The user information 333 is the user information 133 of each of the user terminals 100 and the player's user information.

Each of the user list 234 and the user list 421 is a list of users who have participated in the game. Each of the user list 234 and the user list 421 may include not only a list of users who have participated in the most recent game play by the player but also a list of users who have participated in each game play before the most recent one. The motion list 422 is a list of a plurality of pieces of motion data created in advance. The motion list 422 is, for example, a list in which motion data is associated with information (for example, a motion name) that identifies each motion. The transmission program 423 is a program for implementing the transmission, to the user terminal 100, of the behavior instruction data for reproducing a moving image on the user terminal 100.
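Hypothetical shapes for these two lists, for illustration only; the patent does not prescribe any particular storage format, and all names and values below are assumed.

    # Motion list 422: identifying information (a motion name) -> motion data
    # created in advance.
    motion_list_422 = {
        "wave": b"<motion data>",
        "bow":  b"<motion data>",
    }

    # User list 421: one entry per user who participated in a game play.
    user_list_421 = [
        {"user_id": "U001", "user_name": "Alice"},
        {"user_id": "U002", "user_name": "Bob"},
    ]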

(Functional Configuration of Server 200)

The control unit 210 comprehensively controls the server 200 by executing the game program 231 stored in the storage unit 220. For example, the control unit 210 mediates the sending and reception of various types of information between the user terminal 100, the HMD set 1000, and the transmission terminal 400.

The control unit 210 functions as a communication mediator 211, a log generator 212, and a list generator 213 in accordance with the description of the game program 231. The control unit 210 can also function as other functional blocks (not shown) for the purposes of mediating the sending and reception of various types of information regarding the game play and the transmission of the behavior instruction data, and of supporting the progress of the game.

The communication mediator 211 mediates the sending and reception of various types of information between the user terminal 100, the HMD set 1000, and the transmission terminal 400. For example, the communication mediator 211 sends the game progress information received from the HMD set 1000 to the user terminal 100. The game progress information includes data indicating the movement of the character operated by the player, the parameters of the character, items and weapons possessed by the character, and enemy characters. The server 200 sends the game progress information to the user terminals 100 of all the users who participate in the game. In other words, the server 200 sends common game progress information to the user terminals 100 of all the users who participate in the game. Thereby, the game progresses in each of the user terminals 100 of all the participating users in the same manner as in the HMD set 1000.
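A minimal sketch of this mediation, assuming each participating user terminal is reachable through an object exposing a send() method (an assumed interface, not one defined in the patent):

    def relay_game_progress(progress_info: bytes, participant_terminals: list) -> None:
        # The same (common) game progress information received from the HMD set
        # 1000 is forwarded to the user terminal 100 of every participating user,
        # so the game progresses identically on all of them.
        for terminal in participant_terminals:
            terminal.send(progress_info)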

Further, for example, the communication mediator 211 sends information received from any one of the user terminals 100 for supporting the progress of the game by the player, to the other user terminals 100 and the HMD set 1000. As an example, the information may be item information indicating an item that is provided to the player (character) and allows the player to carry on the game advantageously. The item information includes information (for example, a user name and a user ID) indicating the user who provides the item. Further, the communication mediator 211 may mediate the transmission of the behavior instruction data from the transmission terminal 400 to the user terminal 100.

The log generator 212 generates a log of the game progress based on the game progress information received from the HMD set 1000. The list generator 213 generates the user list 234 after the end of the game play. Although details will be described below, each user in the user list 234 is associated with a tag indicating the content of the support provided to the player by that user. The list generator 213 generates a tag based on the log of the game progress generated by the log generator 212, and associates the tag with the corresponding user. The list generator 213 may also associate, as a tag, the content of the support provided to the player by each user, input by the game operator or the like using a terminal device such as a personal computer, with the corresponding user. Thereby, the content of the support provided by each user becomes more detailed. When the users participate in the game, the user terminal 100 sends information indicating the user to the server 200, based on the user's operation. For example, the user terminal 100 sends a user ID input by the user to the server 200. In other words, the server 200 holds information indicating each user for all the users who participate in the game. The list generator 213 may generate the user list 234 using that information.
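A hypothetical sketch of how the list generator 213 might build the user list 234 from the game progress log; the log schema ("item_provided" events) and all field names are assumptions made for illustration.

    def generate_user_list(participants: dict, progress_log: list) -> list:
        # participants: user ID -> user name, held by the server for all
        # users who joined the game play.
        user_list = [{"user_id": uid, "user_name": name, "tags": []}
                     for uid, name in participants.items()]
        by_id = {entry["user_id"]: entry for entry in user_list}
        # Scan the log generated by the log generator 212 and attach a tag
        # describing each user's support to the matching entry.
        for event in progress_log:
            if event.get("type") == "item_provided":   # assumed log schema
                by_id[event["user_id"]]["tags"].append("provided " + event["item"])
        return user_list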

(Functional Configuration of HMD Set 1000)

The control unit 310 comprehensively controls the HMD set 1000 by executing the game program 331 stored in the storage unit 320. For example, the control unit 310 allows the game to progress in accordance with the game program 331 and the player's operation. In addition, the control unit 310 communicates with the server 200 to send and receive information as needed while the game is in progress. The control unit 310 may send and receive the information directly to and from the user terminal 100 without using the server 200.

The control unit 310 functions as an operation receiver 311, a display controller 312, a UI controller 313, an animation generator 314, a game coordinator 315, a virtual space controller 316, and a response processor 317 in accordance with the description of the game program 331. The control unit 310 can also function as other functional blocks (not shown) for the purpose of controlling the characters appearing in the game, depending on the nature of the game to be executed.

The operation receiver 311 detects and receives the player's input operation. The operation receiver 311 receives signals input from the HMD 500, the motion sensor 520, and the controller 540, determines what kind of input operation has been performed, and outputs the result to each component of the control unit 310.

The UI controller 313 controls user interface (hereinafter, UI) images to be displayed on the monitor 51 and the display 530. A UI image is a tool for the player to make an input, necessary for the progress of the game, to the HMD set 1000, or a tool for obtaining information, output during the progress of the game, from the HMD set 1000. The UI images include, but are not limited to, icons, buttons, lists, and menu screens, for example.

The animation generator 314 generates, based on the control modes of various objects, animations showing the motions of those objects. For example, the animation generator 314 may generate an animation expressing a state in which an object (for example, a player's avatar object) moves as if it were there, moves its mouth, or changes its facial expression.

The game coordinator 315 controls the progress of the game in accordance with the game program 331, the player's input operation, and the behavior of the avatar object corresponding to the input operation. For example, the game coordinator 315 performs predetermined game processing when the avatar object performs a predetermined behavior. Further, for example, the game coordinator 315 may receive information indicating the user's operation on the user terminal 100, and may perform game processing based on that user's operation. In addition, the game coordinator 315 generates game progress information depending on the progress of the game, and sends the generated information to the server 200. The game progress information is sent to the user terminal 100 via the server 200. Thereby, the progress of the game in the HMD set 1000 is shared with the user terminal 100. In other words, the progress of the game in the HMD set 1000 synchronizes with the progress of the game in the user terminal 100.

The virtual space controller 316 performs various controls related to the virtual space provided to the player, depending on the progress of the game. As an example, the virtual space controller 316 generates various objects and arranges the objects in the virtual space. Further, the virtual space controller 316 arranges a virtual camera in the virtual space. In addition, the virtual space controller 316 produces the behaviors of the various objects arranged in the virtual space, depending on the progress of the game. Further, the virtual space controller 316 controls the position and inclination of the virtual camera arranged in the virtual space, depending on the progress of the game.

The display controller 312 outputs a game screen, in which the processing results executed by each of the above-described components are reflected, to the monitor 51 and the display 530. The display controller 312 may display an image based on the field of view from the virtual camera arranged in the virtual space on the monitor 51 and the display 530 as the game screen. Further, the display controller 312 may include the animations generated by the animation generator 314 in the game screen. Further, the display controller 312 may draw the above-described UI images, controlled by the UI controller 313, superimposed on the game screen.

The response processor 317 receives feedback regarding responses of the users of the user terminals 100 to the game play of the player, and outputs the feedback to the player. In the present embodiment, for example, the user terminal 100 can create, based on the user's input operation, a comment (message) directed to the avatar object. The response processor 317 receives the comment data of that comment and outputs it. The response processor 317 may display text data corresponding to the user's comment on the monitor 51 and the display 530, or may output sound data corresponding to the user's comment from a speaker (not shown). In the former case, the response processor 317 may draw an image corresponding to the text data (that is, an image including the content of the comment) superimposed on the game screen.

(Functional Configuration of User Terminal 100)

The control unit 110 comprehensively controls the user terminal 100 by executing the game program 131 stored in the storage unit 120. For example, the control unit 110 controls the progress of the game in accordance with the game program 131 and the user's operation. In addition, the control unit 110 communicates with the server 200 to send and receive information as needed while the game is in progress. The control unit 110 may send and receive the information directly to and from the HMD set 1000 without using the server 200.

The control unit 110 functions as an operation receiver 111, a display controller 112, a UI controller 113, an animation generator 114, a game coordinator 115, a virtual space controller 116, and a moving image reproducer 117 in accordance with the description of the game program 131. The control unit 110 can also function as other functional blocks (not shown) for the purpose of progressing the game, depending on the nature of the game to be executed.

The operation receiver 111 detects and receives the user's input operation with respect to the input unit 151. The operation receiver 111 determines what kind of input operation has been performed from the action exerted by the user on the console via the touch screen 15 and the other input/output IF 14, and outputs the result to each component of the control unit 110.

For example, the operation receiver 111 receives an input operation for the input unit 151, detects the coordinates of the input position of the input operation, and specifies the type of the input operation. The operation receiver 111 specifies, for example, a touch operation, a slide operation, a swipe operation, or a tap operation as the type of the input operation. Further, the operation receiver 111 detects that the contact input has been released from the touch screen 15 when the continuously detected input is interrupted.
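For illustration, one way such a type could be specified from the detected coordinates and timing is sketched below. The thresholds, and the particular swipe/slide distinction (fast versus slow movement), are arbitrary assumptions, not rules stated in the patent.

    def specify_input_type(start: tuple[float, float],
                           end: tuple[float, float],
                           duration_s: float) -> str:
        # Distance moved between the first and last detected contact positions.
        distance = ((end[0] - start[0]) ** 2 + (end[1] - start[1]) ** 2) ** 0.5
        if distance < 10:                       # barely moved (assumed 10 px)
            return "tap" if duration_s < 0.3 else "touch"
        # Moved across the screen: fast movement as a swipe, slow as a slide.
        return "swipe" if duration_s < 0.3 else "slide"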

The UI controller 113 controls UI images to be displayed on the display unit 152 to construct a UI according to at least one of the user's input operation and the received game progress information. A UI image is a tool for the user to make an input, necessary for the progress of the game, to the user terminal 100, or a tool for obtaining information, output during the progress of the game, from the user terminal 100. The UI images include, but are not limited to, icons, buttons, lists, and menu screens, for example.

The animation generator 114 generates, based on the control modes of various objects, animations showing the motions of those objects.

The game coordinator 115 controls the progress of the game in accordance with the game program 131, the received game progress information, and the user's input operation. When predetermined game processing is performed by the user's input operation, the game coordinator 115 sends information on the game processing to the HMD set 1000 via the server 200. Thereby, the game processing is shared with the HMD set 1000. In other words, the progress of the game in the HMD set 1000 synchronizes with the progress of the game in the user terminal 100. The predetermined game processing is, for example, processing of providing an item to the avatar object; in this example, the information on the game processing is the item information described above.

The virtual space controller 116 performs various controls related to the virtual space provided to the user, depending on the progress of the game. As an example, the virtual space controller 116 generates various objects and arranges the objects in the virtual space. Further, the virtual space controller 116 arranges a virtual camera in the virtual space. In addition, the virtual space controller 116 produces the behaviors of the various objects arranged in the virtual space, depending on the progress of the game, specifically, depending on the received game progress information. Further, the virtual space controller 116 controls the position and inclination of the virtual camera arranged in the virtual space, depending on the progress of the game, specifically, the received game progress information.

The display controller 112 outputs a game screen, in which the processing results executed by each of the above-described components are reflected, to the display unit 152. The display controller 112 may display an image based on the field of view from the virtual camera arranged in the virtual space provided to the user on the display unit 152 as the game screen. Further, the display controller 112 may include the animations generated by the animation generator 114 in the game screen. Further, the display controller 112 may draw the above-described UI images, controlled by the UI controller 113, superimposed on the game screen. In any case, the game screen displayed on the display unit 152 is the same game screen as the game screens displayed on the other user terminals 100 and the HMD set 1000.

The moving image reproducer 117 performs analysis (rendering) on the behavior instruction data received from the transmission terminal 400, and reproduces the moving image.

(Functional Configuration of Transmission Terminal 400)

The control unit 410 comprehensively controls the transmission terminal 400 by executing a program (not shown) stored in the storage unit 420. For example, the control unit 410 generates behavior instruction data in accordance with the program and the operation of the user of the transmission terminal 400 (the player in the present embodiment), and transmits the generated data to the user terminal 100. Further, the control unit 410 communicates with the server 200 to send and receive information as needed. The control unit 410 may send and receive the information directly to and from the user terminal 100 without using the server 200.

The control unit410functions as a communication controller411, a display controller412, an operation receiver413, a sound receiver414, a motion specifier415, and a behavior instruction data generator416in accordance with the description of the program. The control unit410can also function as other functional blocks (not shown) for the purpose of generating and transmitting behavior instruction data.

The communication controller411controls the sending and reception of information to and from the server200or the user terminal100via the server200. The communication controller411receives the user list421from the server200as an example. Further, the communication controller411sends the behavior instruction data to the user terminal100as an example.

The display controller412outputs various screens, which reflect the results of the processing executed by each component, to the display unit452. The display controller412displays a screen including the received user list234as an example. Further, as an example, the display controller412displays a screen including the motion list422, which enables the player to select motion data to be included in the behavior instruction data to be transmitted for use in producing the behavior of an avatar object.

The operation receiver413detects and receives the player's input operation with respect to the input unit451. The operation receiver413determines what kind of input operation has been performed from the action exerted by the player on the console via the touch screen45and another input/output IF44, and outputs the result to each component of the control unit410.

For example, the operation receiver413receives an input operation for the input unit451, detects coordinates of an input position of the input operation, and specifies a type of the input operation. The operation receiver413specifies, for example, a touch operation, a slide operation, a swipe operation, and a tap operation as the type of the input operation. Further, the operation receiver413detects that the contact input is released from the touch screen45when the continuously detected input is interrupted.

The sound receiver414receives a sound generated around the transmission terminal400, and generates sound data of the sound. As an example, the sound receiver414receives a sound output by the player and generates sound data of the sound.

The motion specifier415specifies the motion data selected by the player from the motion list422in accordance with the player's input operation.

The behavior instruction data generator416generates behavior instruction data. As an example, the behavior instruction data generator416generates behavior instruction data including the generated sound data and the specified motion data.
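
As a rough illustration, the behavior instruction data could be represented as a pairing of the recorded sound with the selected motion, as in the minimal Python sketch below; the class and field names are hypothetical, not part of the disclosure:

    from dataclasses import dataclass

    @dataclass
    class BehaviorInstructionData:
        sound_data: bytes  # sound data generated by the sound receiver 414
        motion_data: dict  # motion data specified by the motion specifier 415

    def generate_behavior_instruction_data(sound_data: bytes, motion_data: dict) -> BehaviorInstructionData:
        # Pair the recorded sound with the selected motion into one transmittable unit.
        return BehaviorInstructionData(sound_data=sound_data, motion_data=motion_data)

    data = generate_behavior_instruction_data(b"\x00\x01", {"name": "wave", "frames": []})
    print(data.motion_data["name"])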

The functions of the HMD set1000, the server200, and the user terminal100shown inFIG.6and the function of the transmission terminal400shown inFIG.7are merely examples. Each of the HMD set1000, the server200, the user terminal100, and the transmission terminal400may have at least some of the functions provided by the other devices. Further, a device other than the HMD set1000, the server200, the user terminal100, and the transmission terminal400may be used as a component of the system1and made to execute some of the processing in the system1. In other words, the computer, which executes the game program in the present embodiment, may be any of the HMD set1000, the server200, the user terminal100, the transmission terminal400, and other devices, or may be implemented by a combination of a plurality of these devices.

FIG.8is a flowchart showing an example of a flow of control processing of the virtual space provided to the player and the virtual space provided to the user of the user terminal100.FIGS.9A and9Bare diagrams showing a virtual space600A provided to the player and a field-of-view image visually recognized by the player according to an embodiment.FIGS.10A and10Bare diagrams showing a virtual space600B provided to the user of the user terminal100and a field-of-view image visually recognized by the user according to an embodiment. Hereinafter, the virtual spaces600A and600B are described as “virtual spaces600” when being not necessary to be distinguished from each other.

In step S1, the processor30functions as the virtual space controller316to define the virtual space600A shown inFIG.9A. The processor30defines the virtual space600A using virtual space data (not shown). The virtual space data may be stored in the game play terminal300, may be generated by the processor30in accordance with the game program331, or may be acquired by the processor30from the external device such as the server200.

As an example, the virtual space600has an all-celestial sphere structure that covers the entire sphere in the 360-degree directions around a point defined as a center. InFIGS.9A and10A, only the upper half of the celestial sphere of the virtual space600is illustrated so as not to complicate the description.

In step S2, the processor30functions as the virtual space controller316to arrange an avatar object (character)610in the virtual space600A. The avatar object610is an avatar object associated with the player, and behaves in accordance with the player's input operation.

In step S3, the processor30functions as the virtual space controller316to arrange other objects in the virtual space600A. In the example ofFIGS.9A and9B, the processor30arranges objects631to634. Examples of the other objects may include character objects (so-called non-player characters, NPCs) that behave in accordance with the game program331, operation objects such as virtual hands, and objects that imitate animals, plants, artificial objects, or natural objects and are arranged depending on the progress of the game.

In step S4, the processor30functions as the virtual space controller316to arrange a virtual camera620A in the virtual space600A. As an example, the processor30arranges the virtual camera620A at a position of the head of the avatar object610.

In step S5, the processor30displays a field-of-view image650on the monitor51and the display530. The processor30defines a field-of-view area640A, which is a field of view from the virtual camera620A in the virtual space600A, in accordance with an initial position and an inclination of the virtual camera620A. Then, the processor30defines a field-of-view image650corresponding to the field-of-view area640A. The processor30outputs the field-of-view image650to the monitor51and the display530to allow the HMD500and the display530to display the field-of-view image650.

In the example ofFIGS.9A and9B, as shown inFIG.9A, since a part of the object634is included in the field-of-view area640A, the field-of-view image650includes a part of the object634as shown inFIG.9B.

In step S6, the processor30sends initial arrangement information to the user terminal100via the server200. The initial arrangement information is information indicating initial arrangement positions of various objects in the virtual space600A. In the example ofFIGS.9A and9B, the initial arrangement information includes information on initial arrangement positions of the avatar object610and the objects631to634. The initial arrangement information can also be expressed as one of the game progress information.

In step S7, the processor30functions as the virtual space controller316to control the virtual camera620A depending on the movement of the HMD500. Specifically, the processor30controls the direction and inclination of the virtual camera620A depending on the movement of the HMD500, that is, the posture of the head of the player. As will be described below, when the player moves his/her head (changes the posture of the head), the processor30moves the head of the avatar object610in accordance with such movement. The processor30controls the direction and inclination of the virtual camera620A such that a direction of the line of sight of the avatar object610coincides with a direction of the line of sight of the virtual camera620A. In step S8, the processor30updates the field-of-view image650in response to changes in the direction and inclination of the virtual camera620A.

In step S9, the processor30functions as the virtual space controller316to move the avatar object610depending on the movement of the player. As an example, the processor30moves the avatar object610in the virtual space600A as the player moves in the real space. Further, the processor30moves the head of the avatar object610in the virtual space600A as the head of the player moves in the real space.

In step S10, the processor30functions as the virtual space controller316to move the virtual camera620A to follow the avatar object610. In other words, the virtual camera620A is always located at the head of the avatar object610even when the avatar object610moves.

The processor30updates the field-of-view image650depending on the movement of the virtual camera620A. In other words, the processor30updates the field-of-view area640A depending on the posture of the head of the player and the position of the virtual camera620A in the virtual space600A. As a result, the field-of-view image650is updated.

In step S11, the processor30sends the behavior instruction data of the avatar object610to the user terminal100via the server200. The behavior instruction data herein includes at least one of motion data capturing the motion of the player during a virtual experience (for example, during a game play), sound data of a sound output by the player, and operation data indicating the content of the input operation to the controller540. When the player is playing the game, the behavior instruction data is sent to the user terminal100as game progress information, for example.

Processes of steps S7to S11are consecutively and repeatedly executed while the player is playing the game.
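
By way of illustration only, the repeated processing of steps S7to S11could be sketched as the following Python loop; all names are hypothetical, and the data layout is an assumption made for the sketch:

    def update_camera(head_position, hmd_yaw_deg, fov_deg=90.0):
        # S7: the camera direction follows the posture of the player's head (the HMD);
        # S10: the camera position stays at the head of the avatar object.
        half = fov_deg / 2.0
        return {
            "position": head_position,
            "yaw": hmd_yaw_deg,
            # S8: the field-of-view area, reduced here to a simple yaw interval
            "view_interval": (hmd_yaw_deg - half, hmd_yaw_deg + half),
        }

    def play_loop(frames, send_to_server):
        # Steps S7 to S11 repeated for each frame while the player is playing.
        for f in frames:
            camera = update_camera(f["head_position"], f["hmd_yaw"])  # S7/S8/S10
            # The camera only drives the local field-of-view image; what is sent
            # to the user terminals is the behavior instruction data (S11).
            behavior_instruction_data = {
                "motion": {"head_position": f["head_position"], "hmd_yaw": f["hmd_yaw"]},  # S9
                "sound": f.get("sound"),
                "operation": f.get("controller"),
            }
            send_to_server(behavior_instruction_data)

    sent = []
    play_loop([{"head_position": (0.0, 1.6, 0.0), "hmd_yaw": 10.0}], sent.append)
    print(len(sent))  # 1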

In step S21, the processor10of the user terminal100of a user3functions as the virtual space controller116to define a virtual space600B shown inFIG.10A. The processor10defines a virtual space600B using virtual space data (not shown). The virtual space data may be stored in the user terminal100, may be generated by the processor10based on the game program131, or may be acquired by the processor10from an external device such as the server200.

In step S22, the processor10receives initial arrangement information. In step S23, the processor10functions as the virtual space controller116to arrange various objects in the virtual space600B in accordance with the initial arrangement information. In the example ofFIGS.10A and10B, various objects are an avatar object610and objects631to634.

In step S24, the processor10functions as the virtual space controller116to arrange a virtual camera620B in the virtual space600B. As an example, the processor10arranges the virtual camera620B at the position shown inFIG.10A.

In step S25, the processor10displays a field-of-view image660on the display unit152. The processor10defines a field-of-view area640B, which is a field of view from the virtual camera620B in the virtual space600B, in accordance with an initial position and an inclination of the virtual camera620B. Then, the processor10defines a field-of-view image660corresponding to the field-of-view area640B. The processor10outputs the field-of-view image660to the display unit152to allow the display unit152to display the field-of-view image660.

In the example ofFIGS.10A and10B, since the avatar object610and the object631are included in the field-of-view area640B as shown inFIG.10A, the field-of-view image660includes the avatar object610and the object631as shown inFIG.10B.

In step S26, the processor10receives the behavior instruction data. In step S27, the processor10functions as the virtual space controller116to move the avatar object610in the virtual space600B in accordance with the behavior instruction data. In other words, the processor10reproduces a video in which the avatar object610is behaving, by real-time rendering.

In step S28, the processor10functions as the virtual space controller116to control the virtual camera620B in accordance with the user's operation received when functioning as the operation receiver111. In step S29, the processor10updates the field-of-view image660depending on changes in the position of the virtual camera620B in the virtual space600B and the direction and inclination of the virtual camera620B. In step S28, the processor10may automatically control the virtual camera620B depending on the movement of the avatar object610, for example, the change in the movement and direction of the avatar object610. For example, the processor10may automatically move the virtual camera620B or change its direction and inclination such that the avatar object610is always captured from the front. As an example, the processor10may automatically move the virtual camera620B or change its direction and inclination such that the avatar object610is always captured from the rear in response to the movement of the avatar object610.
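
As a rough illustration of the automatic camera control described above, the following minimal Python sketch (hypothetical names and geometry, not part of the disclosure) places the virtual camera so that the avatar object is always captured from the front or from the rear:

    import math

    def auto_follow_camera(avatar_position, avatar_yaw_deg, distance=2.0, from_front=True):
        # Place the camera at a fixed distance so the avatar is captured from the
        # front (or, if from_front is False, from the rear), facing the avatar.
        angle = math.radians(avatar_yaw_deg if from_front else avatar_yaw_deg + 180.0)
        x = avatar_position[0] + distance * math.sin(angle)
        z = avatar_position[2] + distance * math.cos(angle)
        camera_yaw = (avatar_yaw_deg + 180.0) % 360.0 if from_front else avatar_yaw_deg
        return {"position": (x, avatar_position[1], z), "yaw": camera_yaw}

    print(auto_follow_camera((0.0, 1.6, 0.0), 0.0))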

As described above, the avatar object610behaves in the virtual space600A depending on the movement of the player. The behavior instruction data indicating the behavior is sent to the user terminal100. In the virtual space600B, the avatar object610behaves in accordance with the received behavior instruction data. Thereby, the avatar object610performs the same behavior in the virtual space600A and the virtual space600B. In other words, the user3can visually recognize the behavior of the avatar object610depending on the behavior of the player using the user terminal100.

FIGS.11A to11Dare diagrams showing another example of the field-of-view image displayed on the user terminal100. Specifically, these are diagrams showing an example of a game screen of the game (main game) being executed by the system1while the player is playing.

The main game is a game in which the avatar object610, which operates weapons such as guns and knives, and a plurality of enemy objects671, which are NPCs, appear in the virtual space600, and the avatar object610fights against the enemy objects671. Various game parameters, for example, a physical strength of the avatar object610, the number of usable magazines, the number of remaining bullets of the gun, and the number of remaining enemy objects671are updated depending on the progress of the game.

A plurality of stages are prepared in the main game, and the player can clear a stage by establishing predetermined achievement conditions associated with the stage. Examples of the predetermined achievement conditions may include conditions established by defeating all the appearing enemy objects671, defeating a boss object among the appearing enemy objects671, acquiring a predetermined item, and reaching a predetermined position. The achievement conditions are defined in the game program131. In the main game, whether the achievement conditions are established, in other words, a win or loss of the avatar object610against the enemy objects671, is determined depending on the content of the game, and the player clears the stage when the achievement conditions are established. On the other hand, for example, when the game executed by the system1is a racing game, the ranking of the avatar object610is determined when a condition that the avatar object reaches a goal is established.

In the main game, the game progress information is transmitted live to the plurality of user terminals100at predetermined time intervals in order to share the virtual space between the HMD set1000and the plurality of user terminals100. As a result, on the touch screen15of the user terminal100on which the user watches the game, a field-of-view image of the field-of-view area defined by the virtual camera620B corresponding to the user terminal100is displayed. Further, on an upper right side and an upper left side of the field-of-view image, parameter images showing the physical strength of the avatar object610, the number of usable magazines, the number of remaining bullets of the gun, and the number of remaining enemy objects671are displayed in a manner of being superimposed. The field-of-view image can also be expressed as a game screen.

As described above, the game progress information includes motion data capturing the behavior of the player, sound data of a sound output by the player, and operation data indicating the content of the input operation to the controller540. In other words, these data are information for specifying the position, posture, and direction of the avatar object610, information for specifying the position, posture, and direction of the enemy object671, and information for specifying the position of other objects (for example, obstacle objects672and673). The processor10specifies the position, posture, and direction of each object by analyzing (rendering) the game progress information.

The game information132includes data of various objects, for example, the avatar object610, the enemy object671, and the obstacle objects672and673. The processor10uses the data and the analysis result of the game progress information to update the position, posture, and direction of each object. Thereby, the game progresses, and each object in the virtual space600B moves in the same manner as each object in the virtual space600A. Specifically, in the virtual space600B, each object including the avatar object610behaves in accordance with the game progress information regardless of whether the user operates the user terminal100.

On the touch screen15of the user terminal100, as an example, UI images701and702are displayed in a manner of being superimposed on the field-of-view image. The UI image701is a UI image that receives an operation for controlling the touch screen15to display a UI image711that receives an item-supply operation for supporting the avatar object610from the user3. The UI image702is a UI image that receives an operation for controlling the touch screen15to display a UI image (to be described below) that receives an operation for inputting and sending a comment for the avatar object610(in other words, a player4) from the user3. The operation received by the UI images701and702may be, for example, an operation of tapping the UI images701and702.

When the UI image701is tapped, the UI image711is displayed in a manner of being superimposed on the field-of-view image. The UI image711includes, for example, a UI image711A on which a magazine icon is drawn, a UI image711B on which a first-aid kit icon is drawn, a UI image711C on which a triangular cone icon is drawn, and a UI image711D on which a barricade icon is drawn. The item-supply operation corresponds to an operation of tapping any UI image, for example.

As an example, when the UI image711A is tapped, the number of remaining bullets of the gun used by the avatar object610increases. When the UI image711B is tapped, the physical strength of the avatar object610is restored. When the UI images711C and711D are tapped, the obstacle objects672and673are arranged in the virtual space to obstruct the movement of the enemy object671. One of the obstacle objects672and673may obstruct the movement of the enemy object671more than the other obstacle object.

The processor10sends item-supply information indicating that the item-supply operation has been performed, to the server200. The item-supply information includes at least information for specifying a type of the item specified by the item-supply operation. The item-supply information may include another information on the item such as information indicating a position where the item is arranged. The item-supply information is sent to another user terminal100and the HMD set1000via the server200.
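
By way of illustration only, the item-supply information could be built as in the minimal Python sketch below; the field names are hypothetical assumptions, and only the item type is treated as mandatory, as described above:

    def make_item_supply_info(item_type, position=None, user=None):
        # The type of the supplied item is mandatory; the arrangement position
        # and the supplying user are optional extras.
        info = {"item_type": item_type}
        if position is not None:
            info["position"] = position
        if user is not None:
            info["user"] = user
        return info

    print(make_item_supply_info("magazine", user="AAAAA"))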

FIGS.12A to12Dare diagrams showing another example of the field-of-view image displayed on the user terminal100. Specifically,FIG.12is a diagram showing an example of a game screen of the main game, and is a diagram for illustrating a communication between the player and the user terminals100during the game play.

In the case ofFIG.12A, the user terminal100produces a speech691of the avatar object610. Specifically, the user terminal100produces the speech691of the avatar object610on the basis of the sound data included in the game progress information. The content of the speech691is “OUT OF BULLETS!” output by a player4. In other words, the content of the speech691informs each user that the number of magazines is 0 and the number of bullets loaded in the gun is 1, and thus that the means for attacking the enemy object671is likely to be lost.

InFIG.12A, a balloon is used to visually indicate the speech of the avatar object610, but in fact the sound is output from the speaker of the user terminal100. In addition to the output of the sound, the balloon shown inFIG.12A(that is, the balloon including a text of the sound content) may be displayed in the field-of-view image. This also applies to a speech692to be described below.

Upon reception of the tap operation on the UI image702, the user terminal100displays UI images705and706(message UI) in a manner of being superimposed on the field-of-view image as shown inFIG.12B. The UI image705is a UI image on which a comment on the avatar object610(in other words, the player) is displayed. The UI image706is a UI image that receives a comment-sending operation from the user3in order to send the input comment.

As an example, upon reception of the tap operation on the UI image705, the user terminal100controls the touch screen15to display a UI image (not shown, hereinafter simply referred to as “keyboard”) imitating a keyboard. The user terminal100controls the UI image705to display a text corresponding to the user's input operation on the keyboard. In the example ofFIG.12B, the text “I'll SEND YOU A MAGAZINE” is displayed on the UI image705.

As an example, upon reception of the tap operation on the UI image706after the text is input, the user terminal100sends comment information including information indicating the input content (text content) and information indicating the user, to the server200. The comment information is sent to another user terminal100and HMD set1000via the server200.

A UI image703A is a UI image indicating a user name of the user who sends the comment, and a UI image704A is a UI image indicating a content of the comment sent by the user. In the example ofFIG.12B, a user with the user name “BBBBB” uses his/her own user terminal100to send comment information having the content “watch out!”, whereby the UI image703A and the UI image704A are displayed. The UI image703A and the UI image704A are displayed on the touch screen15of all the user terminals100participating in the main game and the monitor51of the HMD500. The UI image703A and the UI image704A may be one UI image. In other words, one UI image may include the user name and the content of the comment.

In an example ofFIG.12C, a user with the user name “AAAAA”, who is the user of the user terminal100shown inFIGS.12A to12D, inputs and sends a comment as described above, whereby UI images703B and704B are displayed on the touch screen15. The UI image703B contains the user name “AAAAA”, and the UI image704B contains the comment “I'll SEND YOU A MAGAZINE!” input in the example ofFIG.12B.

Further, the example ofFIG.12Cshows a field-of-view image611displayed after the user “AAAAA” further inputs a tap operation to the UI image701to display the UI image711on the touch screen15and completes the input of a tap operation to the UI image711A. In other words, item-supply information indicating a magazine is sent from the user terminal100of the user “AAAAA” to another user terminal100and the HMD set1000, and as a result, the user terminal100and the HMD set1000arrange a presentment object674(to be described below) in the virtual space600. As an example, the user terminal100and the HMD set1000execute a presentment related to the presentment object674after the elapsed time indicated in the item-supply information has elapsed, and execute processing of arousing the effect of the item object.

In an example ofFIG.12D, the number of magazines is increased from 0 to 1 by execution of the processing of arousing the effect of the item object. As a result, the player speaks the phrase “thank you!” to the user “AAAAA”, and sound data of the speech is sent to each of the user terminals100. Thereby, each of the user terminals100outputs the sound “thank you!” as a speech692of the avatar object610.

As described above, the communication between the user and the avatar object610is achieved in the main game by both the input of the comment of each user and the output of the speech sound of the avatar object610based on the speech of the player.

(Game Progress Processing in Game Play Terminal300)

FIG.13is a flowchart showing an example of a flow of game progress processing to be executed by the game play terminal300.

In step S31, the processor30functions as the game coordinator315to control the progress of the game in accordance with the game program331and the movement of the player. In step S32, the processor30generates game progress information and transmits the generated information to the user terminals100. Specifically, the processor30sends the generated game progress information to each of the user terminals100via the server200.

In step S33, upon receiving item-supply information (YES in S33), the processor30arranges item objects in the virtual space600A based on the item-supply information in step S34. As an example, the processor30arranges the presentment object674in the virtual space600A before the arrangement of the item objects (seeFIG.11C). The presentment object674may be, for example, an object imitating a present box. As an example, the processor30may execute the presentment related to the presentment object674after the elapsed time indicated in the item-supply information has elapsed. The presentment may be, for example, an animation in which a lid of the present box opens. The processor30executes processing for arousing the effect of the item object after executing the animation. For example, in the example ofFIG.11D, the obstacle object673is arranged.

The processor30may arrange the item object corresponding to the tapped UI image in the virtual space600A after executing the animation. For example, when a tap operation is performed on the UI image711A, the processor30arranges the magazine object indicating the magazine in the virtual space600A after executing the animation. In addition, when a tap operation is performed on the UI image711B, the processor30arranges the first-aid kit object indicating the first-aid kit in the virtual space600A after executing the animation. The processor30may execute the processing of arousing the effect of the magazine object or the first-aid kit object when the avatar object610moves to the position of the magazine object or the first-aid kit object, for example.

The processor30continues and repeats the processes of steps S31to S34until the game is over. When the game is over, for example, when the player inputs a predetermined input operation for the end of the game (YES in step S35), the processing shown inFIG.13ends.
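
As a rough illustration, the loop of steps S31to S35could be sketched as follows in Python; the event layout and names are hypothetical assumptions made for the sketch:

    def game_progress_loop(events, send_progress, arrange_object):
        # S31/S32: progress the game and broadcast game progress information;
        # S33/S34: on item-supply information, arrange the presentment object
        # and then the item object; S35: stop when the game is over.
        for event in events:
            send_progress({"tick": event["tick"]})
            if "item_supply" in event:
                arrange_object("presentment_object")  # shown before the item itself
                arrange_object(event["item_supply"]["item_type"])
            if event.get("game_over"):
                break

    progress, objects = [], []
    game_progress_loop(
        [{"tick": 0, "item_supply": {"item_type": "barricade"}}, {"tick": 1, "game_over": True}],
        progress.append, objects.append)
    print(progress, objects)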

(Game Progress Processing in User Terminal100)

FIG.14is a flowchart showing an example of a flow of game progress processing to be executed by the user terminal100.

In step S41, the processor10receives the game progress information. In step S42, the processor10functions as the game coordinator115to control the progress of the game in accordance with the game progress information.

In step S43, when the processor10receives the item-supply operation from the user3(YES in step S43), the processor10spends virtual currency and arranges the presentment object674in the virtual space600B in step S44. Here, the virtual currency may be purchased (charged for the main game) when the user3inputs a predetermined operation to the user terminal100before or during the participation in the game, or may be given to the user3when predetermined conditions are satisfied. The predetermined conditions may be those that require participation in the main game such as clearing a quest in the main game, or those that do not require participation in the main game such as answering a questionnaire. As an example, the amount of virtual currency (holding amount of virtual currency) is stored in the user terminal100as game information132.

In step S45, the processor10sends the item-supply information to the server200. The item-supply information is sent to the game play terminal300via the server200.

The processor10arranges item objects in the virtual space600B when a predetermined time elapses after the arrangement of the presentment object674. In the example ofFIGS.11A to11D, the obstacle object673is arranged. In other words, when the user3inputs a tap operation to the UI image711C, a predetermined amount of virtual currency is spent and the obstacle object673is arranged.
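
By way of illustration only, steps S43to S45on the user terminal side could be sketched as follows in Python; the wallet structure, cost, and names are hypothetical assumptions:

    def handle_item_supply(wallet, cost, item_type, arrange_object, send_to_server):
        # S44: virtual currency is spent and the presentment object is arranged;
        # S45: item-supply information is sent to the server. In this sketch,
        # nothing happens when the balance is insufficient.
        if wallet["balance"] < cost:
            return False
        wallet["balance"] -= cost
        arrange_object("presentment_object_674")
        send_to_server({"item_type": item_type})
        return True

    wallet, placed, sent = {"balance": 100}, [], []
    handle_item_supply(wallet, 30, "obstacle_673", placed.append, sent.append)
    print(wallet["balance"], placed, sent)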

The processor10continues and repeats the processes of steps S41to S45until the game is over. When the game is over, for example, when the player inputs a predetermined input operation for the end of the game or when the user3performs a predetermined input operation for leaving in the middle of the game (YES in step S46), the processing shown inFIG.14ends.

(Game Progress Processing in Server200)

FIG.15is a flowchart showing an example of a flow of game progress processing to be executed by the server200.

In step S51, the processor20receives the game progress information from the game play terminal300. In step S52, the processor20functions as the log generator212to update a game progress log (hereinafter, a play log). As an example, the play log is generated by the processor20when the initial arrangement information is received from the game play terminal300.

In step S53, the processor20sends the received game progress information to each of the user terminals100.

In step S54, when the item-supply information is received from any user terminal100(YES in step S54), the processor20functions as the log generator212to update the play log in step S55. In step S56, the processor20sends the received item-supply information to the game play terminal300.

The processor20continues and repeats the processes of steps S51to S56until the game is over. When the game is over, for example, when information indicating the game over is received from the game play terminal300(YES in step S57), the processor20functions as the list generator213to generate a list of users (user list234), who participate in the game, from the play log in step S58. The processor20stores the generated user list234in the server200.
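
As a rough illustration, the relay-and-log processing of steps S51to S58could be sketched as follows in Python; the message layout and names are hypothetical assumptions made for the sketch:

    def server_loop(messages, send_to_users, send_to_play_terminal):
        # S51 to S53: relay game progress information and log it;
        # S54 to S56: relay item-supply information and log it;
        # S57/S58: at game over, build the user list from the play log.
        play_log = []
        for msg in messages:
            if msg["kind"] == "game_progress":
                play_log.append(msg)
                send_to_users(msg)
            elif msg["kind"] == "item_supply":
                play_log.append(msg)
                send_to_play_terminal(msg)
            elif msg["kind"] == "game_over":
                break
        users = sorted({m["user"] for m in play_log if m["kind"] == "item_supply"})
        return [{"user": u,
                 "tag": [m["item_type"] for m in play_log
                         if m["kind"] == "item_supply" and m["user"] == u]}
                for u in users]

    user_list = server_loop(
        [{"kind": "item_supply", "user": "AAAAA", "item_type": "magazine"},
         {"kind": "game_over"}],
        lambda m: None, lambda m: None)
    print(user_list)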

FIG.16is a diagram showing a specific example of the user list234. A “user” column stores information (for example, a user name) indicating each user who participates in the game. A “tag” column stores information (a tag) generated based on the support performed on the player by each user. In the example ofFIG.16, among the tags stored in the “tag” column, tags without square brackets are information automatically generated by the processor20, and tags with square brackets are information manually input by the operator of the game.

In the example ofFIG.16, the user “AAAAA” is associated with the information: A MAGAZINE, 10 F, A BOSS, and “WINNING AGAINST THE BOSS BECAUSE OF GIFT OF THE MAGAZINE”. This indicates that the user “AAAAA” supplies a magazine, for example, in a battle against a boss on a stage of a 10th floor, and that the avatar object610wins against the boss with bullets of the supplied magazine.

In addition, the user “BBBBB” is associated with the information: A FIRST-AID KIT, 3 F, ZAKO, and “RESTORATION IMMEDIATELY BEFORE GAME OVER”. This indicates that the user “BBBBB” supplies a first-aid kit, for example, in a battle against a Zako enemy (a low level enemy) on a stage of a 3rd floor, and as a result, that the physical strength of the avatar object610is restored immediately before the physical strength becomes 0 (becomes game over).

In addition, the user “CCCCC” is associated with the information: A BARRICADE, 5 F, ZAKO, and “STOP TWO ZOMBIES FROM COMING HERE USING BARRICADE”. This indicates that the user “CCCCC” supplies a barricade (obstacle object672inFIGS.11A to11D), for example, in a battle against a Zako enemy on a stage of a 5th floor, and as a result, succeeds in making two Zako enemies stuck.

In the example ofFIG.16, one tag of support provided is associated with the user name of each user3; however, for a user3who has performed support several times, a tag for each of the multiple instances of support can be associated with the user name. It is preferable that the respective tags are distinguished from one another in the user list234. Thereby, after the game over, the player who refers to the user list421using the transmission terminal400can accurately grasp the content of each support.
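
By way of illustration only, the records ofFIG.16might be represented as in the minimal Python sketch below. The structure and names are hypothetical, and the second support entry for the user “BBBBB” is an invented example showing how multiple supports by one user could be kept distinguishable; square brackets mark the manually input tag:

    user_list_234 = [
        {"user": "AAAAA",
         "tags": [["A MAGAZINE", "10 F", "A BOSS",
                   "[WINNING AGAINST THE BOSS BECAUSE OF GIFT OF THE MAGAZINE]"]]},
        {"user": "BBBBB",
         "tags": [["A FIRST-AID KIT", "3 F", "ZAKO",
                   "[RESTORATION IMMEDIATELY BEFORE GAME OVER]"],
                  ["A BARRICADE", "7 F", "ZAKO"]]},  # invented second support, for illustration
    ]

    for record in user_list_234:
        # Each support is kept as its own tag list so that the player can grasp
        # the content of every individual support after the game is over.
        print(record["user"], len(record["tags"]), "support(s)")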

(Transmission Processing in Transmission Terminal400)

FIG.17is a flowchart showing an example of a flow of transmission processing by the transmission terminal400.FIGS.18A and18Bare diagrams showing a specific example of a screen displayed on the transmission terminal400.FIG.19is a diagram showing another specific example of the screen displayed on the transmission terminal.

In step S61, the processor40functions as the operation receiver413to receive a first operation for displaying the list (user list234) of users who participate in the game. A download screen721shown inFIG.18Ais a screen for downloading the user list234from the server200and controlling the display unit452to display it. As an example, the download screen721is a screen to be displayed immediately after a start operation of an application for executing the transmission processing shown inFIG.17is input to the transmission terminal400.

The download screen721includes UI images722and723as an example. The UI image722receives an operation for downloading the user list234, that is, the first operation. The first operation may be, for example, an operation for tapping the UI image722. The UI image723receives an operation for terminating the application. Such an operation may be, for example, an operation for tapping the UI image723.

Upon reception of the tap operation on the UI image722, the processor40functions as the communication controller411to acquire (receive) the user list234from the server200in step S62. In step S63, the processor40functions as the display controller412to control the display unit452to display the user list234. Specifically, the processor40controls the display unit452to display a user list screen generated based on the user list234. As an example, the user list screen may be a user list screen731shown inFIG.18B. The user list screen731includes record images corresponding to respective records in the user list234. In the example ofFIG.18B, record images732A to732C are described as the record images, but the number of record images is not limited to three. In the example ofFIG.18B, when the number of records in the user list234is greater than 3 (that is, when the number of users participating in the game is greater than 3), the player can control the display unit452to display another record image by, for example, inputting an operation of scrolling the screen (for example, a drag operation or a flick operation) to the touch screen45.

As an example, the record images732A to732C include user names733A to733C, tag information734A to734C, and icons735A to735C, respectively. Hereinafter, the record images732A to732C, the user names733A to733C, the tag information734A to734C, and the icons735A to735C are a “record image732”, a “user name733”, “tag information734”, and an “icon735”, respectively, when being not necessary to be distinguished from one another.

The user name733is information indicating each of users who participate in the game stored in the “user” column in the user list234. The tag information734is information indicating a tag associated with each of users who participate in the game in the user list234. For example, the record image732A includes “AAAAA” as the user name733A. Therefore, the record image732A includes, as the tag information734A, the information associated with the “AAAAA” in the user list234: A MAGAZINE, 10 F, A BOSS, and “WINNING AGAINST THE BOSS BECAUSE OF GIFT OF THE MAGAZINE”. The icon735is, for example, an image set in advance by the user.

The processor40may store the received user list in the transmission terminal400(in the user list421ofFIG.7). The download screen721may include a UI image (not shown) for displaying the user list421on the display unit452. In this example, when the UI image is tapped, the processor40reads the user list421without downloading the user list234, generates a user list screen from the user list421, and controls the display unit452to display the generated user list screen.

In step S64, the processor40functions as the operation receiver413to receive a second operation for selecting any of the users included in the user list screen731. As an example, the second operation may be an operation of tapping any of the record images732on the user list screen731. In the example ofFIG.18B, the player inputs a tap operation to the record image732A. In other words, the player selects the user “AAAAA” as a user who transmits the behavior instruction data.

In step S65, upon reception of the tap operation on the record image732, the processor40functions as the display controller412to control the display unit452to display the motion list422. Specifically, the processor40controls the display unit452to display a motion list screen generated based on the motion list422. As an example, the motion list screen may be a motion list screen741shown inFIG.19. The motion list screen741includes record images corresponding to respective records in the motion list422. In the example ofFIG.19, record images742A to742C are described as the record images, but the number of record images is not limited to three. In the example ofFIG.19, when the number of records in the motion list422is greater than 3, the player can control the display unit452to display another record image by, for example, inputting an operation of scrolling the screen (for example, a drag operation or a flick operation) to the touch screen45.

As an example, the record images742A to742C include motion names743A to743C, motion images744A to744C, and UI images745A to745C, respectively. Hereinafter, the record images742A to742C, the motion names743A to743C, the motion images744A to744C, and the UI images745A to745C are a “record image742”, a “motion name743”, a “motion image744”, and a “UI image745”, respectively, when being not necessary to be distinguished from one another.

The motion name743is information for identifying the motion stored in the motion list422. The motion image744is an image generated from motion data associated with each motion name in the motion list422. As an example, the processor40includes an image of the avatar object610, which takes a first posture in each motion data, in the record image742as the motion image744. The motion image744may be a UI image that receives a predetermined operation (for example, a tap operation on the motion image744) from the player. Upon reception of the predetermined operation, the processor40may reproduce a motion moving image in which the avatar object610behaves in accordance with the motion data. The processor40may automatically display the motion list screen741again when the motion moving image is completed.

The record image742may include, for example, a UI image including the text “motion reproduction” instead of the motion image744.

In step S66, the processor40functions as the operation receiver413to receive a third operation for selecting a motion. As an example, the third operation may be a tap operation on the UI image745. In other words, the UI image745receives an operation for selecting motion data corresponding to each of the record images742. By receiving the third operation, the processor40functions as the motion specifier415to specify the motion data selected by the player.

In step S67, the processor40functions as the display controller412and the sound receiver414to receive a sound input of the player while reproducing the motion moving image in which the avatar object610behaves in accordance with the selected motion data.

FIG.20is a diagram showing a specific example of a sound input by a player4. As shown inFIG.20, the player4inputs speech sound820A while reproducing a motion moving image810A. The speech sound820A is a speech sound directed to the user3(hereinafter, user3A) with a user name “AAAAA”. In other words, in the example ofFIG.20, the player4selects a user3A (first user) in step S64, and creates behavior instruction data directed to the user3A. It is assumed that the user terminal100used by the user3A is a user terminal100A.

Since the speech sound820A is a speech sound directed to the user3A, the speech sound is based on the content of the support provided for the avatar object610(in other words, the player4) by the user3A. Specifically, the user3A supplies a magazine in a battle against a boss on a stage of a 10th floor, and the avatar object610wins the boss with bullets of the supplied magazine. Therefore, the speech sound820A includes the contents “THANK YOU FOR GIVING ME THE MAGAZINE IN THE BATTLE AGAINST THE BOSS! THE TIMING WAS PERFECT! THANKS TO MR. AAAAA, I WAS ABLE TO CLEAR IT!”. As described above, it is preferable that the speech sound includes the content of the support provided by the user3in the game and gratitude to the user3.

In an aspect, the player4creates a speech content directed to the user3before starting the sound input, that is, before inputting the third operation to the transmission terminal400. In another aspect, the speech content directed to the user3may be automatically generated by the processor40. In addition, the processor40may display the tag associated with the user3selected by the second operation in a manner of being superimposed on the motion moving image810A.

The processor40converts the received sound into sound data. In step S68, the processor40functions as the behavior instruction data generator416to generate behavior instruction data including the sound data and the motion data of the selected motion.

In step S69, the processor40functions as the communication controller411to transmit the generated behavior instruction data to the user terminal100(first computer) of the selected user3(user3A in the example ofFIG.20).FIGS.21A to21Care diagrams showing yet another specific example of the screen displayed on the transmission terminal400. After executing step S68, the processor40functions as the display controller412to control the display unit452to display the transmission screen. As an example, the transmission screen may be a transmission screen751shown inFIG.21A. The transmission screen751includes a UI image752and a motion image753A. Further, as shown inFIG.21A, the transmission screen751may include information indicating a user to whom the behavior instruction data is transmitted.

The UI image752receives an operation for transmitting the behavior instruction data to the selected user3. The operation may be, for example, a tap operation on the UI image752. The motion image753A is a UI image that receives an operation for reproducing the moving image based on the generated behavior instruction data, that is, the moving image based on the behavior instruction data generated for the user3A. The operation may be, for example, a tap operation on the motion image753A. The UI image, which receives the operation for reproducing the generated moving image, is not limited to the motion image753A. For example, the UI image may be a UI image including a text “moving image reproduction”. The processor40may automatically display the transmission screen751again when the moving image is completed.

The transmission screen751preferably further includes a UI image that receives an operation for returning to the reception of the sound input. The operation may be, for example, a tap operation on the UI image. Since the transmission screen751includes the UI image, the player4can perform the sound input again when the sound input fails, for example, when the speech content is mistaken. The UI image may be a UI image that receives an operation for returning to the selection of motion data.

Upon reception of the tap operation on the UI image752, the processor40sends the behavior instruction data together with the information indicating the user3A to the server200. The server200specifies the user terminal100, which is a destination of the behavior instruction data, based on the information indicating the user3A, and sends the behavior instruction data to the specified user terminal100(that is, the user terminal100A).
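
As a rough illustration, the server-side routing described above could be sketched as follows in Python; the names and the dictionary-based destination table are hypothetical assumptions:

    def route_behavior_instruction(data, terminals):
        # The server resolves the destination user name to the corresponding
        # user terminal and forwards the behavior instruction data only there.
        terminals[data["destination"]].append(data)

    terminals = {"AAAAA": [], "BBBBB": []}
    route_behavior_instruction(
        {"destination": "AAAAA", "sound_data": b"...", "motion_data": {"name": "bow"}},
        terminals)
    print(len(terminals["AAAAA"]), len(terminals["BBBBB"]))  # 1 0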

When the sending of the behavior instruction data is completed, the processor40may control the display unit452to display a transmission completion screen761shown inFIG.21Bas an example. The transmission completion screen761includes UI images762and763as an example. Further, the transmission completion screen761may include a text indicating that the sending of the behavior instruction data is completed, as shown inFIG.21B.

The UI image762receives an operation for starting creation of behavior instruction data directed to another user3. The operation may be, for example, an operation of tapping the UI image762. Upon reception of the tap operation, the processor40controls the display unit452to display the user list screen again. In other words, when the tap operation is received, the transmission process returns to step S63. At this time, the processor40may generate a user list screen based on the user list421stored in the transmission terminal400, and control the display unit452to display the generated user list screen. The UI image763receives an operation for completing the application. The operation may be, for example, an operation of tapping the UI image763. When the operation is received, the transmission process ends.

In the example described with reference toFIGS.20and21A to21C, as shown inFIG.21C, the transmission terminal400sends the behavior instruction data of the moving image directed to the user3A (the user3with the user name “AAAAA”) only to the user terminal100A.

FIG.22is a diagram showing another specific example of a sound input by the player4. As shown inFIG.22, the player4inputs a speech sound820B while reproducing a motion moving image810B. The speech sound820B is a speech sound directed to the user3(hereinafter, user3B) with a user name “BBBBB”. In other words, in the example ofFIG.22, the player4inputs a tap operation on a record image732B corresponding to the user3B and creates behavior instruction data directed to the user3B in step S64. It is assumed that the user terminal100used by the user3B is a user terminal100B.

Since the speech sound820B is the speech sound directed to the user3B, the speech sound is based on the content of the support provided for the avatar object610(in other words, the player4) by the user3B. Specifically, the user3B with the user name “BBBBB” supplies a first-aid kit in a battle against a Zako enemy on a stage of a 3rd floor, and as a result, the physical strength of the avatar object610is restored immediately before the physical strength becomes 0 (becomes game over). For this reason, the speech sound820B includes the contents “THANKS TO THE FIRST-AID KIT THAT MR. BBBBB GAVE ME, I HAVE SURVIVED WITHOUT GAME OVER ON THE 3RD FLOOR. THANKS SO MUCH!”.

FIGS.23A to23Care diagrams showing yet another specific example of the screen displayed on the transmission terminal400. The transmission screen751shown inFIG.23Aincludes a UI image752and a motion image753B. The motion image753B is a UI image that, upon reception of a tap operation, reproduces a moving image based on the behavior instruction data generated for the user3B.

Upon reception of the tap operation on the UI image752, the processor40sends the behavior instruction data together with the information indicating the user3B, to the server200. The server200specifies the user terminal100, which is a destination of the behavior instruction data, based on the information indicating the user3B, and sends the behavior instruction data to the specified user terminal100(that is, the user terminal100B).

In the example described with reference toFIGS.22and23A to23C, as shown inFIG.23C, the transmission terminal400sends the behavior instruction data of the moving image directed to the user3B (the user3with the user name “BBBBB”) only to the user terminal100B.

As described above, the content of the sound based on the sound data included in the behavior instruction data is based on the content of the support provided for the player4by the user3in the latest game. Since the content of the support is different for each user3, the content of the sound is different for each user3. In other words, after the game is over, behavior instruction data including sounds having different contents is sent to at least some of the user terminals100of the users3who participate in the game.

Further, the motion of the avatar object610in the example ofFIG.22is different from the motion in the example ofFIG.20. In other words, in the generation of the behavior instruction data directed to the user3B, the player4selects motion data different from that at the time of the generation of the behavior instruction data directed to the user3A. Specifically, in step S66, the player4inputs a tap operation on the UI image745B to select the motion data corresponding to the record image742B. In this way, the player4can make the motion data included in the behavior instruction data different for each user3.

Then, the behavior instruction data for each user3, including the sound data having different contents for each user3and the motion data selected for each user3, is sent only to the user terminal100of each user3. In other words, behavior instruction data unique to each of the user terminals100is sent to each of the user terminals100of the selected users3.

FIG.24is a diagram showing an overview of sending of game progress information from the game play terminal300to the user terminal100. While the behavior instruction data for reproducing the moving image in the user terminal100is unique to each of the user terminals100, as shown inFIG.24, the game progress information sent during the game execution to the user terminals100of all of the users3participating in the game is common among the respective user terminals100. In other words, the behavior instruction data included in the game progress information is also common among the respective user terminals100. As described above, the behavior instruction data for reproducing the moving image differs from the behavior instruction data for progressing the game in whether it is unique to each user terminal100and in its destination.

(Moving Image Reproduction Processing in User Terminal100)

FIG.25is a flowchart showing an example of moving image reproduction processing to be executed by the user terminal100.

In step S71, the processor10functions as the moving image reproducer117to receive the behavior instruction data. In step S72, the processor10functions as the moving image reproducer117to notify the user3of the reception of the behavior instruction data. As an example, the processor10notifies the user3of the reception of the behavior instruction data, using at least one of a display of a notification image on the display unit152, reproduction of a notification sound from a speaker (not shown), and lighting or flickering of a lighting unit (not shown) configured by an LED (light-emitting diode).

In step S73, the processor10functions as the operation receiver111to receive a first reproduction operation for reproducing the moving image. As an example, the first reproduction operation may be an operation of tapping the notification image. In step S74, the processor10functions as the moving image reproducer117to reproduce the moving image by rendering the behavior instruction data. As an example, the processor10may start an application for playing the main game to reproduce the moving image, or may start an application for reproducing the moving image different from the above-described application to reproduce the moving image. Hereinafter, the moving image will be referred to as a “thank-you moving image”.
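
By way of illustration only, steps S71to S74could be sketched as follows in Python; all names are hypothetical, and the reproduction operation of step S73 is assumed to have been granted:

    def reproduce_thank_you_moving_image(data, play_motion, play_sound, notify):
        # S72: notify the user of the reception of the behavior instruction data;
        # S74: render the motion and the sound together as the thank-you moving image.
        notify("behavior instruction data received")
        play_motion(data["motion_data"])
        play_sound(data["sound_data"])

    events = []
    reproduce_thank_you_moving_image(
        {"motion_data": {"name": "bow"}, "sound_data": b"thank you"},
        lambda m: events.append(("motion", m["name"])),
        lambda s: events.append(("sound", s)),
        lambda n: events.append(("notify", n)))
    print(events)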

FIG.26is a diagram showing a specific example of reproduction of a thank-you moving image, and specifically, is a diagram showing an example of reproduction of a thank-you moving image in the user terminal100of the user3A. In a thank-you moving image910A reproduced in the user terminal100, the avatar object610utters a sound920A while executing a certain motion. In other words, the processor10controls the speaker (not shown) to output the sound920A while reproducing the thank-you moving image910A including the avatar object610that executes a certain motion.

The motion in the thank-you moving image910A is based on the motion data selected by the player4in the generation of the behavior instruction data directed to the user3A, and the sound920A is based on the sound data generated from the speech sound820A input by the player4in the generation of the behavior instruction data. In other words, the sound920A is a sound including the content of the support provided by the user3A in the game and gratitude for the support. In this way, the user3A can watch the thank-you moving image in which the avatar object610speaks the content of the support provided by himself/herself in the game and the gratitude for the support by the input of the first reproduction operation.

As an example, the user terminal100may control the touch screen15to display at least one UI image after the reproduction of the thank-you moving image910A is completed. The UI image may be, for example, a UI image that receives an operation for reproducing the thank-you moving image910A again, may be a UI image that receives an operation for transitioning to another screen, or may be a UI image that receives an operation for completing the application.

Further, as an example, the user terminal100may control the touch screen15to display at least one UI image during the reproduction of the thank-you moving image910A. The UI image may be, for example, a plurality of UI images that receive operations of temporarily stopping or completing the thank-you moving image910A being reproduced, or changing a reproducing scene.

These UI images displayed during the reproduction of the thank-you moving image910A and after the reproduction of the thank-you moving image910A is completed do not include a UI image for responding to the avatar object610. In other words, the thank-you moving image910A according to the present embodiment does not include a means for responding to the avatar object610.

FIG.27is a diagram showing another specific example of reproduction of a thank-you moving image, and specifically, is a diagram showing an example of reproduction of a thank-you moving image in the user terminal100of the user3B. In a thank-you moving image910B reproduced in the user terminal100, the avatar object610utters a sound920B while executing a certain motion. In other words, the processor10controls the speaker (not shown) to output the sound920B while reproducing the thank-you moving image910B including the avatar object610that executes a certain motion.

The motion in the thank-you moving image910B is based on the motion data selected by the player4in the generation of the behavior instruction data directed to the user3B, and the sound920B is based on the sound data generated from the speech sound820B input by the player4in the generation of the behavior instruction data. Therefore, the motion performed by the avatar object610in the example ofFIG.27is different from the motion in the example ofFIG.26. Further, the sound920B is a sound including the content of the support provided by the user3B in the game and gratitude for the support. Therefore, the content of the sound920B in the example ofFIG.27is different from the content of sound920A in the example ofFIG.26.

As described above, the thank-you moving image received by at least some of the user terminals100of the users3participating in the game after the game is over is a moving image in which the speech content of the avatar object610is different for each user3.

The processor10may display a UI image930including the content for urging participation in the next game in a manner of being superimposed on the moving image910. The UI image930may be transmitted together with the behavior instruction data, or may be stored in the user terminal100as the game information132.

In this game, the game play terminal 300 transmits game progress information, which constitutes the transmission content, to the plurality of user terminals 100 via the server 200. Each of the plurality of user terminals 100 displays the transmission content on the touch screen 15 on the basis of the game progress information. Specifically, the memory 11 of the user terminal 100 stores in advance a plurality of types of object data for displaying objects including the avatar object 610. When the user terminal 100 receives the game progress information from the game play terminal 300, it extracts the behavior instruction data from the game progress information and specifies the type, position, and posture of each object by analyzing (rendering) the behavior instruction data. Further, the user terminal 100 arranges the objects including the avatar object 610 in the virtual space 600B using the object data stored in the memory 11, on the basis of the analysis result. On the touch screen 15, the field-of-view image 660 of the field-of-view area 640B corresponding to the current position and direction of the virtual camera 620B is displayed. The objects including the avatar object 610 are also arranged in the virtual space 600A defined by the game play terminal 300, and the positions and postures of the objects in the virtual space 600A coincide with those in the virtual space 600B.
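The per-frame flow just described can be summarized in a short sketch. All names below (ObjectState, parse_object_states, object_store, virtual_space) are hypothetical, since the specification gives only the sequence of operations, not an API; the wire format of the behavior instruction data is likewise not disclosed, so its parser is a placeholder.

```python
from dataclasses import dataclass
from typing import Iterable

@dataclass
class ObjectState:
    object_id: int   # keys into the object data preloaded in memory 11
    position: tuple  # (x, y, z) in virtual space 600B
    posture: tuple   # e.g. joint rotations for the avatar object 610

def parse_object_states(behavior_data) -> Iterable[ObjectState]:
    """Analyze (render) behavior instruction data into per-object states.
    The actual data format is not disclosed; this is a placeholder."""
    raise NotImplementedError

def on_game_progress_info(progress_info, object_store, virtual_space):
    """Runs each time game progress information arrives via the server 200."""
    for state in parse_object_states(progress_info.behavior_instruction_data):
        model = object_store[state.object_id]  # object data stored in advance
        virtual_space.place(model, state.position, state.posture)
```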

The transmission content comes in a plurality of types that differ in the character displayed as the avatar object 610. The plurality of types include, for example, first transmission content that is transmitted once or irregularly and second transmission content that is transmitted periodically, every day or at a predetermined time on a predetermined day of the week.

Each of the first transmission content and the second transmission content is further provided in a plurality of types that differ in, for example, the character appearing as the leading role or facilitator, the characters appearing in the content, the progression, and the type of game played in the content. In the second transmission content, the character corresponding to the type (for example, a character "a" for content "a") appears with its character background (for example, a wreath object) displayed every time, but content of the same type (for example, the content "a") differs in substance from one transmission to the next.

A decoration object (a texture image) such as clothes or accessories is associated with the character displayed as the avatar object 610, for each type of character. A specific method of specifying the display mode of the decoration object will be described below.

The transmission content includes first character content in which a first character appears in the virtual space 600B as the avatar object 610 and second character content in which a second character appears in the virtual space 600B as the avatar object 610. When the first character content is transmitted, the behavior instruction data is data (first display data) that enables display of the video of the virtual space 600B in which the objects including the first character are arranged. On the other hand, when the second character content is transmitted, the behavior instruction data is data (second display data) that enables display of the video of the virtual space 600B in which the objects including the second character are arranged.

For both the first character and the second character, decoration objects corresponding to clothes and accessories (for example, a necklace, earrings, and glasses) are associated in advance with parts such as the upper body, lower body, hands, feet, neck, earlobes, and eyes. The types and parts of the decoration objects to be associated are defined differently depending on the type of the character.

In the present embodiment, the first character is associated with decoration objects including a first object corresponding to a shirt and a third object (a blouse) that is worn over the first object (the shirt) and covers a part of it. On the other hand, the second character is associated with, as a decoration object, a second object (for example, a shirt having a different design) of the same type as the first object.

Here, the display mode of the first object associated with the first character is determined on the game play terminal 300 in accordance with a predetermined rule (for example, a random number lottery) for each transmission, for example, at the start of transmission. The game play terminal 300 transmits to the user terminal 100 the game progress information including display mode information that specifies this display mode, and the user terminal 100 specifies and sets the display mode of the first object based on that information. In contrast, the display modes of the decoration objects other than the first object associated with the first character, including the display mode of the third object, are fixed in advance, and the user terminal 100 sets them to their fixed display modes. Likewise, the display modes of the decoration objects associated with the second character, including that of the second object of the same type as the first object, are fixed in advance, and the user terminal 100 sets them to their fixed display modes.

For this reason, among the decoration objects, the display mode of the first object associated with the first character changes for each transmission. Here, a change of the display mode includes a change of the texture image, and the texture image of the first object is changed for each transmission. On the other hand, for the decoration objects other than the first object associated with the first character, and for the decoration objects associated with the second character, the display mode is fixed irrespective of the number of transmissions (transmission periods).

The display mode is the mode in which the decoration object is displayed on the display unit 152. In the present embodiment, the display mode determined for the first object (the shirt) is exemplified by its display color (that is, the color of the shirt). Therefore, among the decoration objects such as the shirt and blouse associated with the first character, the color of the shirt is selected from a plurality of colors (white, black, red, green, blue, and purple).
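As a concrete illustration of this data model, the sketch below encodes the fixed and variable display modes described above. The class and field names are assumptions; the colors are the ones given in the text here and in the description of FIG. 29 below.

```python
from dataclasses import dataclass

SHIRT_COLORS = ("white", "black", "red", "green", "blue", "purple")

@dataclass
class DecorationObject:
    part: str               # body part the object is associated with
    kind: str               # "shirt", "blouse", "trousers", ...
    color: str              # the display mode in this embodiment
    variable: bool = False  # True only for the first object (the shirt)

# The first character's costume: only the shirt's color is redetermined
# per transmission; the blouse (third object), trousers, and shoes are fixed.
FIRST_CHARACTER_COSTUME = [
    DecorationObject("upper body", "shirt", "white", variable=True),
    DecorationObject("upper body", "blouse", "white"),
    DecorationObject("lower body", "trousers", "gray"),
    DecorationObject("feet", "shoes", "white"),
]
```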

In the present embodiment, the second transmission content transmitted periodically includes, for example, the first character content in which the first character appears with the display mode of the first object changed every time. In the present embodiment, the first character is exemplified by a character with the appearance of an office lady. However, the first character is not limited thereto, and may be, for example, a character whose set age is less than a predetermined age (18 years old) (for example, a high-school student character), a character with the appearance of a member of a cheering squad, or a character with the appearance of a female university student.

The content is transmitted from the game play terminal 300 during a prescribed transmission period. When the transmission of the content starts, the processor 30 of the game play terminal 300 executes the display mode control process shown in FIG. 28A. Viewing of the content on the user terminal 100 starts when the user performs a touch operation on a prescribed content icon. When viewing starts in this way, the processor 10 of the user terminal 100 executes the display mode setting process shown in FIG. 28B.

With reference to FIG. 28A, in step S81, the game play terminal 300 determines, based on a transmission schedule stored in the memory 31, whether the current timing coincides with a start timing of content transmission prescribed for each content. That is, the memory 31 stores, as the transmission schedule, information on the time zone for transmitting each content and the type of the content (the appearing character, that is, either the first character content or the second character content). In step S81, whether the timings coincide is determined based on this transmission schedule.

When it is determined that the current timing coincides with the start timing, the process proceeds to step S82. In step S82, it is determined based on the transmission schedule whether the content to be transmitted is the first character content. When it is determined to be the first character content, the process proceeds to step S83.

In step S83, the display mode (that is, the color of the shirt) of the first object among the decoration objects associated with the first character in the current transmission period is determined by a random number lottery (in a random manner) from the plurality of display modes (for example, white, black, red, green, blue, and purple). More specifically, as described above, the first character content is the second transmission content that is transmitted periodically, and in step S83 a display mode different from the one determined at the previous transmission of the first character content is drawn by the random number lottery. Thereby, the display mode (that is, the color of the shirt) of the first object is changed to a display mode different from that at the previous transmission.

In step S84, display mode information (first data) that specifies the display mode determined in step S83 is set as information included in the game progress information. While the first character content is transmitted, the game progress information, which reflects the motion of the performer and includes the display mode information determined in step S83, is transmitted at predetermined time intervals (for example, every 1/60 second).

The process returns when the process of step S84 is completed, when it is not determined in step S81 that the current timing coincides with a start timing of content transmission, or when it is not determined in step S82 that the content to be transmitted is the first character content. When content other than the first character content is transmitted, the game progress information reflecting the motion of the performer is transmitted at predetermined time intervals (for example, every 1/60 second).
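A compact sketch of the control process of FIG. 28A follows. The schedule lookup and data layout are assumptions, but the branch structure (S81, S82) and the lottery that excludes the previous color (S83) mirror the steps just described.

```python
import random

SHIRT_COLORS = ("white", "black", "red", "green", "blue", "purple")

def display_mode_control(schedule, now, previous_color):
    """One pass of FIG. 28A on the game play terminal 300. Returns the
    newly drawn display mode, or None if no first character content starts."""
    entry = schedule.get(now)                                      # S81
    if entry is None or not entry["is_first_character_content"]:   # S82
        return None
    # S83: random number lottery over the modes other than last time's.
    return random.choice([c for c in SHIRT_COLORS if c != previous_color])

def build_game_progress_info(motion_frame, display_mode):
    """S84: bundle the display mode information (first data) with the
    performer's motion data; sent roughly every 1/60 second."""
    info = {"behavior_instruction_data": motion_frame}
    if display_mode is not None:
        info["display_mode_info"] = display_mode
    return info
```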

With reference to FIG. 28B, in step S91, the user terminal 100 determines, based on the input operation on the touch screen 15, whether a viewing start operation for content has been performed. When it is not determined that the viewing start operation has been performed, the process returns; when it is determined that it has, the process proceeds to step S92. In step S92, it is determined by analyzing the game progress information whether the game progress information received from the game play terminal 300 includes the display mode information.

When it is determined that the display mode information is included, the process proceeds to step S93, and the display mode of the first object in the current transmission period is set based on the display mode information. In other words, among the display modes of the decoration objects in the current transmission period, the display mode (that is, the color of the shirt) of the first object is set to the display mode specified by the display mode information (any one of white, black, red, green, blue, and purple). When the process of step S93 is completed, or when it is not determined in step S92 that game progress information including the display mode information has been received, the process proceeds to step S94.

In step S94, a fixed display mode is set for the objects other than the first object among the decoration objects associated with the first character. Further, during viewing of content in which a character other than the first character appears, step S94 sets a fixed display mode for the decoration objects (including a shirt) associated with that character (for example, the second character). In other words, when the first character content is transmitted, the decoration objects (for example, the blouse, trousers, and shoes) other than the first object (the shirt) among those associated with the first character are set to fixed display modes (fixed colors). On the other hand, when content different from the first character content (for example, the second character content) is transmitted, all costume objects associated with the character different from the first character (for example, the second character) are set to fixed display modes.

Through the processes of steps S93 and S94, model data of the avatar object 610 behaving in the virtual space 600B is generated. During viewing of the content, the avatar object 610 based on this model data behaves on the basis of the behavior instruction data included in the game progress information transmitted from the game play terminal 300. When the process of step S94 is completed, the process returns.
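The client-side counterpart (steps S92 to S94 of FIG. 28B) can be sketched in the same style, reusing the hypothetical DecorationObject from the earlier sketch; the fixed colors are the ones given for FIG. 29 below.

```python
FIXED_COLORS = {"blouse": "white", "trousers": "gray", "shoes": "white"}

def display_mode_setting(progress_info, costume):
    """Steps S92-S94 of FIG. 28B on the user terminal 100. `costume` is a
    list of DecorationObject instances as in the earlier sketch."""
    color = progress_info.get("display_mode_info")  # S92: info present?
    for obj in costume:
        if obj.variable and color is not None:
            obj.color = color                       # S93: apply drawn color
        elif not obj.variable:
            obj.color = FIXED_COLORS.get(obj.kind, obj.color)  # S94: fixed
    return costume  # model data from which avatar object 610 is rendered
```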

FIG. 29 shows a display example of objects including the first character during viewing of the first character content. In FIG. 29, a wreath object is arranged behind the first character. The first character behaves on the basis of the behavior instruction data.

The decoration objects associated with the first character include a shirt worn on the upper body, a blouse worn over the shirt, trousers worn on the lower body, and shoes worn on the feet. For the decoration objects other than the shirt, which is the first object, the color serving as the display mode is uniformly defined: for example, the blouse and the shoes are white, and the trousers are gray.

On the other hand, the color of the shirt, being the first object, is set for each content transmission to a color different from that of the previous transmission, as in step S83 of FIG. 28A. As the display mode of the shirt, white is set in the Nth live transmission of the first character content as shown in FIG. 29A, gray is set in the (N+1)th live transmission as shown in FIG. 29B, and blue (the striped pattern in the drawing indicates blue) is set in the (N+2)th live transmission as shown in FIG. 29C. As described above, in the first character content, the same first character appears wearing a shirt of a different color for each content transmission.

According to the present embodiment, the video of the virtual space 600B including the first character, which moves in cooperation with the behavior of the performer, is displayed on the basis of the behavior instruction data (first display data) transmitted from the game play terminal 300. Here, the prescribed first object (the shirt) is associated with the first character, and the display mode of the first object (the color of the shirt) is changed in accordance with the predetermined rule. The display mode of the first object can thus change spontaneously, giving the user a sense of the first character's reality.

Further, according to the present embodiment, the video of the virtual space 600B including the second character, which moves in cooperation with the behavior of the performer, is displayed on the basis of the behavior instruction data (second display data) transmitted from the game play terminal 300. Here, the second object (a shirt) of the same type as the first object is associated with the second character, and the display mode of the second object (the color of the shirt) is fixed. This contrast makes the sense of reality of the first character all the more remarkable.

Further, according to the present embodiment, the display mode of the first object is changed in each transmission period to a display mode different from that of the previous transmission period. This gives the user the impression that the display mode of the first object is intentionally changed for each transmission and creates anticipation about which display mode will appear in the current transmission period, which can improve the viewing rate of the transmission.

Further, according to the present embodiment, the first display data is data transmitted from the game play terminal 300 and includes motion data corresponding to the motion input by the performer. The video in which the first character behaves in the virtual space 600B is displayed by producing the behavior of the first character in the virtual space 600B in real time based on the motion data. This makes it possible to reduce the amount of data to be transmitted.

Further, according to the present embodiment, the display mode of the first object, changed in accordance with the predetermined rule, is specified by the display mode information (first data) transmitted from the game play terminal 300, and the display mode of the first object is set based on that first data. Thereby, the display mode of the first object across the plurality of user terminals 100 can be controlled uniformly from the game play terminal 300.

Further, according to the present embodiment, the first character is associated with the prescribed third object (the blouse). The third object is an object that covers at least a part of the first object. Thereby, how the first object is displayed can be controlled by the third object.

Modifications of the embodiment described above will be listed below.

(1) The embodiment described above has dealt with the normal viewing mode, in which the avatar object 610 behaves in the virtual space 600B similar to the virtual space 600A, and the field-of-view image 660 of the field-of-view area 640B, which depends on the position, direction, and inclination of the virtual camera 620B arranged in the virtual space 600B, is displayed on the touch screen 15 of the user terminal 100, the virtual camera 620B being changed in accordance with swipe operations on the touch screen 15.

However, the space where the avatar object 610 behaves is not limited to a virtual space and may be a real space. An example of a real space is the space specified by an image captured and acquired by the camera (imaging unit) 17. Specifically, when the mode is switched to the AR viewing mode, the user terminal 100 specifies a plane portion such as a floor surface by analyzing the image acquired by the camera 17, and arranges the AR virtual space, in which at least the avatar object 610 is placed, at a predetermined position on that plane portion (for example, its center). In this case, the virtual camera 620B is arranged at a position from which the AR virtual space is viewed in the same direction as the capturing direction of the camera 17 at the time the image was acquired. As a result, the display unit 152 displays an image of the augmented reality space in which the field-of-view image from the virtual camera 620B is superimposed on the image acquired by the camera 17. Further, the user terminal 100 detects the actual position, direction, and inclination of the camera 17, and changes the position, direction, and inclination of the virtual camera 620B accordingly. Thereby, the displayed image changes in accordance with changes in the position, direction, and inclination of the camera 17.
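The AR setup and tracking described in this paragraph reduce to two handlers, sketched below. The functions detect_floor_plane and composite, and the pose types, are placeholders for whatever AR framework the terminal uses; none of these names come from the specification.

```python
def detect_floor_plane(camera_image):
    """Placeholder: plane detection by analysis of the acquired image."""
    raise NotImplementedError

def composite(background, overlay):
    """Placeholder: superimpose the field-of-view image on the camera image."""
    raise NotImplementedError

def enter_ar_viewing_mode(camera_image, camera_pose, virtual_space):
    """Anchor the AR virtual space on a detected plane and align the
    virtual camera 620B with the capturing direction of camera 17."""
    plane = detect_floor_plane(camera_image)
    virtual_space.anchor_at(plane.center)  # e.g. center of the floor plane
    virtual_space.camera.set_pose(camera_pose)

def on_camera_pose_changed(camera_pose, camera_image, virtual_space, display):
    """Keep virtual camera 620B synchronized with camera 17, then
    superimpose the field-of-view image on the acquired image."""
    virtual_space.camera.set_pose(camera_pose)
    overlay = virtual_space.render_field_of_view()
    display.show(composite(camera_image, overlay))  # augmented reality image
```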

In the user terminal 100 in which the AR viewing mode is selected, the avatar object 610 behaves in an augmented reality space in which the objects including the avatar object 610 are arranged on the image acquired by the camera 17. Thereby, the user can be given the impression (realism) that the avatar object 610 is actually behaving in the real space in front of the user.

FIGS. 30A and 30B show an example in which a character imitating a woman wearing a hat (hereinafter referred to as a third character) appears in the virtual space 600B. The third character is associated with decoration objects, for example, a one-piece dress worn over the entire body, a hat worn on the head, and earrings worn on the ears. In FIGS. 30A and 30B, the earrings among the decoration objects associated with the third character correspond to the first object. For this reason, the display mode of the earrings is set to a different display mode for each transmission in accordance with step S83 of FIG. 28A. The second transmission content transmitted periodically includes, for example, character content in which the third character appears with the display mode of the first object changed every time.

First, the normal viewing mode will be described with reference to FIG. 30A. FIG. 30(A-1) shows a display example in which the third character faces the front with the forefinger of her right hand raised in front. FIG. 30(A-2) shows a display example in which the third character faces diagonally forward to the right and points forward with the forefinger of her right hand. In the normal viewing mode, the position, direction, and inclination of the virtual camera 620B are changed in accordance with swipe operations. However, the earrings are covered by the hat, which corresponds to the third object, and no matter how the swipe operation is performed, the virtual camera 620B cannot be moved to a position where the earrings can be seen (for example, a position from which the views of FIGS. 30(B-3) and 30(B-4), described below, could be displayed). In the normal viewing mode, as shown in FIG. 30A, an AR mode transition icon 182 for switching to the AR viewing mode is displayed in a normal (non-highlighted) manner. The mode is switched to the AR viewing mode upon detection of a touch operation on the AR mode transition icon 182 in the normal viewing mode.

The AR viewing mode will be described with reference to FIG. 30B. FIGS. 30(B-1) and 30(B-2) show display examples in the AR viewing mode at the same timings as FIGS. 30(A-1) and 30(A-2), respectively. In the AR viewing mode, out of the images of the augmented reality space in which the third character is arranged on the image captured and acquired by the camera 17, the image from the virtual camera 620B is displayed on the display unit 152. The third character behaves on the basis of the behavior instruction data transmitted from the game play terminal 300. Further, the position, direction, and inclination of the virtual camera 620B are changed in accordance with the actual position, direction, and inclination of the camera 17. In the AR viewing mode, the AR mode transition icon 182 is displayed in a highlighted manner (black-and-white inverted in FIG. 30B) to indicate that the AR viewing mode is active.

For this reason, when the position, direction, and the like of the camera 17 are adjusted so as to look up at the left ear from diagonally below the left side of the third character, the virtual camera 620B moves to a position where the earring can be seen, and an image of the earring is displayed on the display unit 152 (see FIG. 30(B-3)). In other words, the display mode of the earring can be confirmed by selecting the AR viewing mode and adjusting the position, direction, and the like of the camera 17. As a result, the user is motivated to watch in the AR viewing mode, which improves the appeal of the content. When the position, direction, and the like of the camera 17 are adjusted so as to look up at the left half of the third character's body from her feet, an image of the left half of the body including the legs, as shown in FIG. 30(B-4), is displayed on the display unit 152.

As described above, in the AR viewing mode, by adjusting the position, direction, and the like of the camera 17, the virtual camera 620B can be moved to a position where the earring, which is the first object associated with the third character, can be displayed (for example, a position looking up from the neck of the third character or a position looking up from her feet). As a result, the earring can be displayed and the user can confirm its display mode. In the normal viewing mode, by contrast, no matter what operation is performed, the virtual camera 620B cannot be moved to such a position; the earring as the first object cannot be displayed, and the user cannot confirm its display mode.

In the AR viewing mode, the mode can be switched back to the normal viewing mode upon detection of a touch operation on the highlighted AR mode transition icon 182. Also, in the AR viewing mode, while the AR mode transition icon 182 is highlighted, an AR mode resetting icon may be displayed in a normal manner, and the AR mode may be reset (restarted) in response to a touch on the AR mode resetting icon. When the AR mode is reset, the plane portion such as the floor surface is specified again, and the field-of-view image of the AR virtual space 600C is arranged at a predetermined position on the plane portion.

When the highlighted AR mode transition icon 182 is tapped while the first object is displayed in the AR viewing mode, the viewpoint of the virtual camera 620B is returned to the prescribed viewpoint (for example, the viewpoint that captures the virtual space 600B from the front), or is changed to the viewpoint that minimizes the movement of the virtual camera 620B within its movable range in the normal mode. Thereby, the viewpoint shift across the mode switch can be kept as small as possible. Further, while the first object is displayed in the AR viewing mode (for example, FIG. 30(B-3)), the AR mode transition icon 182 may be temporarily hidden.
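One way to realize the "minimum movement" behavior in this paragraph is to pick, from the viewpoints reachable in the normal mode, the one closest to the current AR viewpoint. The discretized movable range and the position-only distance below are simplifying assumptions, not details from the specification.

```python
import math

def viewpoint_after_leaving_ar(ar_position, movable_positions, default_position):
    """Return the normal-mode viewpoint to restore when the highlighted AR
    mode transition icon 182 is tapped: either the prescribed default view
    or the reachable viewpoint nearest the current AR viewpoint."""
    if not movable_positions:
        return default_position
    # Minimize the jump of virtual camera 620B across the mode switch.
    return min(movable_positions, key=lambda p: math.dist(p, ar_position))
```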

In the AR viewing mode, the virtual camera 620B is controlled in synchronization with changes in the position, direction, and inclination of the camera 17. However, the position, direction, and inclination of the virtual camera 620B may instead be controlled in accordance with input operations from the user (for example, swipe operations and touch operations). Thereby, the field-of-view area of the virtual camera 620B can be changed by the user's operations, and as a result, the field-of-view image superimposed on the acquired image can be changed. Specifically, the objects arranged on the acquired image (for example, only the avatar object 610, the avatar object 610 together with other objects, or the entire virtual space) may be rotated or moved by a swipe operation. Thereby, even when the augmented reality space is generated in a narrow space where the position, direction, and the like of the user terminal 100 cannot be sufficiently adjusted, the avatar object 610 can still be rotated and moved by input operations on the touch screen 15.

Further, a TV mode may be provided as a viewing mode in which the position, direction, and inclination of the virtual camera 620B are changed on the basis of the operation of a switcher on the operator side of the system 1, and the mode may be switched to the TV mode in accordance with the user's operation. In the TV mode, however, the virtual camera 620B cannot be moved to a position where the first object can be seen; that is, the first object cannot be displayed on the display unit 152.

(2) In the above-described embodiment, the video in which the avatar object 610 behaves is reproduced by the real-time rendering method. However, the video may instead be reproduced by a moving image transmission method. In this case, for the first character content, the first object whose display mode is changed in accordance with the predetermined rule is associated with the first character on the game play terminal 300, and moving image data including the video in which the first character behaves in the virtual space is transmitted from the game play terminal 300 as the first display data. On the touch screen 15 of the user terminal 100, the video in which the first character behaves in the virtual space is displayed on the basis of the moving image data.

(3) In the above-described embodiment, the display mode of the costume of the avatar object 610 is uniformly defined in accordance with the type of the avatar object 610. However, the display mode of the costume of the avatar object 610 may differ for each user depending on, for example, the degree of progress of the game (for example, each time the user wins the game or each time the user feeds items). For example, even when the game progress information sent from the game play terminal 300 is the same, the costume of the avatar object 610 may be a costume corresponding to the degree of progress of the game. In this case, a plurality of types of costume data for the avatar object 610 may be stored in advance on each user terminal 100. The target of the display mode change of the avatar object 610 is not limited to the costume, and may be a hairstyle, a skin color, or a degree of makeup. Thereby, the variation of the avatar object 610 can be increased without increasing the processing load on the game play terminal 300 that outputs the game progress information, so that the user's attention can be attracted.
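A sketch of this modification: the terminal selects locally stored costume data from the user's own progress, so identical game progress information can render differently per user without extra load on the game play terminal 300. The thresholds and costume names are invented for illustration.

```python
# (win-count threshold, costume data key) pairs, stored in advance on the
# user terminal 100; thresholds and names are hypothetical.
COSTUME_TIERS = [(0, "costume_default"), (5, "costume_silver"), (10, "costume_gold")]

def costume_for_progress(win_count):
    """Pick the costume for the avatar object 610 from local game progress."""
    chosen = COSTUME_TIERS[0][1]
    for threshold, costume_key in COSTUME_TIERS:
        if win_count >= threshold:
            chosen = costume_key
    return chosen
```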

Alternatively or additionally, the display mode of the avatar object 610 may differ from the display mode displayed on another user's user terminal 100 in accordance with the amount charged by the user during the game. In addition, the display mode of the avatar object 610 may differ depending on the user's ranking, which is updated according to the result of the game. Further, the user terminals 100 capable of receiving the first character content may be limited to some of the user terminals 100, for example, the user terminals 100 of users who have charged a large amount during the game.

(4) In the above-described embodiment, the color serving as the display mode of the first object (the shirt) is changed by the process of step S83. However, in step S83, the shape of the first object may be changed instead of, or in combination with, the color. For example, the shirt may be changed from a V-neck to a round neck, or the earring may be changed from a star to a ring. In step S83, instead of or in addition to determining the color, shape, and type, whether the first object itself is displayed (worn) or not displayed (not worn) may be determined.

(5) In the above-described embodiment, one of the decoration objects associated with the first character (for example, a shirt or an earring) is set as the first object, and the display mode of the first object is changed. However, the first object is not limited to a shirt or an earring as long as it is an object associated with the first character, and may be an object such as socks, trousers, a hair clip, gloves, a hat, or underwear. Further, the object is not limited to a decoration object and may be, for example, an object related to the first character itself (for example, a hairstyle or nails). Furthermore, in addition to changing the display mode of the first object associated with the first character, the display mode of a background object of the first character (for example, the wreath object) may be changed.

(6) In the above-described embodiment, among the decoration objects associated with the first character, the display modes of the decoration objects other than the first object are fixed in advance. However, the display modes of the decoration objects other than the first object may be changed not for each transmission but every predetermined number of transmissions (for example, five), every season, or every time zone. Similarly, in the above-described embodiment, the display modes of the decoration objects associated with the second character are fixed in advance, but they may likewise be changed every predetermined number of transmissions (for example, five), every season, or every time zone.

(7) In the above-described embodiment, an example has been described in which the display mode of the first object associated with the character appearing in the second transmission content (the first character or the third character), which is transmitted periodically, is changed for each transmission. However, the display mode of the first object associated with a character appearing in the first transmission content, which is transmitted irregularly, may also be changed to a display mode different from that at the time of the previous transmission.

Matters described in each of the above embodiments will be described below as Supplementary notes.

(Supplementary Note 1):

According to an aspect of an embodiment shown in the present disclosure, there is provided a program to be executed in a terminal device (user terminal 100) which comprises a processor, the program causing the processor to execute: a step (S26) of receiving first display data (behavior instruction data) for enabling display of a video of a virtual space including a first character (avatar object 610), which moves in cooperation with a behavior of a performer; and a step (S29) of displaying a video in which the first character behaves in the virtual space, on the basis of the first display data, wherein the first character is associated with a prescribed first object, and a display mode (color) of the first object is changed in accordance with a predetermined rule.

(Supplementary Note 2):

In Supplementary note 1, the program causes the processor to execute: a step (S26) of receiving second display data for enabling display of a video of a virtual space including a second character (another avatar object 610), which moves in cooperation with the behavior of the performer; and a step (S29) of displaying a video in which the second character behaves in the virtual space, on the basis of the second display data, wherein the second character is associated with a second object (shirt) of the same type as the first object, and a display mode (color) of the second object is fixed.

(Supplementary Note 3):

In Supplementary note 1 or Supplementary note 2, the display mode of the first object is changed to a display mode different from a display mode in a previous transmission period, every transmission period.

(Supplementary Note 4):

In any one of Supplementary notes 1 to 3, the first display data is data transmitted from an external source and including motion data corresponding to a motion input by the performer; and the step of displaying the video includes producing a behavior of the first character in the virtual space in real time on the basis of the motion data to display the video in which the first character behaves in the virtual space.

(Supplementary Note 5):

In Supplementary note 4, the program causes the processor to execute: a step (S92) of receiving first data (display mode information) for specifying the display mode of the first object that is changed in accordance with the predetermined rule; and a step (S93) of setting the display mode of the first object on the basis of the first data.

(Supplementary Note 6):

In Supplementary note 4 or Supplementary note 5, the program causes the processor to execute the step of setting to any one of a plurality of types of viewing modes (normal viewing mode, AR viewing mode) in which at least a part of a mode for specifying a viewpoint is different; the step of displaying the video includes displaying the video of the virtual space as viewed from the specified viewpoint; and a viewpoint is movable to a position where the first object is displayed in a case of a first viewing mode (AR viewing mode) among the plurality of types of viewing modes, and a viewpoint is not movable to the position where the first object is displayed in a case of a mode other than the first viewing mode.

(Supplementary Note 7):

In Supplementary note 6, the terminal device includes a detection unit that detects a position and a direction of the terminal device and a touch screen; the first viewing mode is a viewing mode in which a viewpoint is specifiable on the basis of the position and the direction detected by the detection unit; the plurality of types of viewing modes include a second viewing mode (normal viewing mode) in which a viewpoint is specified on the basis of a touch operation on the touch screen; and a viewpoint is not movable to the position where the first object is displayed in a case of the second viewing mode among the plurality of types of viewing modes.

(Supplementary Note 8):

In any one of Supplementary notes 1 to 3, the first display data is data transmitted from an external source, and moving image data including the video in which the first character associated with the first object whose display mode is changed in accordance with the predetermined rule behaves in the virtual space; and the step of displaying the video includes displaying the video in which the first character behaves in the virtual space on the basis of the moving image data.

(Supplementary Note 9):

In any one of Supplementary notes 1 to 8, the first character is associated with a prescribed third object; and the third object is an object that covers at least a part of the first object.

(Supplementary Note 10):

According to another aspect of an embodiment shown in the present disclosure, there is provided a method to be executed by a terminal device which includes a processor, the method including: a step of receiving first display data for enabling display of a video of a virtual space including a first character, which moves in cooperation with a behavior of a performer; and a step of displaying a video in which the first character behaves in the virtual space, on the basis of the first display data, wherein the first character is associated with a prescribed first object, and a display mode of the first object is changed in accordance with a predetermined rule.

(Supplementary Note 11):

According to still another aspect of an embodiment shown in the present disclosure, there is provided a terminal device which includes a processor, the terminal device being configured to: receive first display data for enabling display of a video of a virtual space including a first character, which moves in cooperation with a behavior of a performer; and display a video in which the first character behaves in the virtual space, on the basis of the first display data, wherein the first character is associated with a prescribed first object, and a display mode of the first object is changed in accordance with a predetermined rule.

[Implementation Example by Software]

The control blocks (particularly, the control units 110, 210, 310, and 410) of the user terminal 100, the server 200, the game play terminal 300 (HMD set 1000), and the transmission terminal 400 may be implemented by a logic circuit (hardware) formed in an integrated circuit (IC chip), or may be implemented by software.

In the latter case, each of the user terminal 100, the server 200, the game play terminal 300 (HMD set 1000), and the transmission terminal 400 includes a computer that executes the instructions of a program, which is software implementing each function. The computer includes, for example, one or more processors and a computer-readable recording medium storing the above-described program. In the computer, the processor reads the program from the recording medium and executes it to achieve the object of the present invention. As the processor, a CPU (Central Processing Unit) can be used, for example. As the recording medium, a "non-transitory tangible medium" such as a ROM (Read Only Memory), a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used. A RAM (Random Access Memory) or the like into which the program is loaded may further be included. The program may be supplied to the computer via an arbitrary transmission medium (such as a communication network or a broadcast wave) capable of carrying the program. Note that an aspect of the present invention may also be implemented in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.

An aspect of the present invention is not limited to each of the above-described embodiments; various modifications are possible within the scope of the present invention defined by the claims, and embodiments made by suitably combining technical means disclosed in different embodiments are also included in the technical scope of an aspect of the present invention.

REFERENCE SIGNS LIST

1: system; 2: network; 3, 3A, 3B: users (first users); 4: player (performer); 10, 20, 30, 40: processors; 11, 21, 31, 41: memories; 12, 22, 32, 42: storages; 13, 23, 33, 43: communication IFs; 14, 24, 34, 44: input/output IFs; 15, 45: touch screens; 17: camera; 18: ranging sensor; 51: monitor; 52: gaze sensor; 53: first camera; 54: second camera; 55: microphone; 56: speaker; 100, 100A, 100B, 100C: user terminals (computer, first computer, first information processing unit); 110, 210, 310, 410: control units (first control unit, second control unit); 111, 311, 413: operation receivers; 112, 312, 412: display controllers; 113, 313: UI controllers; 114, 314: animation generators; 115, 315: game coordinators; 116, 316: virtual space controllers; 117: moving image reproducer; 120, 220, 320, 420: storage units (first storage unit, second storage unit); 131, 231, 331: game programs (program, first program); 132, 232, 332: game information; 133, 233, 333: user information; 151, 451: input units; 152, 452: display units (display); 200: server; 211: communication mediator; 212: log generator; 213: list generator; 234, 421: user lists; 300: game play terminal (external device, second external device); 317: response processor; 400: transmission terminal (external device, first external device, computer, second information processing unit); 411: communication controller; 414: sound receiver; 415: motion specifier; 416: behavior instruction data generator; 422: motion list; 423: transmission program (program, second program); 540, 1020, 1021: controllers; 500: HMD; 510: HMD sensor; 520: motion sensor; 530: display; 600A, 600B: virtual spaces; 610: avatar object (character); 620A, 620B: virtual cameras; 631, 632, 633, 634: objects; 640A, 640B: field-of-view areas; 650, 660: field-of-view images; 671: enemy objects; 672, 673: obstacle objects; 674: presentment object; 691, 692: speeches; 701, 702, 703A, 70B, 704A, 704B, 705, 706, 711, 711A, 711B, 711C, 711D, 722, 723, 745, 745A, 745B, 745C, 752, 762, 763, 930, 2011, 2022, 2031, 2032, 2033, 2034, 2037, 2038, 2051, 2063, 2072, 2073, 2075: UI images (message UI, UI); 721: download screen; 731: user list screen (list); 732, 732A, 732B, 732C, 742, 742A, 742B, 742C: record images; 733, 733A, 733B, 733C: user names; 734, 734A, 734B, 734C: tag information; 735, 735A, 735B, 735C: icons; 741: motion list screen (option); 743, 743A, 743B, 743C: motion names; 744, 744A, 744B, 744C, 753: motion images; 751: transmission screen; 761: transmission completion screen; 810A, 810B: motion moving images; 820A, 820B: speech sounds; 910A, 910B: moving images; 920A, 920B: sounds; 1000: HMD set; 1010: object; 1030: storage medium; 2010: home screen; 2020: ranking screen; 2021: title image; 2026, 2026A, 2026B: rankings; 2027, 2027A, 2027B: amounts of charge; 2028, 2028A, 2028B: numbers of sending processes; 2029: notification of completion of sending; 2030: last sending date; 2035: detailed display area; 2036: scroll bar; 2040: detailed screen; 2050, 2060: preparation screens; 2052: text; 2053, 2061: selected images; 2054A, 2054B, 2054C, 2062A, 2062B, 2062C, 2062D, 2062E, 2062F: options; 2070: sound input screen; and 2074: tag image.

Claims

  1. A non-transitory, tangible computer readable storage medium having a program to be executed in a terminal device which comprises a processor, the program causing the processor to execute steps of: receiving first display data for enabling display of a video of a virtual space including a first character, which moves in cooperation with a behavior of a performer; and displaying a video in which the first character behaves in the virtual space, on the basis of the first display data, wherein: the first character is associated with a prescribed first object; and a display mode of the first object is changed with or without a request from a user of the terminal device to a display mode that is independent from the degree of progress of the video on the terminal device, wherein the user can identify the first object even when the first object is changed to any display mode, wherein the display of the first object is changed so that the user can identify the first object that was hidden by another different object interlocking with the user's operation even when the first object is changed to any display mode.
  2. The non-transitory, tangible computer readable storage medium according to claim 1, wherein the program causes the processor to execute steps of: receiving second display data for enabling display of a video of a virtual space including a second character, which moves in cooperation with the behavior of the performer; and displaying a video in which the second character behaves in the virtual space, on the basis of the second display data, wherein: the second character is associated with a second object of the same type as the first object; and a display mode of the second object is fixed.
  3. The non-transitory, tangible computer readable storage medium according to claim 1, wherein the display mode of the first object is changed to a display mode different from a display mode in a previous transmission period, every transmission period.
  4. The non-transitory, tangible computer readable storage medium according to claim 1, wherein: the first display data is data transmitted from an external source and including motion data corresponding to a motion input by the performer; and the step of displaying the video includes producing a behavior of the first character in the virtual space in real time on the basis of the motion data to display the video in which the first character behaves in the virtual space.
  5. The non-transitory, tangible computer readable storage medium according to claim 4, wherein the program causes the processor to execute steps of: receiving first data for specifying the display mode of the first object that is changed in accordance with the predetermined rule; and setting the display mode of the first object on the basis of the first data.
  6. The non-transitory, tangible computer readable storage medium according to claim 4, wherein: the program causes the processor to execute the step of setting to any one of a plurality of types of viewing modes in which at least a part of a mode for specifying a viewpoint is different; the step of displaying the video includes displaying the video of the virtual space as viewed from the specified viewpoint; and a viewpoint is movable to a position where the first object is displayed in a case of a first viewing mode among the plurality of types of viewing modes, and a viewpoint is not movable to the position where the first object is displayed in a case of a mode other than the first viewing mode.
  7. The non-transitory, tangible computer readable storage medium according to claim 6, wherein: the terminal device includes a detection unit that detects a position and a direction of the terminal device and a touch screen; the first viewing mode is a viewing mode in which a viewpoint is specifiable on the basis of the position and the direction detected by the detection unit; the plurality of types of viewing modes include a second viewing mode in which a viewpoint is specified on the basis of a touch operation on the touch screen; and a viewpoint is not movable to the position where the first object is displayed in a case of the second viewing mode among the plurality of types of viewing modes.
  8. The non-transitory, tangible computer readable storage medium according to claim 1, wherein: the first display data is data transmitted from an external source, and moving image data including the video in which the first character associated with the first object whose display mode is changed in accordance with the predetermined rule behaves in the virtual space; and the step of displaying the video includes displaying the video in which the first character behaves in the virtual space on the basis of the moving image data.
  9. The non-transitory, tangible computer readable storage medium according to claim 1, wherein: the first character is associated with a prescribed third object; and the third object is an object that covers at least a part of the first object.
  10. A method to be executed by a terminal device which comprises a processor, the method comprising steps of: receiving first display data for enabling display of a video of a virtual space including a first character, which moves in cooperation with a behavior of a performer; and displaying a video in which the first character behaves in the virtual space, on the basis of the first display data, wherein: the first character is associated with a prescribed first object; and a display mode of the first object is changed with or without a request from a user of the terminal device to a display mode that is independent from the degree of progress of the video on the terminal device, wherein the user can identify the first object even when the first object is changed to any display mode, wherein the display of the first object is changed so that the user can identify the first object that was hidden by another different object interlocking with the user's operation even when the first object is changed to any display mode.
  11. A terminal device which comprises a processor, the terminal device being configured to: receive first display data for enabling display of a video of a virtual space including a first character, which moves in cooperation with a behavior of a performer; and display a video in which the first character behaves in the virtual space, on the basis of the first display data, wherein: the first character is associated with a prescribed first object; and a display mode of the first object is changed with or without a request from a user of the terminal device to a display mode that is independent from the degree of progress of the video on the terminal device, wherein the user can identify the first object even when the first object is changed to any display mode, wherein the display of the first object is changed so that the user can identify the first object that was hidden by another different object interlocking with the user's operation even when the first object is changed to any display mode.
