U.S. Pat. No. 12,029,983
STORAGE MEDIUM, INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING APPARATUS, AND GAME PROCESSING METHOD
Assignee: Nintendo Co., Ltd.
Issue Date: August 4, 2022
Abstract
An example of an information processing apparatus performs, in a predetermined area (e.g., a room) in a virtual space, editing including at least one of selecting a placement object to be placed in the area, placing the placement object, and moving the placement object, on the basis of an operation input. The information processing apparatus counts an editing time during which the editing is performed. The information processing apparatus stores, in a memory, arrangement data indicating arrangement of the placement object in the predetermined area. The information processing apparatus performs evaluation of the editing, based on at least the editing time such that a lower evaluation is given when the editing time is shorter than when the editing time is longer.
Description
DETAILED DESCRIPTION OF NON-LIMITING EXAMPLE EMBODIMENTS
1. Configuration of Game System
A game system according to an example of an exemplary embodiment is described below. An example of a game system 1 according to the exemplary embodiment includes a main body apparatus (an information processing apparatus; which functions as a game apparatus main body in the exemplary embodiment) 2, a left controller 3, and a right controller 4. Each of the left controller 3 and the right controller 4 is attachable to and detachable from the main body apparatus 2. That is, the game system 1 can be used as a unified apparatus obtained by attaching each of the left controller 3 and the right controller 4 to the main body apparatus 2. Further, in the game system 1, the main body apparatus 2, the left controller 3, and the right controller 4 can also be used as separate bodies (see FIG. 2). Hereinafter, first, the hardware configuration of the game system 1 according to the exemplary embodiment is described, and then, the control of the game system 1 according to the exemplary embodiment is described.
FIG. 1 is a diagram showing an example of the state where the left controller 3 and the right controller 4 are attached to the main body apparatus 2. As shown in FIG. 1, each of the left controller 3 and the right controller 4 is attached to and unified with the main body apparatus 2. The main body apparatus 2 is an apparatus for performing various processes (e.g., game processing) in the game system 1. The main body apparatus 2 includes a display 12. Each of the left controller 3 and the right controller 4 is an apparatus including operation sections with which a user provides inputs.
FIG. 2 is a diagram showing an example of the state where each of the left controller 3 and the right controller 4 is detached from the main body apparatus 2. As shown in FIGS. 1 and 2, the left controller 3 and the right controller 4 are attachable to and detachable from the main body apparatus 2. It should be noted that hereinafter, the left controller 3 and the right controller 4 will occasionally be referred to collectively as a “controller”.
FIG. 3 is six orthogonal views showing an example of the main body apparatus 2. As shown in FIG. 3, the main body apparatus 2 includes an approximately plate-shaped housing 11. In the exemplary embodiment, a main surface (in other words, a surface on a front side, i.e., a surface on which the display 12 is provided) of the housing 11 has a generally rectangular shape.
It should be noted that the shape and the size of the housing 11 are optional. As an example, the housing 11 may be of a portable size. Further, the main body apparatus 2 alone or the unified apparatus obtained by attaching the left controller 3 and the right controller 4 to the main body apparatus 2 may function as a mobile apparatus. The main body apparatus 2 or the unified apparatus may function as a handheld apparatus or a portable apparatus.
As shown in FIG. 3, the main body apparatus 2 includes the display 12, which is provided on the main surface of the housing 11. The display 12 displays an image generated by the main body apparatus 2. In the exemplary embodiment, the display 12 is a liquid crystal display device (LCD). The display 12, however, may be a display device of any type.
Further, the main body apparatus 2 includes a touch panel 13 on a screen of the display 12. In the exemplary embodiment, the touch panel 13 is of a type that allows a multi-touch input (e.g., a capacitive type). The touch panel 13, however, may be of any type. For example, the touch panel 13 may be of a type that allows a single-touch input (e.g., a resistive type).
The main body apparatus 2 includes speakers (i.e., speakers 88 shown in FIG. 6) within the housing 11. As shown in FIG. 3, speaker holes 11a and 11b are formed on the main surface of the housing 11. Then, sounds output from the speakers 88 are output through the speaker holes 11a and 11b.
Further, the main body apparatus 2 includes a left terminal 17, which is a terminal for the main body apparatus 2 to perform wired communication with the left controller 3, and a right terminal 21, which is a terminal for the main body apparatus 2 to perform wired communication with the right controller 4.
As shown in FIG. 3, the main body apparatus 2 includes a slot 23. The slot 23 is provided on an upper side surface of the housing 11. The slot 23 is so shaped as to allow a predetermined type of storage medium to be attached to the slot 23. The predetermined type of storage medium is, for example, a dedicated storage medium (e.g., a dedicated memory card) for the game system 1 and an information processing apparatus of the same type as the game system 1. The predetermined type of storage medium is used to store, for example, data (e.g., saved data of an application or the like) used by the main body apparatus 2 and/or a program (e.g., a program for an application or the like) executed by the main body apparatus 2. Further, the main body apparatus 2 includes a power button 28.
The main body apparatus 2 includes a lower terminal 27. The lower terminal 27 is a terminal for the main body apparatus 2 to communicate with a cradle. In the exemplary embodiment, the lower terminal 27 is a USB connector (more specifically, a female connector). Further, when the unified apparatus or the main body apparatus 2 alone is mounted on the cradle, the game system 1 can display on a stationary monitor an image generated by and output from the main body apparatus 2. Further, in the exemplary embodiment, the cradle has the function of charging the unified apparatus or the main body apparatus 2 alone mounted on the cradle. Further, the cradle has the function of a hub device (specifically, a USB hub).
FIG. 4 is six orthogonal views showing an example of the left controller 3. As shown in FIG. 4, the left controller 3 includes a housing 31. In the exemplary embodiment, the housing 31 has a vertically long shape, i.e., is shaped to be long in an up-down direction (i.e., a y-axis direction shown in FIGS. 1 and 4). In the state where the left controller 3 is detached from the main body apparatus 2, the left controller 3 can also be held in the orientation in which the left controller 3 is vertically long. The housing 31 has such a shape and a size that when held in the orientation in which the housing 31 is vertically long, the housing 31 can be held with one hand, particularly the left hand. Further, the left controller 3 can also be held in the orientation in which the left controller 3 is horizontally long. When held in the orientation in which the left controller 3 is horizontally long, the left controller 3 may be held with both hands.
The left controller 3 includes an analog stick 32. As shown in FIG. 4, the analog stick 32 is provided on a main surface of the housing 31. The analog stick 32 can be used as a direction input section with which a direction can be input. The user tilts the analog stick 32 and thereby can input a direction corresponding to the direction of the tilt (and input a magnitude corresponding to the angle of the tilt). It should be noted that the left controller 3 may include a directional pad, a slide stick that allows a slide input, or the like as the direction input section, instead of the analog stick. Further, in the exemplary embodiment, it is possible to provide an input by pressing the analog stick 32.
The left controller 3 includes various operation buttons. The left controller 3 includes four operation buttons 33 to 36 (specifically, a right direction button 33, a down direction button 34, an up direction button 35, and a left direction button 36) on the main surface of the housing 31. Further, the left controller 3 includes a record button 37 and a “−” (minus) button 47. The left controller 3 includes a first L-button 38 and a ZL-button 39 in an upper left portion of a side surface of the housing 31. Further, the left controller 3 includes a second L-button 43 and a second R-button 44, on the side surface of the housing 31 on which the left controller 3 is attached to the main body apparatus 2. These operation buttons are used to give instructions depending on various programs (e.g., an OS program and an application program) executed by the main body apparatus 2.
Further, the left controller 3 includes a terminal 42 for the left controller 3 to perform wired communication with the main body apparatus 2.
FIG. 5 is six orthogonal views showing an example of the right controller 4. As shown in FIG. 5, the right controller 4 includes a housing 51. In the exemplary embodiment, the housing 51 has a vertically long shape, i.e., is shaped to be long in the up-down direction. In the state where the right controller 4 is detached from the main body apparatus 2, the right controller 4 can also be held in the orientation in which the right controller 4 is vertically long. The housing 51 has such a shape and a size that when held in the orientation in which the housing 51 is vertically long, the housing 51 can be held with one hand, particularly the right hand. Further, the right controller 4 can also be held in the orientation in which the right controller 4 is horizontally long. When held in the orientation in which the right controller 4 is horizontally long, the right controller 4 may be held with both hands.
Similarly to the left controller 3, the right controller 4 includes an analog stick 52 as a direction input section. In the exemplary embodiment, the analog stick 52 has the same configuration as that of the analog stick 32 of the left controller 3. Further, the right controller 4 may include a directional pad, a slide stick that allows a slide input, or the like, instead of the analog stick. Further, similarly to the left controller 3, the right controller 4 includes four operation buttons 53 to 56 (specifically, an A-button 53, a B-button 54, an X-button 55, and a Y-button 56) on a main surface of the housing 51. Further, the right controller 4 includes a “+” (plus) button 57 and a home button 58. Further, the right controller 4 includes a first R-button 60 and a ZR-button 61 in an upper right portion of a side surface of the housing 51. Further, similarly to the left controller 3, the right controller 4 includes a second L-button 65 and a second R-button 66.
Further, the right controller 4 includes a terminal 64 for the right controller 4 to perform wired communication with the main body apparatus 2.
FIG. 6 is a block diagram showing an example of the internal configuration of the main body apparatus 2. The main body apparatus 2 includes components 81 to 85, 87, 88, 91, 97, and 98 shown in FIG. 6 in addition to the components shown in FIG. 3. Some of the components 81 to 85, 87, 88, 91, 97, and 98 may be mounted as electronic components on an electronic circuit board and accommodated in the housing 11.
The main body apparatus 2 includes a processor 81. The processor 81 is an information processing section for executing various types of information processing to be executed by the main body apparatus 2. For example, the processor 81 may be composed only of a CPU (Central Processing Unit), or may be composed of a SoC (System-on-a-chip) having a plurality of functions such as a CPU function and a GPU (Graphics Processing Unit) function. The processor 81 executes an information processing program (e.g., a game program) stored in a storage section (specifically, an internal storage medium such as a flash memory 84, an external storage medium attached to the slot 23, or the like), thereby performing the various types of information processing.
The main body apparatus 2 includes a flash memory 84 and a DRAM (Dynamic Random Access Memory) 85 as examples of internal storage media built into the main body apparatus 2. The flash memory 84 and the DRAM 85 are connected to the processor 81. The flash memory 84 is a memory mainly used to store various data (or programs) to be saved in the main body apparatus 2. The DRAM 85 is a memory used to temporarily store various data used for information processing.
The main body apparatus 2 includes a slot interface (hereinafter abbreviated as “I/F”) 91. The slot I/F 91 is connected to the processor 81. The slot I/F 91 is connected to the slot 23, and in accordance with an instruction from the processor 81, reads and writes data from and to the predetermined type of storage medium (e.g., a dedicated memory card) attached to the slot 23.
The processor 81 appropriately reads and writes data from and to the flash memory 84, the DRAM 85, and each of the above storage media, thereby performing the above information processing.
The main body apparatus 2 includes a network communication section 82. The network communication section 82 is connected to the processor 81. The network communication section 82 communicates (specifically, through wireless communication) with an external apparatus via a network. In the exemplary embodiment, as a first communication form, the network communication section 82 connects to a wireless LAN and communicates with an external apparatus, using a method compliant with the Wi-Fi standard. Further, as a second communication form, the network communication section 82 wirelessly communicates with another main body apparatus 2 of the same type, using a predetermined communication method (e.g., communication based on a unique protocol or infrared light communication). It should be noted that the wireless communication in the above second communication form achieves the function of enabling so-called “local communication”, in which the main body apparatus 2 can wirelessly communicate with another main body apparatus 2 placed in a closed local network area, and the plurality of main body apparatuses 2 directly communicate with each other to transmit and receive data.
The main body apparatus 2 includes a controller communication section 83. The controller communication section 83 is connected to the processor 81. The controller communication section 83 wirelessly communicates with the left controller 3 and/or the right controller 4. The communication method between the main body apparatus 2 and the left controller 3 and the right controller 4 is optional. In the exemplary embodiment, the controller communication section 83 performs communication compliant with the Bluetooth (registered trademark) standard with the left controller 3 and with the right controller 4.
The processor 81 is connected to the left terminal 17, the right terminal 21, and the lower terminal 27. When performing wired communication with the left controller 3, the processor 81 transmits data to the left controller 3 via the left terminal 17 and also receives operation data from the left controller 3 via the left terminal 17. Further, when performing wired communication with the right controller 4, the processor 81 transmits data to the right controller 4 via the right terminal 21 and also receives operation data from the right controller 4 via the right terminal 21. Further, when communicating with the cradle, the processor 81 transmits data to the cradle via the lower terminal 27. As described above, in the exemplary embodiment, the main body apparatus 2 can perform both wired communication and wireless communication with each of the left controller 3 and the right controller 4. Further, when the unified apparatus obtained by attaching the left controller 3 and the right controller 4 to the main body apparatus 2, or the main body apparatus 2 alone, is attached to the cradle, the main body apparatus 2 can output data (e.g., image data or sound data) to the stationary monitor or the like via the cradle.
Here, the main body apparatus 2 can communicate with a plurality of left controllers 3 simultaneously (in other words, in parallel). Further, the main body apparatus 2 can communicate with a plurality of right controllers 4 simultaneously (in other words, in parallel). Thus, a plurality of users can simultaneously provide inputs to the main body apparatus 2, each using a set of the left controller 3 and the right controller 4. As an example, a first user can provide an input to the main body apparatus 2 using a first set of the left controller 3 and the right controller 4, and simultaneously, a second user can provide an input to the main body apparatus 2 using a second set of the left controller 3 and the right controller 4.
Further, the display 12 is connected to the processor 81. The processor 81 displays a generated image (e.g., an image generated by executing the above information processing) and/or an externally acquired image on the display 12.
The main body apparatus 2 includes a codec circuit 87 and speakers (specifically, a left speaker and a right speaker) 88. The codec circuit 87 is connected to the speakers 88 and a sound input/output terminal 25 and also connected to the processor 81. The codec circuit 87 is a circuit for controlling the input and output of sound data to and from the speakers 88 and the sound input/output terminal 25.
The main body apparatus 2 includes a power control section 97 and a battery 98. The power control section 97 is connected to the battery 98 and the processor 81. Further, although not shown in FIG. 6, the power control section 97 is connected to components of the main body apparatus 2 (specifically, components that receive power supplied from the battery 98, the left terminal 17, and the right terminal 21). Based on a command from the processor 81, the power control section 97 controls the supply of power from the battery 98 to the above components.
Further, the battery 98 is connected to the lower terminal 27. When an external charging device (e.g., the cradle) is connected to the lower terminal 27, and power is supplied to the main body apparatus 2 via the lower terminal 27, the battery 98 is charged with the supplied power.
FIG. 7 is a block diagram showing examples of the internal configurations of the main body apparatus 2, the left controller 3, and the right controller 4. It should be noted that the details of the internal configuration of the main body apparatus 2 are shown in FIG. 6 and therefore are omitted in FIG. 7.
The left controller 3 includes a communication control section 101, which communicates with the main body apparatus 2. As shown in FIG. 7, the communication control section 101 is connected to components including the terminal 42. In the exemplary embodiment, the communication control section 101 can communicate with the main body apparatus 2 through both wired communication via the terminal 42 and wireless communication not via the terminal 42. The communication control section 101 controls the method for communication performed by the left controller 3 with the main body apparatus 2. That is, when the left controller 3 is attached to the main body apparatus 2, the communication control section 101 communicates with the main body apparatus 2 via the terminal 42. Further, when the left controller 3 is detached from the main body apparatus 2, the communication control section 101 wirelessly communicates with the main body apparatus 2 (specifically, the controller communication section 83). The wireless communication between the communication control section 101 and the controller communication section 83 is performed in accordance with the Bluetooth (registered trademark) standard, for example.
Further, the left controller 3 includes a memory 102 such as a flash memory. The communication control section 101 includes, for example, a microcomputer (or a microprocessor) and executes firmware stored in the memory 102, thereby performing various processes.
The left controller 3 includes buttons 103 (specifically, the buttons 33 to 39, 43, 44, and 47). Further, the left controller 3 includes the analog stick (“stick” in FIG. 7) 32. Each of the buttons 103 and the analog stick 32 outputs information regarding an operation performed on itself to the communication control section 101 repeatedly at appropriate timing.
The communication control section 101 acquires information regarding an input (specifically, information regarding an operation or the detection result of the sensor) from each of the input sections (specifically, the buttons 103 and the analog stick 32). The communication control section 101 transmits operation data including the acquired information (or information obtained by performing predetermined processing on the acquired information) to the main body apparatus 2. It should be noted that the operation data is transmitted repeatedly, once every predetermined time. It should be noted that the interval at which the information regarding an input is transmitted from each of the input sections to the main body apparatus 2 may or may not be the same.
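The operation-data flow described above can be sketched as a fixed-size report that the controller packs and transmits at a fixed interval. The field layout, names, and wire format below are illustrative assumptions for a minimal sketch, not the actual protocol used by the apparatus:

```python
import dataclasses
import struct


@dataclasses.dataclass
class OperationData:
    """One controller input report (hypothetical layout, not the real format)."""
    buttons: int      # bitmask of pressed buttons
    stick_x: float    # analog stick tilt, -1.0 .. 1.0
    stick_y: float    # analog stick tilt, -1.0 .. 1.0

    def pack(self) -> bytes:
        # Little-endian: 2-byte button mask followed by two 32-bit floats.
        return struct.pack("<Hff", self.buttons, self.stick_x, self.stick_y)

    @classmethod
    def unpack(cls, payload: bytes) -> "OperationData":
        buttons, x, y = struct.unpack("<Hff", payload)
        return cls(buttons, x, y)
```

In this sketch, the main body side would call `unpack` on each report it receives, once every predetermined interval, to recover the button and stick state.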
The above operation data is transmitted to the main body apparatus 2, whereby the main body apparatus 2 can obtain inputs provided to the left controller 3. That is, the main body apparatus 2 can determine operations on the buttons 103 and the analog stick 32 based on the operation data.
The left controller 3 includes a power supply section 108. In the exemplary embodiment, the power supply section 108 includes a battery and a power control circuit. Although not shown in FIG. 7, the power control circuit is connected to the battery and also connected to components of the left controller 3 (specifically, components that receive power supplied from the battery).
As shown in FIG. 7, the right controller 4 includes a communication control section 111, which communicates with the main body apparatus 2. Further, the right controller 4 includes a memory 112, which is connected to the communication control section 111. The communication control section 111 is connected to components including the terminal 64. The communication control section 111 and the memory 112 have functions similar to those of the communication control section 101 and the memory 102, respectively, of the left controller 3. Thus, the communication control section 111 can communicate with the main body apparatus 2 through both wired communication via the terminal 64 and wireless communication not via the terminal 64 (specifically, communication compliant with the Bluetooth (registered trademark) standard). The communication control section 111 controls the method for communication performed by the right controller 4 with the main body apparatus 2.
The right controller 4 includes input sections similar to the input sections of the left controller 3. Specifically, the right controller 4 includes buttons 113 and the analog stick 52. These input sections have functions similar to those of the input sections of the left controller 3 and operate similarly to the input sections of the left controller 3.
The right controller 4 includes a power supply section 118. The power supply section 118 has a function similar to that of the power supply section 108 of the left controller 3 and operates similarly to the power supply section 108.
FIG. 8 is a block diagram showing an example of a configuration in which the game system shown in FIG. 1 is communicably connected to a server. In the exemplary embodiment, the game system 1 is communicably connected to a server 201 via a network 202. That is, the game system 1 and the server 201 are connectable to the network 202, such as the Internet and/or a mobile communication network. The game system 1 and the server 201 are communicable with each other via the network 202. The server 201 is also communicable with other game systems that are of the same type as the game system 1 and are used by users different from the user of the game system 1. Although described later in detail, the server 201 receives, via the network 202, game data of a game executed in the game system 1 and the like, and stores the game data. The server 201 is an information processing apparatus or an information processing system including a control section (specifically, a processor) and a storage section. The server 201 includes a communication section that is connected to the network 202 and has a function of communicating with other apparatuses (e.g., the game system 1) via the network 202.
2. Outline of Processing in Game System
Next, an outline of processing executed in the game system 1 will be described with reference to FIGS. 9 to 14. In the exemplary embodiment, the game system 1 executes a game in which an object is placed in a game space which is a virtual space. In the exemplary embodiment, in the game, a user (also referred to as a player) can perform editing regarding the placed object in a predetermined editing area in the game space. In the exemplary embodiment, the editing area is a room of a character that appears in the game.
[2-1. Editing of Editing Area]
First, a process regarding editing of an editing area will be described. FIG. 9 shows an example of a game image in an editing mode. In the exemplary embodiment, during the game, the user can start the editing mode of performing editing in the editing area. For example, when a player character operated by the user receives a request for room coordination from a non-player character that appears in the game, the editing mode of editing the room of the non-player character is started.
A room 211 shown in FIG. 9 is an example of the editing area in which the user can perform editing. As shown in FIG. 9, in the room 211 as an example of the editing area, one or more placement objects are placed. Each placement object is an object on which the user can perform editing regarding placement thereof in the editing area. Specifically, examples of the placement objects include furniture objects such as a desk and a bed, and objects of items such as a clock and a vase.
As described above, in the exemplary embodiment, the editing area is a room in the virtual space, and the placement objects are a plurality of types of objects including at least furniture. However, the contents of the editing area and the placement objects are discretionary. For example, the specific example of the editing area may not necessarily be a room of a character, and may be a yard or the like of the house of the character. The specific examples of the placement objects may be objects of types other than furniture and items, such as objects of trees planted in the yard.
In the editing mode, the user can make an instruction for editing a placement object in the editing area. That is, in the editing mode, the game system 1 performs editing regarding a placement object in the editing area, based on an operation input performed by the user. The “editing regarding a placement object” includes: designating an object to be placed in the editing area; placing the designated object in the editing area; moving the object placed in the editing area; deleting, from the editing area, the object placed in the editing area; and the like. This allows the user to place a desired placement object in the editing area, and arrange the placement object at a desired position and in a desired direction in the editing area. In the example shown in FIG. 9, the user can operate a cursor 212 and designate, with the cursor 212, a placement object to be operated.
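The editing operations enumerated above can be modeled as a small dispatch over an arrangement mapping. The enum values, the room-as-dictionary representation, and the function name below are illustrative assumptions; designation (cursor selection) is omitted here because it does not by itself change the arrangement:

```python
import enum


class EditAction(enum.Enum):
    PLACE = "place"    # place a designated object in the editing area
    MOVE = "move"      # move an already-placed object
    DELETE = "delete"  # remove an object from the editing area


def apply_edit(room: dict, action: EditAction, object_id: str,
               position: tuple = None) -> dict:
    """Applies one editing operation to a room modeled as {object_id: position}."""
    if action in (EditAction.PLACE, EditAction.MOVE):
        room[object_id] = position
    elif action is EditAction.DELETE:
        room.pop(object_id, None)
    return room
```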
In the exemplary embodiment, in the editing mode, the game system 1 measures a time during which the user performs editing. FIG. 10 shows an example of an editing time counting method. As shown in FIG. 10, the game system 1 counts the time during a period in which the user is performing an input in the editing mode (this period is referred to as the “mid-input period”) and a period until a predetermined time elapses from when the input has ended (this period is referred to as the “post-input period”). The game system 1 regards the counted time as the editing time. When the predetermined time has elapsed from when the input ended, the game system 1 stops counting the editing time. Then, when an input is performed again by the user, the game system 1 resumes counting the editing time. Thus, the counted editing time is a time indicating the length of the mid-input period and the post-input period in the editing mode.
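The counting rule above (count while input continues, keep counting for a predetermined time after the input ends, then pause until the next input) can be sketched as a per-frame timer. The 2-second grace period and the frame-based `tick` interface are assumptions for illustration, not values from the embodiment:

```python
import dataclasses

GRACE_SECONDS = 2.0  # hypothetical "predetermined time" forming the post-input period


@dataclasses.dataclass
class EditingTimer:
    """Accumulates editing time over the mid-input and post-input periods."""
    elapsed: float = 0.0
    _idle_for: float = float("inf")  # time since the last input ended

    def tick(self, dt: float, input_active: bool) -> None:
        # Call once per frame with the frame duration dt (seconds).
        self._idle_for = 0.0 if input_active else self._idle_for + dt
        # Count the frame only during the mid-input or post-input period.
        if self._idle_for <= GRACE_SECONDS:
            self.elapsed += dt
```

Counting pauses by itself once `_idle_for` exceeds the grace period and resumes automatically on the next frame in which `input_active` is true, matching the stop/resume behavior described above.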
During the editing mode, the game system 1 counts the editing time according to the above method. Then, the game system 1 stores the counted time at the end of the editing mode, as the editing time regarding the editing in the editing mode. In the following description, the editing time in one editing mode may be referred to as the “individual editing time” so as to be distinguished from the “cumulative editing time” described later. The game system 1 stores the individual editing time in association with the editing area (i.e., the room) or the non-player character corresponding to the editing area (i.e., the non-player character living in the room).
As described above, in the exemplary embodiment, counting of the editing time is performed based on a determination as to whether or not an input is being performed during a period in which editing is allowed (i.e., the editing mode period). Thus, the time in which the user is performing an input during this period can be counted as the editing time (specifically, the individual editing time), and whether or not the user actually takes time and effort for editing can be accurately determined.
In the exemplary embodiment, the game system 1 counts the editing time with respect to any input to the game system 1, regardless of the content of an instruction made by the input. However, in another embodiment, for example, if an input not used for the game is made (e.g., an input to a specific button on the controller that is not used for the game), the game system 1 may not necessarily count the editing time for such an input.
In another embodiment, when the same input continues for a predetermined time or more (e.g., when one button continues to be pressed), the game system 1 may not necessarily count the editing time for a part or the entirety of a period in which this input is performed. This is because such an input is considered to be performed only for the purpose of counting an editing time while the user does not substantially perform editing. The game system 1 may not necessarily count the editing time for a part of the period in which the above input is performed (e.g., for a period after the same input continues for a predetermined time or more), or may not necessarily count the editing time for the entirety of the period in which the above input is performed.
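The idle-hold exclusion described in this paragraph can be layered on top of the raw input state: the same continuous input counts as editing activity only up to a limit. The 10-second limit, the class name, and the hashable-state representation are assumptions for this sketch:

```python
HOLD_LIMIT_SECONDS = 10.0  # hypothetical cutoff for one continuously held input


class HeldInputFilter:
    """Reports an input as editing activity only until the identical input
    has been held continuously for HOLD_LIMIT_SECONDS."""

    def __init__(self) -> None:
        self._last_state = None
        self._held_for = 0.0

    def is_counted(self, state, dt: float) -> bool:
        # state: a hashable snapshot of the pressed inputs, or None for no input.
        if state is not None and state == self._last_state:
            self._held_for += dt
        else:
            self._held_for = 0.0  # input changed or released: restart the hold clock
        self._last_state = state
        return state is not None and self._held_for < HOLD_LIMIT_SECONDS
```

The boolean returned here could feed the `input_active` flag of an editing-time counter, so that a button held down merely to inflate the editing time stops contributing after the limit.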
When editing in the editing mode has been completed, the user ends the editing mode by making an editing end instruction. That is, upon the editing end instruction by the user, the game system 1 determines that editing in the editing mode has been completed. The condition for determining that editing in the editing mode has been completed is discretionary. In another embodiment, the game system 1 may determine that editing in the editing mode has been completed, according to a predetermined completion condition (e.g., a predetermined number of placement objects having been placed).
When editing in the editing mode has been completed, the game system 1 stores arrangement data indicating information regarding the edited room (specifically, information about arrangement of placement objects, or the like). The arrangement data is stored in association with the room having been edited, or the non-player character corresponding to the room (i.e., the non-player character living in the room). Individual editing time data indicating the individual editing time may be stored in association with the arrangement data (e.g., such that the individual editing time data is included in the arrangement data).
When editing in the editing mode has been completed, the game system 1 evaluates the editing. In the exemplary embodiment, the game system 1 displays a message based on evaluation of the editing (in other words, evaluation of the arrangement data) as a speech of the non-player character living in the room 211.
FIG. 11 shows an example of a game image displayed at the end of the editing mode. As shown in FIG. 11, when the editing mode has ended, the game system 1 arranges a player character 221 and a non-player character 222 in the room 211 having been edited. Then, the game system 1 displays a message 223 indicating evaluation of the editing, as a speech of the non-player character 222.
In the exemplary embodiment, the content of the message 223 varies depending on the length of the individual editing time. Specifically, when the individual editing time is shorter than a predetermined first threshold value (e.g., 5 minutes), the game system 1 displays a message (e.g., “You've finished already?”) indicating that the evaluation is low. When the individual editing time is longer than a predetermined second threshold value (e.g., 30 minutes), the game system 1 displays a message (e.g., “Thank you for your hard work!”) indicating that the evaluation is high. When the individual editing time is moderate (i.e., when the individual editing time is not shorter than the first threshold value and not longer than the second threshold value), the game system 1 displays a message (e.g., “Thank you.”) indicating that the evaluation is moderate.
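The threshold comparison described above can be expressed compactly as follows. The function name is a hypothetical label; the 5-minute and 30-minute values are the example thresholds given in the text.

```python
# Sketch of the message selection based on the individual editing time.
# Threshold values follow the examples in the text (5 and 30 minutes).
FIRST_THRESHOLD_MINUTES = 5
SECOND_THRESHOLD_MINUTES = 30

def evaluation_message(individual_minutes):
    if individual_minutes < FIRST_THRESHOLD_MINUTES:
        return "You've finished already?"       # low evaluation
    if individual_minutes > SECOND_THRESHOLD_MINUTES:
        return "Thank you for your hard work!"  # high evaluation
    return "Thank you."                         # moderate evaluation
```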
As described above, in the exemplary embodiment, the game system 1 displays the message of the non-player character in the virtual space such that the content of the message varies according to the evaluation. The message allows the user to know the evaluation of the editing performed by himself/herself.
In another embodiment, the method for evaluating editing may not necessarily be the above method, and any method may be adopted. For example, the game system 1 may display a message indicating the evaluation result in a form different from the speech of the non-player character. Moreover, for example, the game system 1 may give a currency used in the game to the player character, as a reward for fulfilling the request by the non-player character. At this time, the amount of currency to be given may be varied according to the individual editing time.
In the exemplary embodiment, evaluation is performed by directly using the individual editing time. However, in another embodiment, an evaluation score may be calculated based on the individual editing time, and an evaluation result may be displayed based on the evaluation score. For example, an evaluation score may be calculated such that the longer the individual editing time is, the greater the evaluation score is (specifically, such that the evaluation score is in proportion to the individual editing time). Meanwhile, the game system 1 may calculate an evaluation score with a specific period, of the period in which the individual editing time is counted, being weighted. For example, the game system 1 may calculate, as an evaluation score, a value obtained by multiplying the individual editing time by a predetermined coefficient (specifically, a coefficient larger than 1) with respect to a period in which a specific input is performed, and may use, as an evaluation score, the value of the individual editing time as it is (or may calculate a value obtained by multiplying the value of the individual editing time by a coefficient smaller than 1) with respect to a period in which an input different from the specific input is performed. The specific input is, for example, an operation input directly related to editing of a placement object. The game system 1 may deal with, as the specific input, an operation input for moving a placement object or an input for selecting a placement object to be arranged in the room 211. The input different from the specific input is an operation input not directly related to editing of a placement object. The game system 1 may deal with, as the input different from the specific input, an input for simply moving the cursor 212 (i.e., an input not for moving a placement object along with movement of the cursor 212) or an input for changing the direction of the virtual camera.
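The weighted-score variant described above can be sketched as follows. The coefficient values are illustrative assumptions; the text only requires the coefficient for the specific input to be larger than 1.

```python
# Sketch of an evaluation score with weighted periods (coefficients assumed).
SPECIFIC_INPUT_COEFFICIENT = 1.5  # e.g., moving or selecting a placement object
OTHER_INPUT_COEFFICIENT = 1.0     # e.g., moving the cursor or the virtual camera

def evaluation_score(periods):
    """periods: iterable of (seconds, is_specific_input) pairs."""
    score = 0.0
    for seconds, is_specific in periods:
        coefficient = SPECIFIC_INPUT_COEFFICIENT if is_specific else OTHER_INPUT_COEFFICIENT
        score += seconds * coefficient
    return score
```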
As described above, the game system 1 may calculate an evaluation score based on the individual editing time and on the content of an input (i.e., the content of an editing operation performed by the input), and may perform evaluation based on the evaluation score. Thus, accuracy of the evaluation based on the editing time can be improved.
In the exemplary embodiment, the game system 1 gives the user a reward according to the evaluation. Specifically, after the above editing mode, if the player character 221 talks to the non-player character 222 whose room has been edited, the non-player character 222 gives the player character 221 an item, on the condition that the evaluation of the editing is high (i.e., the individual editing time is longer than the second threshold value). Thus, motivation to get a high evaluation of editing can be given to the user. The type of the item to be given may be the same regardless of the non-player character, may be set for each non-player character, or may be set according to the content of the evaluation (i.e., the individual editing time). The content of the reward according to the evaluation is discretionary. The reward may not necessarily be an item to be used in the game, and may be any reward related to the game, such as a currency used in the game.
The timing at which the reward is given is discretionary. In another embodiment, the reward may be given immediately after the editing mode has ended. For example, when displaying the message 223, the game system 1 may cause an item having the content according to the evaluation to be given from the non-player character 222 to the player character 221.
In the exemplary embodiment, the user can re-edit the room that was edited by the user according to the request for coordination from the non-player character. That is, in the exemplary embodiment, the non-player character may make a request for renovation of the room for which the non-player character made the request for coordination. When the player character has received the request for renovation, an editing mode of re-editing the room is started. In this editing mode, the game system 1 reads out the arrangement data indicating the current arrangement of the placement objects (i.e., the arrangement data stored in the previous editing), and performs editing on the arrangement indicated by the arrangement data, according to an operation input performed by the user.
When the re-editing is performed, the game system 1 calculates a cumulative editing time as an editing time regarding the room to be edited. FIG. 12 shows an example of a method for calculating the cumulative editing time. FIG. 12 shows an example in which the individual editing time of the first editing is 30 minutes and the individual editing time of the second editing (i.e., the above re-editing) is 25 minutes. In the editing mode of the second editing, the game system 1 counts the individual editing time by the same method as in the editing mode of the first editing.
After the second editing has been completed, the game system 1 calculates a cumulative editing time for the room having been subjected to the second editing. The cumulative editing time is the total editing time of all the editing (including the current editing) performed so far on the room having been subjected to the second editing. In the example shown in FIG. 12, the cumulative editing time at the time of completion of the second editing is 55 minutes, which is the total of the individual editing time (30 minutes) of the first editing and the individual editing time (25 minutes) of the second editing. The game system 1 stores the cumulative editing time in association with the room having been edited (or the non-player character living in the room).
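The calculation illustrated by FIG. 12 amounts to summing the individual editing times recorded for the room. A minimal sketch, with hypothetical identifiers:

```python
# Sketch of per-room editing-time storage; identifiers are hypothetical.
class RoomEditRecord:
    def __init__(self, room_id):
        self.room_id = room_id
        self.individual_times = []  # one entry (in minutes) per editing mode

    def finish_editing(self, minutes):
        """Record the individual editing time counted in one editing mode."""
        self.individual_times.append(minutes)

    @property
    def cumulative_time(self):
        """Total of all editing performed so far on this room."""
        return sum(self.individual_times)
```

With the FIG. 12 values (30 and 25 minutes), the cumulative time is 55 minutes.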
In the exemplary embodiment, at the time of completion of the second or subsequent editing, the game system 1 performs evaluation of the editing (specifically, determination of a speech of the non-player character), based on the individual editing time of the editing. Moreover, after completion of the second or subsequent editing, the game system 1 determines a reward to be given to the player character (specifically, an item to be given from the non-player character to the player character), based on the individual editing time. However, in another embodiment, the game system 1 may perform evaluation at the time of completion of the second or subsequent editing, based on the cumulative editing time, and may determine a reward to be given to the player character after completion of the second or subsequent editing, based on the cumulative editing time. Although described later in detail, the game system 1 uses the cumulative editing time to perform “list display” described later.
In the exemplary embodiment, the player character can edit a specific room even if there is no request from a non-player character. The specific room may be a room in which no non-player character lives, or may be a room in which a non-player character lives but does not make a request for editing. For example, the user can practice coordinating a room by editing the specific room. The game system 1 also stores arrangement data regarding the editing on the specific room, and also counts the editing time and stores editing time data. However, as for the editing of the specific room, a message indicating evaluation of the editing may not necessarily be displayed and a reward according to the evaluation may not necessarily be given, at the time of completion of the editing.
In the exemplary embodiment, when editing a room according to a request from the non-player character, the user can perform the editing by applying, to the room, a part or the entirety of arrangement of placement objects in the specific room. Specifically, when there is a predetermined instruction made by the user in the editing mode, the game system 1 reads out the arrangement data regarding the specific room, and changes arrangement of placement objects in the room being currently edited to the arrangement indicated by the arrangement data. Thus, the user can easily use the arrangement of the room, which he/she has created for practice, for editing of another room, thereby improving convenience of the user. In the exemplary embodiment, even after the arrangement of the placement objects in the room being currently edited has been changed as described above, the user can perform further editing, and can further change the arrangement in the specific room.
When another arrangement is applied to the room being currently edited as described above, the game system 1 calculates an individual editing time of the current editing, taking into account the editing time regarding the other arrangement. FIG. 13 shows an example of a method for calculating an editing time when another arrangement is applied. FIG. 13 shows an example in which the editing time of the specific room is 40 minutes, and the actual editing time of the editing to which the arrangement in the specific room is applied is 10 minutes. The phrase “actual editing time” means an editing time counted in the editing mode by the above method.
In the above case, the game system 1 adds, to the actual editing time (10 minutes in the example of FIG. 13) in the current editing mode, the editing time (40 minutes in the example of FIG. 13) regarding the other arrangement applied in the editing mode, thereby calculating the editing time (specifically, the individual editing time) in the editing mode. Therefore, in the example shown in FIG. 13, the editing time in the current editing mode is calculated to be 50 minutes. In the exemplary embodiment, the editing time to be added is the cumulative editing time regarding the other arrangement. In another embodiment, the editing time to be added may be the individual editing time.
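The FIG. 13 calculation is a simple addition, sketched below under the embodiment's choice of adding the cumulative editing time of the applied arrangement (the function name is hypothetical):

```python
# Sketch of the FIG. 13 calculation: the editing time of the applied
# arrangement is added to the actual editing time of the current mode.
def individual_time_with_applied(actual_minutes, applied_cumulative_minutes):
    return actual_minutes + applied_cumulative_minutes
```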
As described above, in the exemplary embodiment, when, in certain editing, arrangement data of another editing is applied, the game system 1 calculates the editing time of the certain editing by using the editing time of the other editing (specifically, by adding the editing time of the other editing to the editing time counted in the certain editing). Thus, the editing time can be calculated taking into account the time and effort that the user has actually taken in the above-described case. Therefore, the game system 1 can obtain a more accurate editing time.
[2-2. Display of List of Edited Rooms]
Next, a process of displaying a list of edited rooms will be described. In the exemplary embodiment, the game system 1 uploads the arrangement data regarding the room edited by the user of the game system 1, to the server 201 according to an instruction of the user. The server 201 stores the arrangement data transmitted from the game system 1, in association with the user. In the exemplary embodiment, the arrangement data transmitted from the game system 1 to the server 201 includes cumulative editing time data indicating the cumulative editing time. The server 201 also receives, from other game systems of the same type as the game system 1, arrangement data regarding rooms edited by users of the other game systems, and stores the arrangement data. That is, the server 201 stores arrangement data of a plurality of users playing the game.
In the exemplary embodiment, the game system 1 acquires, from the server 201, a list of edited rooms (in other words, a list of arrangement data) of other users, according to an instruction of the user, and displays the acquired list. Thus, the user of the game system 1 can see the list of the rooms edited by the other users. Specifically, with the instruction of the user, the game system 1 makes a list acquisition request to the server 201. In response to the list acquisition request, the server 201 transmits, to the game system 1, list data regarding a plurality of rooms among the rooms corresponding to the arrangement data stored in the server 201. The game system 1 displays a list image indicating the list of the rooms, by using the list data from the server 201.
FIG. 14 shows an example of the list image. As shown in FIG. 14, the list image includes a plurality of room images indicating the states of the rooms (e.g., a room image 231 shown in FIG. 14). In the exemplary embodiment, the list image is scrollable in the up-down direction, and scrolling the list image allows more room images to be displayed.
In the exemplary embodiment, the room image 231 includes a thumbnail image 232 indicating arrangement in the room, and a character icon image 233 indicating the non-player character living in the room. The room image 231 further includes the name of the user who has edited the room. Each of the room images included in the list image, similar to the room image 231, includes a thumbnail image, a character icon image, and a user name. Each room image may include any information with which the corresponding room is identifiable. Each room image may include an icon image indicating the player character of the user, and a title given to the room by the user, in addition to the above images and information.
In the exemplary embodiment, when the user performs an operation input designating a room image included in the list image, the user can see the details of the room indicated by the room image. Specifically, when the operation input designating the room image is performed, the game system 1 transmits, to the server 201, a request for acquiring arrangement data regarding the room image, and receives the arrangement data from the server 201. Then, the game system 1 constructs a room in the virtual space, based on the received arrangement data, and generates and displays an image of the constructed room. At this time, the game system 1 may place the player character in the room and cause the player character to move in the room.
The game system 1 may subject the room image designated by the user to various processes in addition to (or instead of) displaying the image of the room. For example, the game system 1 may acquire, from the server 201, data regarding the design of the room of the designated room image so that the user can use the design for his/her editing. Moreover, the game system 1 may register the user of the designated room image so that, for example, the user can see arrangement data uploaded by the registered user and can transmit a message to the registered user.
In the exemplary embodiment, rooms indicated by room images to be included in the list image are selected by the server 201. That is, upon receiving the list acquisition request from the game system 1, the server 201 selects some rooms from among the rooms of the arrangement data stored therein, and transmits, to the game system 1, list data regarding the selected rooms.
In the exemplary embodiment, the server 201 performs the above-described selection by using the cumulative editing time regarding each room. The server 201 performs the selection by using the cumulative editing time indicated by the cumulative editing time data included in the arrangement data stored therein. Specifically, the server 201 calculates a selection value for each room, based on the cumulative editing time. In the exemplary embodiment, the selection value is calculated based on the cumulative editing time and also based on other elements different from the cumulative editing time (e.g., the number of placement objects arranged in the room). If the other elements are fixed, the selection value is calculated such that the shorter the cumulative editing time is, the smaller the selection value is. The server 201 selects rooms to be included in the list data such that rooms whose selection values are smaller than a predetermined reference value are excluded from the rooms of the arrangement data. That is, in the exemplary embodiment, arrangement data whose selection values are smaller than the predetermined reference value are deleted from the set of the arrangement data stored in the server 201, and arrangement data of the rooms to be included in the list are selected from the set of the arrangement data after the deletion.
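The server-side selection can be sketched as below. The linear form of the selection value, its weights, and the reference value are all assumptions chosen for illustration; the text only requires that, other elements being fixed, a shorter cumulative editing time yields a smaller selection value.

```python
# Illustrative server-side selection (weights and reference value assumed).
REFERENCE_VALUE = 50.0

def selection_value(cumulative_minutes, num_placement_objects):
    # With other elements fixed, a shorter cumulative time gives a smaller value.
    return cumulative_minutes + 2.0 * num_placement_objects

def select_rooms(rooms):
    """rooms: iterable of (room_id, cumulative_minutes, num_placement_objects).
    Returns the ids of rooms whose selection value reaches the reference value."""
    return [room_id for room_id, minutes, objects in rooms
            if selection_value(minutes, objects) >= REFERENCE_VALUE]
```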
As described above, in the exemplary embodiment, the game system 1 transmits the arrangement data to the server 201, and receives, from the server 201, list data indicating a list of a plurality of arrangement data transmitted from other game systems to the server 201. Then, on the basis of the list data received from the server 201, the game system 1 performs list display regarding the plurality of arrangement data. Thus, the rooms edited by the other users are introduced to the user.
In the exemplary embodiment, the game system 1 performs list display regarding arrangement data (in other words, rooms indicated by the arrangement data) selected based on the editing time, out of a set of arrangement data from a plurality of other information processing apparatuses. More specifically, the game system 1 performs list display regarding a plurality of arrangement data which are obtained by deleting, from the set of the arrangement data from the plurality of other game systems, arrangement data whose editing-time-based evaluations (specifically, selection values) are lower than the predetermined reference value. Thus, a room whose cumulative editing time is shorter is less likely to be included in the list, and therefore, a room for which the user has taken much time and effort becomes more likely to be included in the list. Thus, the user can be provided with a list including many rooms that are helpful for the user in editing. Moreover, a user who wants many other users to see the room edited by him/her is motivated to take time and effort in editing the room.
The cumulative editing time regarding the arrangement data is used for evaluation for the list display. That is, in the exemplary embodiment, when re-editing is performed for a room, the game system 1 performs evaluation of editing on the room having been re-edited, based on a time obtained by adding the editing time of the re-editing to the editing time before the re-editing is performed (i.e., based on the cumulative editing time). This allows the game system 1 to perform the evaluation while reflecting the time and effort taken by the user for a plurality of times of editing so far.
In another embodiment, as another example of determining arrangement data to be displayed in the list display on the basis of the editing time, the game system 1 may determine, according to the evaluation (i.e., the selection value), the order of arrangement data (specifically, room images) to be displayed in the list display. For example, the room images corresponding to the arrangement data may be arranged such that the room image of the arrangement data having the greater selection value is displayed earlier in the order (i.e., nearer the top of the list image). In another embodiment, as for the arrangement data (specifically, room images) to be displayed in the list display, the game system 1 may vary the display modes of the arrangement data, based on the selection values. For example, when the selection value of a room image is greater than a predetermined reference value, the game system 1 may display the room image with a mark indicating that the room image is “recommended”. In the other embodiments described above, a process of deleting a part of the set of the arrangement data based on the selection values may be executed or may not be executed.
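The two alternatives above (ordering by selection value and marking high-value entries) can be sketched together. The threshold value and identifiers are assumptions for illustration.

```python
# Sketch of the alternative list handling: sort by selection value
# (largest first) and flag entries above an assumed threshold as "recommended".
RECOMMEND_THRESHOLD = 80.0

def ordered_list(entries):
    """entries: iterable of (room_id, selection_value) pairs.
    Returns (room_id, selection_value, is_recommended) tuples in display order."""
    ranked = sorted(entries, key=lambda entry: entry[1], reverse=True)
    return [(room_id, value, value > RECOMMEND_THRESHOLD)
            for room_id, value in ranked]
```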
3. Specific Example of Processing in Information Processing System
Next, a specific example of information processing in the information processing system including the game system 1 and the server 201 will be described with reference to FIGS. 15 to 20.
FIG. 15 shows an example of various types of data used for the information processing in the information processing system. The various types of data shown in FIG. 15 are stored in a storage medium (e.g., the flash memory 84, the DRAM 85, and/or the memory card attached to the slot 23) accessible by the main body apparatus 2.
As shown in FIG. 15, the game system 1 stores therein a game program. The game program is a program for executing the game processing (specifically, the processes shown in FIGS. 16 to 19) of the exemplary embodiment. The game system 1 further stores therein character data, arrangement data, and list data.
The arrangement data is data indicating arrangement of placement objects in an editing area (here, a room) to be edited. In the exemplary embodiment, the arrangement data includes placement object data, individual editing time data, and cumulative editing time data. The placement object data is data indicating the type, position, and direction of each placement object arranged in the room. The individual editing time data is data indicating the above-described individual editing time. If one room is subjected to a plurality of times of editing, a plurality of individual editing time data, one per editing, may be included in the arrangement data. The cumulative editing time data is data indicating the above-described cumulative editing time. The arrangement data is stored for each room in the game space.
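One possible in-memory shape for the arrangement data described above is sketched below; all type and field names are hypothetical, chosen only to mirror the data items listed in the text.

```python
# Hypothetical data layout for the arrangement data (field names assumed).
from dataclasses import dataclass, field

@dataclass
class PlacementObject:
    type_id: int
    position: tuple   # (x, y, z) position in the room
    direction: float  # facing direction of the object

@dataclass
class ArrangementData:
    room_id: int
    objects: list = field(default_factory=list)           # placement object data
    individual_times: list = field(default_factory=list)  # one entry per editing
    cumulative_minutes: float = 0.0                       # cumulative editing time
```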
The character data is data regarding a non-player character associated with a room. In the exemplary embodiment, the character data includes data of a high evaluation flag. The high evaluation flag is a flag indicating whether or not a high evaluation has been made on editing of the room of the non-player character. The character data may further include, for example, data indicating a parameter that indicates ability and/or nature (including individuality) of the character, in addition to the above-described data. The character data is stored for each non-player character that appears in the game space.
The list data is data indicating rooms (in other words, arrangement data) included in a list image displayed in the game system 1. The list data includes, for example, data of identification numbers of the rooms or of the arrangement data included in the list image.
In the exemplary embodiment, a part or the entirety of the respective data shown in FIG. 15 is stored on the terminal side (i.e., in the game system 1), and is also stored in the server 201. The respective data shown in FIG. 15 may be stored in either the game system 1 or the server 201. If the same data is stored in the game system 1 and the server 201, synchronization between the data stored in the game system 1 and the data stored in the server 201 is made at an appropriate timing.
The server 201 stores therein a server-side game program, in addition to the data shown in FIG. 15. The server-side game program is a program for executing game processing executed by the server 201 (i.e., a server process shown in FIG. 20). That is, when a processor of the server 201 executes the server-side game program by using a memory, the server process described later (see FIG. 20) is executed in the server 201.
FIG. 16 is a flowchart showing an example of a flow of a field process executed by the game system 1. The field process shown in FIG. 16 is a process including: controlling an action of a player character or the like arranged in a game field; and displaying a game image indicating the state of the player character or the like. For example, the field process is started in response to the user making an instruction to start the game during execution of the game program.
In the exemplary embodiment, the processor 81 of the main body apparatus 2 executes the game program stored in the game system 1 to execute the processes in the steps shown in FIGS. 16 to 19. The processor of the server 201 executes the server-side game program stored in the server 201 to execute the processes in the steps shown in FIG. 20. However, in another embodiment, a part of the processes in the steps may be executed by a processor (e.g., a dedicated circuit or the like) other than the above processor. In addition, a part of the processes in the steps to be executed by the game system 1 may be executed by the server 201, and some of the processes in the steps to be executed by the server 201 may be executed by the game system 1. The processes in the steps shown in FIGS. 16 to 20 are merely examples, and the processing order of the steps may be changed or another process may be executed in addition to (or instead of) the processes in the steps as long as similar results can be obtained.
The processor of the game system 1 or the server 201 executes the processes in the steps shown in FIGS. 16 to 20 by using a memory (e.g., the DRAM 85). That is, the processor stores information (in other words, data) obtained in each process step, in the memory, and reads out the information from the memory when using the information for the subsequent process steps.
In step S1 shown in FIG. 16, the processor 81 controls the action of each of the characters (i.e., the player character and the non-player character) in the game space. That is, the processor 81 acquires operation data indicating an operation input performed by the user, via the controller communication section 83 and/or the terminals 17 and 21, and controls the action of the player character, based on the operation data. The processor 81 controls the action of the non-player character, based on an algorithm defined in the game program. In the exemplary embodiment, the process in step S1 is repeatedly executed once every predetermined time period (e.g., one-frame time) except for a case where a determination result in step S2 or S7 described later is positive and the field process is ended. Through a single process in step S1, the processor 81 moves the character by an amount according to the predetermined time period. Next to step S1, the process in step S2 is executed.
In step S2, the processor 81 determines whether or not to start editing of a room in the game space. In the exemplary embodiment, when the player character has received a request for room coordination from the non-player character during the game, or when the user has made an instruction to edit the above-described specific room, the processor 81 determines to start editing of the room. When the determination result in step S2 is positive, the process in step S3 is executed. When the determination result in step S2 is negative, the process in step S4 is executed.
In step S3, the processor 81 shifts the process to be executed from the field process to an editing mode process, and ends the field process. The editing mode process is a process for editing a room according to an instruction of the user, in the above-described editing mode. The editing mode process will be described later in detail (see FIG. 17).
Meanwhile, in step S4, the processor 81 determines whether or not the player character has had a conversation with the non-player character, based on the action of the player character controlled in step S1. When the determination result in step S4 is positive, the process in step S5 is executed. When the determination result in step S4 is negative, the processes in steps S5 and S6 are skipped and the process in step S7 is executed.
In step S5, as for the room of the non-player character which has had a conversation with the player character, the processor 81 determines whether or not a high evaluation was given to editing of the room performed in the past. In the exemplary embodiment, as described above, the character data regarding the non-player character includes data of a high evaluation flag indicating whether or not a high evaluation has been given to editing of the room of the non-player character. When the processor 81 has made a high evaluation with respect to editing in the editing mode process (i.e., when the individual editing time has been determined to be longer than the second threshold value), the processor 81 sets the high evaluation flag for the non-player character living in the room, to ON. The determination in step S5 is performed according to whether or not the high evaluation flag is ON. When the determination result in step S5 is positive, the process in step S6 is executed. When the determination result in step S5 is negative, the process in step S6 is skipped and the process in step S7 is executed.
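The high evaluation flag logic of steps S5 and S6 can be sketched as follows, reusing the 30-minute second threshold from the earlier example; the class and method names are hypothetical.

```python
# Sketch of the high evaluation flag stored in the character data.
SECOND_THRESHOLD_MINUTES = 30  # example value from the evaluation description

class CharacterData:
    def __init__(self):
        self.high_evaluation_flag = False  # OFF until a high evaluation is made

    def record_editing(self, individual_minutes):
        """Set the flag to ON when the editing receives a high evaluation."""
        if individual_minutes > SECOND_THRESHOLD_MINUTES:
            self.high_evaluation_flag = True

    def should_give_item(self):
        """Determination corresponding to step S5."""
        return self.high_evaluation_flag
```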
In step S6, the processor81gives an item from the non-player character to the player character. Specifically, the processor81updates data indicating items possessed by the player character such that the item to be given is included in the data. Moreover, the processor81controls the non-player character so as to perform an action of giving the item after the conversation with the player character. Next to step S6, the process in step S7is executed.
In step S7, the processor 81 determines whether or not an instruction to display a menu image has been made by the user. In the exemplary embodiment, if the game image of the game field is displayed in the field process, the user can make the instruction to display the menu image by performing a predetermined operation input. When the determination result in step S7 is positive, the process in step S8 is executed. When the determination result in step S7 is negative, the process in step S9 is executed.
In step S8, the processor 81 shifts the process to be executed, from the field process to a menu process, and ends the field process. The menu process is a process for receiving various types of instructions with the menu image being displayed. The menu process will be described later in detail (see FIG. 18).
In step S9, the processor 81 generates a game image and causes the display 12 to display the game image. The processor 81 generates the game image indicating the game space in which the processing result in step S1 is reflected, and displays the game image on the display 12. If a processing loop of steps S1, S2, and S4 to S7 is repeatedly executed, the process in step S9 is repeatedly executed once every predetermined time period described above. Thus, a moving image indicating a state in which the characters move in the game space is displayed. In the exemplary embodiment, the game system 1 displays the image on the display 12. However, the image may be displayed on another display device (e.g., a monitor connected to the main body apparatus 2) different from the display 12. Next to step S9, the process in step S1 is executed again.
Although not shown in the drawings, when a condition for ending the game has been satisfied (e.g., when the user has performed an instruction to end the game) in the field process, the processor 81 ends the field process.
FIG. 17 is a flowchart showing an example of a flow of the editing mode process executed by the game system 1. The editing mode process is started when it is determined in the field process (step S3) to shift to the editing mode process.
In step S11 shown in FIG. 17, the processor 81 sets the arrangement of a room at the start of editing. That is, when the editing to be performed in the current editing mode is re-editing, the processor 81 reads out, from the memory, the arrangement data regarding the room to be edited, and sets the arrangement of placement objects in the room to be edited, based on the read arrangement data. Meanwhile, when the editing to be performed in the current editing mode is not re-editing, the processor 81 sets the arrangement of placement objects in the room to be edited to initial arrangement determined in advance. Next to step S11, the process in step S12 is executed.
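As a sketch (names and data shapes hypothetical), the branch in step S11 amounts to loading saved arrangement data for re-editing and falling back to a predetermined initial arrangement otherwise:

```python
# Illustrative sketch of step S11; the data shapes are hypothetical.

DEFAULT_ARRANGEMENT = {"objects": []}  # initial arrangement determined in advance

def arrangement_at_start(room_id, saved_arrangements):
    """Return the arrangement to edit: stored data when re-editing,
    otherwise the predetermined initial arrangement."""
    if room_id in saved_arrangements:        # re-editing: read from memory
        return dict(saved_arrangements[room_id])
    return dict(DEFAULT_ARRANGEMENT)         # first edit: initial arrangement

saved = {"npc_room": {"objects": ["bed", "lamp"]}}
print(arrangement_at_start("npc_room", saved))  # {'objects': ['bed', 'lamp']}
print(arrangement_at_start("new_room", saved))  # {'objects': []}
```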
In step S12, the processor 81 determines whether or not an operation input by the user is performed on the game system 1. That is, the processor 81 acquires the above-described operation data via the controller communication section 83 and/or the terminals 17 and 21, and performs the determination in step S12, based on the operation data. When the determination result in step S12 is positive, the process in step S13 is executed. When the determination result in step S12 is negative, the process in step S14 described later is executed.
In step S13, the processor 81 executes a process according to the operation input received in step S12. For example, the processor 81 performs a process for editing, such as moving the cursor, designating a placement object with the cursor, or moving the designated placement object. Moreover, for example, when an instruction to apply the arrangement of the above-described specific room has been made by the user, the processor 81 reads out the arrangement data regarding the specific room, and changes the arrangement of the room being edited, based on the read arrangement data. Next to step S13, the process in step S15 described later is executed.
In step S14, the processor 81 determines whether or not a period in which there is no operation input to the game system 1 by the user has continued for a predetermined time. When the determination result in step S14 is positive, the process in step S18 described later is executed. When the determination result in step S14 is negative, the process in step S15 is executed.
In step S15, the processor 81 counts an editing time. In the exemplary embodiment, a processing loop of steps S12 to S18 is executed once every predetermined time period. The processor 81 updates the value of the current individual editing time such that the predetermined time is added thereto. Next to step S15, the process in step S16 is executed.
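The counting in steps S12, S14, and S15 can be sketched as follows, assuming (hypothetically) a loop that runs once per frame and an idle threshold expressed in frames; the patent does not specify these values.

```python
# Illustrative sketch of steps S12, S14, and S15 (assumed values): the loop
# runs once every fixed period, the individual editing time grows by one
# period per iteration, and counting is skipped once the user has provided
# no input for the idle threshold.

IDLE_LIMIT_FRAMES = 1800  # assumed: e.g., 30 seconds at 60 loops per second

def count_editing_frames(input_frames):
    """input_frames: booleans, True when an operation input occurred (step S12).
    Returns the individual editing time measured in loop iterations."""
    counted = 0
    idle = 0
    for has_input in input_frames:
        if has_input:
            idle = 0      # step S13 handled the input; the idle period resets
        else:
            idle += 1     # step S14: track the no-input period
            if idle >= IDLE_LIMIT_FRAMES:
                continue  # idle too long: skip counting, fall through to step S18
        counted += 1      # step S15: add one loop period to the editing time
    return counted

frames = [True] * 120 + [False] * 3600  # activity, then a long idle stretch
print(count_editing_frames(frames))     # 1919: frames past the idle limit are not counted
```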
In step S16, the processor 81 determines, based on the process executed in step S13, whether or not arrangement data of another room (i.e., the specific room) different from the room being currently edited has been applied. When the determination result in step S16 is positive, the process in step S17 is executed. When the determination result in step S16 is negative, the process in step S17 is skipped and the process in step S18 is executed.
In step S17, the processor 81 adds the editing time regarding the applied arrangement data to the value of the individual editing time. That is, the processor 81 reads out the cumulative editing time data regarding the specific room, and adds the value indicated by the read cumulative editing time data to the individual editing time being counted. Next to step S17, the process in step S18 is executed.
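Steps S16 and S17 amount to crediting the applied room's cumulative editing time to the time being counted; a minimal sketch with hypothetical data shapes:

```python
# Illustrative sketch of steps S16 and S17 (hypothetical names): applying the
# arrangement data of a specific room copies its placement and adds its
# cumulative editing time to the individual editing time being counted.

def apply_specific_room(individual_time, specific_room):
    objects = list(specific_room["objects"])                       # arrangement is applied
    new_time = individual_time + specific_room["cumulative_time"]  # step S17
    return objects, new_time

specific = {"objects": ["sofa", "table"], "cumulative_time": 450}
objs, time_after = apply_specific_room(120, specific)
print(time_after)  # 570
```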
In step S18, the processor 81 determines whether or not to end the editing mode. Specifically, the processor 81 determines whether or not the editing end instruction for ending the editing mode has been made by the user. When the determination result in step S18 is positive, the process in step S19 is executed. When the determination result in step S18 is negative, the process in step S12 is executed again. Thereafter, a series of processes in steps S12 to S18 is repeatedly executed until it is determined to end the editing mode in step S18.
In step S19, as for the arrangement data regarding the room having been edited, the processor 81 stores, in the memory, the placement object data and the individual editing time data. That is, the processor 81 stores, in the memory, data indicating the arrangement state of the placement objects obtained through the editing performed in step S13, as the placement object data. Moreover, the processor 81 stores, in the memory, data indicating the editing time counted during the editing mode, as the individual editing time data. Next to step S19, the process in step S20 is executed.
In step S20, the processor 81 determines whether or not the editing performed in the current editing mode is re-editing. When the determination result in step S20 is negative, the process in step S21 is executed. When the determination result in step S20 is positive, the process in step S22 is executed.
In step S21, the processor 81 stores the individual editing time obtained in step S19 as the cumulative editing time of the edited room. Specifically, the processor 81 stores, in the memory, data of the same content as the individual editing time data included in the arrangement data regarding the edited room, as the cumulative editing time data included in the arrangement data. Next to step S21, the process in step S23 is executed.
In step S22, the processor 81 stores, as the post-editing cumulative editing time, a time obtained by adding the individual editing time obtained in step S19 to the pre-editing cumulative editing time of the room having been edited. Specifically, the processor 81 stores, as new cumulative editing time data, data indicating a time obtained by adding the editing time counted in the current editing mode process to the value indicated by the cumulative editing time data included in the arrangement data regarding the edited room (i.e., to the cumulative editing time so far). Next to step S22, the process in step S23 is executed.
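Under hypothetical names, the bookkeeping in steps S19 to S22 can be sketched as:

```python
# Illustrative sketch of steps S19 to S22: the individual editing time is
# stored with the arrangement data (step S19), and the cumulative editing
# time is initialized on a first edit (step S21) or increased on re-editing
# (step S22). Data shapes are hypothetical.

def save_editing_times(arrangement, individual_time, is_reedit):
    arrangement["individual_time"] = individual_time       # step S19
    if is_reedit:
        arrangement["cumulative_time"] += individual_time  # step S22
    else:
        arrangement["cumulative_time"] = individual_time   # step S21
    return arrangement

room = save_editing_times({}, 300, is_reedit=False)   # first edit
room = save_editing_times(room, 200, is_reedit=True)  # later re-editing
print(room["cumulative_time"])  # 500
```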
In step S23, the processor 81 causes the display 12 to display a message according to the individual editing time counted in steps S15 and S17 (see FIG. 11). Specifically, the processor 81 reads out the arrangement data stored in the memory, and determines a message to be displayed, based on the individual editing time data included in the arrangement data. The processor 81 determines the content of the message according to the method described in the above “[2-1. Editing of editing area]”, and displays the determined message together with the game image indicating the state in which the player character and the non-player character are arranged in the edited room. Thus, evaluation based on the editing time has been performed. When the evaluation result is high (i.e., when the individual editing time is longer than the second threshold value), the processor 81 sets the high evaluation flag for the non-player character to ON. When the evaluation result is not high, the processor 81 sets the high evaluation flag to OFF. After step S23, the processor 81 ends the editing mode process. After the editing mode process has ended, the processor 81 starts to execute the above-described field process.
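The evaluation in step S23 can be sketched as a threshold comparison; the threshold values and message labels below are assumptions for illustration, not values disclosed in the patent.

```python
# Hypothetical sketch of step S23: the message content and the high
# evaluation flag follow from comparing the individual editing time with
# two thresholds (values assumed here).

FIRST_THRESHOLD = 60    # seconds, assumed
SECOND_THRESHOLD = 300  # seconds, assumed

def evaluate(individual_time):
    """Return (message_label, high_eval_flag) for the edited room."""
    if individual_time <= FIRST_THRESHOLD:
        return "low", False     # shorter editing time: lower evaluation
    if individual_time <= SECOND_THRESHOLD:
        return "medium", False
    return "high", True         # longer than the second threshold: flag set to ON

print(evaluate(30))   # ('low', False)
print(evaluate(600))  # ('high', True)
```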
As described above, in the exemplary embodiment, the game system 1, when performing evaluation, reads out the arrangement data stored in the memory. Then, evaluation of editing is performed based on the editing time included in the read arrangement data. The timing at which the evaluation is performed is discretionary. The evaluation may not necessarily be performed immediately after completion of the editing, and may be performed at any timing after the editing (e.g., the timing when the player character talks to the non-player character living in the edited room).
FIG. 18 is a flowchart showing an example of a flow of a menu process executed by the game system 1. The menu process is started when it is determined in the field process (step S8) to shift to the menu process.
In step S31 shown in FIG. 18, the processor 81 causes the display 12 to display the menu image. In the exemplary embodiment, with the menu image being displayed, the processor 81 receives an upload instruction to upload the arrangement data to the server 201, and a list display instruction to perform the above-described list display. Next to step S31, the process in step S32 is executed.
In step S32, the processor 81 determines whether or not the upload instruction has been made by the user. When the determination result in step S32 is positive, the process in step S33 is executed. When the determination result in step S32 is negative, the process in step S33 is skipped and the process in step S34 is executed.
In step S33, the processor 81 transmits the arrangement data stored in the memory (i.e., the arrangement data of the room having been edited by the user during the game) to the server 201 via the network communication section 82. In the exemplary embodiment, the processor 81 associates data of a user ID for identifying the user with the arrangement data, and transmits the data of the user ID together with the arrangement data to the server 201. Here, the processor 81 may display the list of the arrangement data stored in the memory, receive, from the user, an instruction to select the arrangement data to be transmitted to the server 201, and transmit the arrangement data instructed by the user to the server 201. Next to step S33, the process in step S34 is executed.
In step S34, the processor 81 determines whether or not the list display instruction has been made by the user. When the determination result in step S34 is positive, the process in step S35 is executed. When the determination result in step S34 is negative, the process in step S36 is executed.
In step S35, the processor 81 shifts the process to be executed, from the menu process to the list display process, and ends the menu process. The list display process is a process for displaying the list of arrangement data regarding editing performed by other users. The list display process will be described later in detail (see FIG. 19).
In step S36, the processor 81 determines whether or not to end the menu process. Specifically, the processor 81 determines whether or not an instruction to end display of the menu image has been made. When the determination result in step S36 is negative, the process in step S32 is executed again. Thereafter, the processes in steps S32 to S36 are repeatedly executed until it is determined in step S36 to end the menu process. When the determination result in step S36 is positive, the processor 81 ends the menu process. Thereafter, the processor 81 starts to execute the above-described field process.
FIG. 19 is a flowchart showing an example of a flow of the list display process executed by the game system 1. The list display process is started when the list display instruction has been made in the menu process.
In step S41 shown in FIG. 19, the processor 81 transmits a list acquisition request for acquiring list data, to the server 201 via the network communication section 82. Next to step S41, the process in step S42 is executed.
In step S42, the processor 81 receives the list data from the server 201. The server 201, having received the list acquisition request from the game system 1, transmits the list data to the game system 1 (step S56 described later), and the processor 81 receives the list data via the network communication section 82. The received list data is stored in the memory. Next to step S42, the process in step S43 is executed.
In step S43, the processor 81 displays a list image on the display 12 (see FIG. 14). Here, the list data received in step S42 includes data for generating room images to be included in the list image. The processor 81 generates the list image based on the received list data, and displays the list image on the display 12. Next to step S43, the process in step S44 is executed.
In step S44, the processor 81 determines whether or not a room image included in the list image displayed on the display 12 has been designated by the user. When the determination result in step S44 is positive, the process in step S45 is executed. When the determination result in step S44 is negative, the process in step S46 is executed.
In step S45, the processor 81 shifts the process to be executed, from the list display process to a room browsing process, and ends the list display process. The room browsing process is a process for browsing the room regarding the designated room image. Although not shown in the drawings, in the room browsing process, the processor 81 transmits, to the server 201, a request for acquiring the arrangement data regarding the designated room image, and receives the arrangement data from the server 201. Then, the processor 81 constructs a room in the virtual space, based on the received arrangement data, and generates and displays an image of the constructed room.
In step S46, the processor 81 determines whether or not to end the list display. Specifically, the processor 81 determines whether or not an instruction to end the list display has been made by the user. When the determination result in step S46 is negative, the process in step S43 is executed again. Thereafter, the processes in steps S43 to S46 are repeatedly executed until it is determined in step S46 to end the list display. When the determination result in step S46 is positive, the processor 81 ends the list display process. Thereafter, the processor 81 starts to execute the menu process shown in FIG. 18.
FIG. 20 shows an example of a flow of a server process executed by the server 201. In the server 201, a series of processes shown in FIG. 20 is repeatedly executed so as to respond to requests from a plurality of game systems.
In step S51 shown in FIG. 20, the processor of the server 201 determines whether or not arrangement data has been received from a game system via the communication section. When the determination result in step S51 is positive, the process in step S52 is executed. When the determination result in step S51 is negative, the process in step S52 is skipped and the process in step S53 is executed.
In step S52, the processor stores the received arrangement data in the storage section. Since data of a user ID is associated with the arrangement data transmitted from the game system, the processor stores the arrangement data and the data of the user ID in association with each other. Next to step S52, the process in step S53 is executed.
In step S53, the processor determines whether or not the list acquisition request has been received from the game system via the communication section. When the determination result in step S53 is positive, the process in step S54 is executed. When the determination result in step S53 is negative, the processes in steps S54 to S56 are skipped and the process in step S51 is executed again.
In step S54, the processor calculates the above-described selection value for each arrangement data stored in the storage section (i.e., for the room corresponding to the arrangement data). As described in the above “[2-2. Display of list of edited rooms]”, the selection value is calculated based on the cumulative editing time included in the arrangement data. Next to step S54, the process in step S55 is executed.
In step S55, the processor selects arrangement data to be included in the list, based on the selection values of the respective arrangement data. As described in the above “[2-2. Display of list of edited rooms]”, from the set of the arrangement data stored in the storage section, arrangement data the selection value of which is smaller than the predetermined reference value is deleted, and arrangement data of rooms to be included in the list are selected from the set of the arrangement data after the deletion. Next to step S55, the process in step S56 is executed.
In step S56, the processor transmits list data to the game system. That is, the processor generates list data regarding the respective arrangement data selected in step S55, and transmits the generated list data to the game system via the communication section. The list data may include information to be used for generating a list image in the game system. In the exemplary embodiment, the list data includes data of identification numbers of the arrangement data, data indicating non-player characters associated with the arrangement data, data of thumbnail images of the arrangement data, and data of user IDs associated with the arrangement data. The data of the thumbnail images may be automatically generated by the server 201 based on the arrangement data, or may be generated based on an operation input performed by the user when arrangement data is generated on the game system 1 side. Next to step S56, the process in step S51 is executed again.
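On the server side, steps S54 to S56 reduce to scoring each stored arrangement and filtering by a reference value; a simplified sketch where the selection value is taken, hypothetically, to equal the cumulative editing time:

```python
# Illustrative sketch of steps S54 to S56 (names and the scoring rule are
# hypothetical; the patent only states that the selection value is based on
# the cumulative editing time).

REFERENCE_VALUE = 100  # assumed predetermined reference value

def build_list_data(stored_arrangements):
    """Return list data for arrangements whose selection value meets the reference."""
    list_data = []
    for a in stored_arrangements:
        selection_value = a["cumulative_time"]  # step S54 (simplified scoring)
        if selection_value >= REFERENCE_VALUE:  # step S55: low-value data is deleted
            list_data.append({"id": a["id"], "selection_value": selection_value})
    return list_data                            # step S56: data to transmit

stored = [
    {"id": 1, "cumulative_time": 500},
    {"id": 2, "cumulative_time": 40},   # below the reference value: excluded
    {"id": 3, "cumulative_time": 150},
]
print([entry["id"] for entry in build_list_data(stored)])  # [1, 3]
```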
4. Function and Effect of the Present Embodiment, and Modifications
As described above, in the above exemplary embodiment, the game program is configured to cause a processor of an information processing apparatus (e.g., the main body apparatus 2) to execute the following processes.
- Performing, in a predetermined area (e.g., a room) in a virtual space, editing including at least one of selecting a placement object to be placed in the area, placing the placement object, and moving the placement object, on the basis of an operation input (step S13).
- Counting an editing time during which the editing is performed (steps S15 and S17).
- Storing, in a memory, arrangement data indicating arrangement of the placement object in the predetermined area (step S19).
- Performing evaluation of the editing on the basis of at least the editing time such that, when the editing time is short, a lower evaluation is given as compared to the case where the editing time is long (step S23).
According to the above configuration, since the editing time is reflected in evaluation, it is possible to perform evaluation in which time and effort that the user has taken for editing are reflected. Thus, evaluation regarding editing can be appropriately performed.
The above “editing time” may be either the individual editing time or the cumulative editing time in the exemplary embodiment.
In the exemplary embodiment, displaying a message according to editing time or calculating a selection value corresponds to the above-described “evaluation”. However, a specific process for evaluation is discretionary and may not necessarily be the above processes. For example, the above-described process of calculating an evaluation score is an example of the “evaluation”. Furthermore, for example, a process of imparting a mark indicating “recommended” to a room image in a list image according to the editing time, is also an example of the “evaluation”.
In another embodiment, the information processing system may not include some of the components in the above embodiment, and may not execute some of the processes executed in the above embodiment. For example, in order to achieve a specific effect of a part of the above embodiment, the information processing system only needs to include a configuration for achieving the effect and execute a process for achieving the effect, and need not include other configurations and need not execute other processes.
The exemplary embodiment can be used as, for example, a game system and a game program, in order to, for example, appropriately perform evaluation of editing regarding arrangement of objects.
While certain example systems, methods, devices and apparatuses have been described herein, it is to be understood that the appended claims are not to be limited to the systems, methods, devices and apparatuses disclosed, but on the contrary, are intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.
Claims
1. A non-transitory computer-readable storage medium having stored therein a game program that, when executed by a processor of an information processing apparatus, causes the processor to execute: performing, in a predetermined area in a virtual space, editing including at least one of selecting a placement object to be placed in the area, placing the placement object, and moving the placement object, on the basis of an operation input; counting an editing time during which the editing is performed; storing, in a memory, arrangement data indicating arrangement of the placement object in the predetermined area; and performing evaluation of the editing, based on at least the editing time such that a lower evaluation is given when the editing time is shorter than when the editing time is longer.
2. The non-transitory computer-readable storage medium according to claim 1, wherein the game program causes the processor to execute counting the editing time, based on a determination as to whether or not an input is performed in a period in which the editing is allowed.
3. The non-transitory computer-readable storage medium according to claim 1, wherein the game program further causes the processor to execute giving a reward according to the evaluation to a user who has performed the operation input.
4. The non-transitory computer-readable storage medium according to claim 1, wherein the game program further causes the processor to execute displaying a message from a non-player character in the virtual space such that the message has different contents according to the evaluation.
5. The non-transitory computer-readable storage medium according to claim 1, wherein the information processing apparatus is communicable with a server, and the game program further causes the processor to execute: transmitting the arrangement data to the server; receiving, from the server, list data indicating a list regarding a plurality of arrangement data transmitted from other information processing apparatuses different from the information processing apparatus to the server; and performing list display regarding the plurality of arrangement data, based on the list data received from the server.
6. The non-transitory computer-readable storage medium according to claim 5, wherein the list display relates to arrangement data that is selected based on the editing time from a set of arrangement data transmitted from a plurality of other information processing apparatuses.
7. The non-transitory computer-readable storage medium according to claim 5, wherein the list display relates to the plurality of arrangement data obtained by deleting arrangement data whose evaluation based on editing time is lower than a predetermined reference, from the set of the arrangement data transmitted from a plurality of other information processing apparatuses.
8. The non-transitory computer-readable storage medium according to claim 1, wherein the predetermined area is a room in the virtual space, and the placement object is any of a plurality of types of objects including at least furniture.
9. The non-transitory computer-readable storage medium according to claim 1, wherein the game program further causes the processor to execute: storing, in the memory, editing time data indicating the editing time such that the editing time data is included in the arrangement data; reading out the arrangement data stored in the memory, based on an instruction input to perform re-editing; and performing, based on an operation input, re-editing on arrangement of the placement object indicated by the read arrangement data, and evaluation of editing regarding the arrangement data on which the re-editing has been performed is performed based on a time obtained by adding an editing time of the re-editing to the editing time indicated by the editing time data included in the arrangement data before the re-editing is performed.
10. The non-transitory computer-readable storage medium according to claim 1, wherein the arrangement data includes editing time data indicating the editing time, the game program further causes the processor to execute reading out the arrangement data stored in the memory, when the evaluation is performed, and the evaluation is performed based on the editing time included in the read arrangement data.
11. An information processing system including a terminal device and a server communicable with the terminal device, the information processing system comprising at least one processor and a storage medium having stored therein a game program, the processor being configured to execute the game program to at least: perform, in a predetermined area in a virtual space, editing including at least one of selecting a placement object to be placed in the area, placing the placement object, and moving the placement object, on the basis of an operation input; count an editing time during which the editing is performed; store, in a memory, arrangement data indicating arrangement of the placement object in the predetermined area; and perform evaluation of the editing, based on at least the editing time such that a lower evaluation is given when the editing time is shorter than when the editing time is longer.
12. The information processing system according to claim 11, wherein the processor counts the editing time, based on a determination as to whether or not an input is performed in a period in which the editing is allowed.
13. The information processing system according to claim 11, wherein the processor gives a reward according to the evaluation to a user who has performed the operation input.
14. The information processing system according to claim 11, wherein the processor displays a message from a non-player character in the virtual space such that the message has different contents according to the evaluation.
15. The information processing system according to claim 11, wherein the processor included in the terminal device transmits the arrangement data to the server, receives, from the server, list data indicating a list regarding a plurality of arrangement data transmitted from other terminal devices different from the terminal device to the server, and performs list display regarding the plurality of arrangement data, based on the list data received from the server.
16. The information processing system according to claim 15, wherein the processor included in the server determines the arrangement data to be displayed in the list display, based on the editing time regarding the arrangement data.
17. The information processing system according to claim 15, wherein the processor included in the server transmits, to the terminal device, the list data regarding the plurality of arrangement data obtained by deleting arrangement data whose evaluation based on editing time is lower than a predetermined reference, from a set of arrangement data transmitted from a plurality of other terminal devices.
18. The information processing system according to claim 11, wherein the predetermined area is a room in the virtual space, and the placement object is any of a plurality of types of objects including at least furniture.
19. The information processing system according to claim 11, wherein the processor stores, in the memory, editing time data indicating the editing time such that the editing time data is included in the arrangement data, reads out the arrangement data stored in the memory, based on an instruction input to perform re-editing, performs, based on an operation input, re-editing on arrangement of the placement object indicated by the read arrangement data, and performs evaluation of editing regarding the arrangement data on which the re-editing has been performed, based on a time obtained by adding an editing time of the re-editing to the editing time indicated by the editing time data included in the arrangement data before the re-editing is performed.
20. The information processing system according to claim 11, wherein the arrangement data includes editing time data indicating the editing time, the processor reads out the arrangement data stored in the memory, when the evaluation is performed, and the evaluation is performed based on the editing time included in the read arrangement data.
- An information processing apparatus including a processor, the processor being configured to at least: perform, in a predetermined area in a virtual space, editing including at least one of selecting a placement object to be placed in the area, placing the placement object, and moving the placement object, on the basis of an operation input;count an editing time during which the editing is performed;store, in a memory, arrangement data indicating arrangement of the placement object in the predetermined area;and perform evaluation of the editing, based on at least the editing time such that a lower evaluation is given when the editing time is shorter than when the editing time is longer.
- The information processing apparatus according to claim 21, wherein the processor counts the editing time, based on a determination as to whether or not an input is performed in a period in which the editing is allowed.
- The information processing apparatus according to claim 21, wherein the processor gives a reward according to the evaluation to a user who has performed the operation input.
- The information processing apparatus according to claim 21, wherein the processor displays a message from a non-player character in the virtual space such that the message has different contents according to the evaluation.
- The information processing apparatus according to claim 21 being communicable with a server, wherein the processor transmits the arrangement data to the server, receives, from the server, list data indicating a list regarding a plurality of arrangement data transmitted from other information processing apparatuses different from the information processing apparatus to the server, and performs list display regarding the plurality of arrangement data, based on the list data received from the server.
- The information processing apparatus according to claim 25, wherein the processor performs the list display regarding arrangement data selected based on the editing time from a set of arrangement data transmitted from a plurality of other information processing apparatuses.
- The information processing apparatus according to claim 25, wherein the processor performs the list display regarding the plurality of arrangement data obtained by deleting arrangement data whose evaluation based on editing time is lower than a predetermined reference, from a set of arrangement data transmitted from a plurality of other information processing apparatuses.
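The two dependent claims above describe filtering shared arrangement data by its editing-time evaluation before list display, deleting entries below a predetermined reference. A hypothetical server-side sketch, assuming a simple threshold on editing time (the reference value and field names are illustrative, not from the patent):

```python
# Hypothetical filtering step: arrangement data whose editing-time-based
# evaluation falls below a predetermined reference is deleted from the
# set before the list data is sent back to clients.

MIN_EDITING_SECONDS = 60  # predetermined reference (assumed value)

def build_list_data(uploads):
    """Keep only arrangement data whose editing time meets the reference."""
    return [u for u in uploads if u["editing_time"] >= MIN_EDITING_SECONDS]

uploads = [
    {"user": "A", "editing_time": 300},
    {"user": "B", "editing_time": 10},   # below reference; deleted
    {"user": "C", "editing_time": 120},
]
print([u["user"] for u in build_list_data(uploads)])  # ['A', 'C']
```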
- The information processing apparatus according to claim 21, wherein the predetermined area is a room in the virtual space, and the placement object is any of a plurality of types of objects including at least furniture.
- The information processing apparatus according to claim 21, wherein the processor stores, in the memory, editing time data indicating the editing time such that the editing time data is included in the arrangement data, reads out the arrangement data stored in the memory, based on an instruction input to perform re-editing, performs, based on an operation input, re-editing on arrangement of the placement object indicated by the read arrangement data, and performs evaluation of the editing regarding the arrangement data on which the re-editing has been performed, based on a time obtained by adding an editing time of the re-editing to the editing time indicated by the editing time data included in the arrangement data before the re-editing is performed.
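The re-editing claim above stores the editing time inside the arrangement data itself, so that a later re-edit can add its own editing time to the stored value before evaluation. A minimal sketch of that bookkeeping, with assumed function and field names:

```python
# Hypothetical sketch of cumulative editing time across re-edits: the
# editing time data is stored as part of the arrangement data, and a
# re-edit adds its own editing time to the previously stored value.

def save_arrangement(arrangement, editing_time):
    """Arrangement data includes editing time data, per the claim."""
    return {"arrangement": arrangement, "editing_time": editing_time}

def re_edit(saved, new_arrangement, re_edit_time):
    """Re-edit stored data; total time = stored time + re-editing time."""
    return save_arrangement(new_arrangement,
                            saved["editing_time"] + re_edit_time)

saved = save_arrangement({"sofa": (1, 2)}, editing_time=200.0)
saved = re_edit(saved, {"sofa": (4, 4), "lamp": (0, 1)}, re_edit_time=50.0)
print(saved["editing_time"])  # 250.0 -> evaluation uses the combined time
```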
- The information processing apparatus according to claim 21, wherein the arrangement data includes editing time data indicating the editing time, the processor reads out the arrangement data stored in the memory, when the evaluation is performed, and the evaluation is performed based on the editing time included in the read arrangement data.
- A game processing method executed by an information processing system, the information processing system being configured to at least: perform, in a predetermined area in a virtual space, editing including at least one of selecting a placement object to be placed in the area, placing the placement object, and moving the placement object, on the basis of an operation input;count an editing time during which the editing is performed;store, in a memory, arrangement data indicating arrangement of the placement object in the predetermined area;and perform evaluation of the editing, based on at least the editing time such that a lower evaluation is given when the editing time is shorter than when the editing time is longer.
- The game processing method according to claim 31, wherein the information processing system counts the editing time, based on a determination as to whether or not an input is performed in a period in which the editing is allowed.
- The game processing method according to claim 31, wherein the information processing system gives a reward according to the evaluation to a user who has performed the operation input.
- The game processing method according to claim 31, wherein the information processing system displays a message from a non-player character in the virtual space such that the message has different contents according to the evaluation.
- The game processing method according to claim 31, wherein the information processing system includes an information processing apparatus, and a server communicable with the information processing apparatus, and the information processing apparatus transmits the arrangement data to the server, receives, from the server, list data indicating a list regarding a plurality of arrangement data transmitted from other information processing apparatuses different from the information processing apparatus to the server, and performs list display regarding the plurality of arrangement data, based on the list data received from the server.
- The game processing method according to claim 35, wherein the information processing apparatus performs the list display regarding arrangement data selected based on the editing time from a set of arrangement data transmitted from a plurality of other information processing apparatuses.
- The game processing method according to claim 35, wherein the information processing apparatus performs the list display regarding the plurality of arrangement data obtained by deleting arrangement data whose evaluation based on editing time is lower than a predetermined reference, from a set of arrangement data transmitted from a plurality of other information processing apparatuses.
- The game processing method according to claim 31, wherein the predetermined area is a room in the virtual space, and the placement object is any of a plurality of types of objects including at least furniture.
- The game processing method according to claim 31, wherein the information processing system stores, in the memory, editing time data indicating the editing time such that the editing time data is included in the arrangement data, reads out the arrangement data stored in the memory, based on an instruction input to perform re-editing, performs, based on an operation input, re-editing on arrangement of the placement object indicated by the read arrangement data, and performs evaluation of the editing regarding the arrangement data on which the re-editing has been performed, based on a time obtained by adding an editing time of the re-editing to the editing time indicated by the editing time data included in the arrangement data before the re-editing is performed.
- The game processing method according to claim 31, wherein the arrangement data includes editing time data indicating the editing time, the information processing system reads out the arrangement data stored in the memory, when the evaluation is performed, and the evaluation is performed based on the editing time included in the read arrangement data.