U.S. Pat. No. 9,084,938

HANDHELD DEVICE FOR SPECTATOR VIEWING OF AN INTERACTIVE APPLICATION

Assignee: Sony Interactive Entertainment Inc.

Issue Date: September 30, 2014

Illustrative Figure

Abstract

A handheld device is provided, comprising: a sensor configured to generate sensor data for determining and tracking a position and orientation of the handheld device during an interactive session of an interactive application presented on a main display, the interactive session being defined for interactivity between a user and the interactive application; a communications module configured to send the sensor data to a computing device, the communications module being further configured to receive from the computing device a spectator video stream of the interactive session that is generated based on a state of the interactive application and the tracked position and orientation of the handheld device, the state of the interactive application being determined based on the interactivity between the user and the interactive application; and a display configured to render the spectator video stream.

Description


DETAILED DESCRIPTION

The following embodiments describe methods and apparatus for a system that enables an interactive application to utilize the resources of a handheld device. In one embodiment of the invention, a primary processing interface is provided for rendering a primary video stream of the interactive application to a display. A first user views the rendered primary video stream on the display and interacts by operating a controller device which communicates with the primary processing interface. Simultaneously, a second user operates a handheld device in the same interactive environment. The handheld device renders an ancillary video stream of the interactive application on a display of the handheld device, separate from the display showing the primary video stream. Accordingly, methods and apparatus in accordance with embodiments of the invention will now be described.

It will be obvious, however, to one skilled in the art, that the present invention may be practiced without some or all of the specific details set forth herein. In other instances, well-known process operations have not been described in detail in order not to unnecessarily obscure the present invention.

With reference to FIG. 1, a system for enabling a handheld device to capture video of an interactive session of an interactive application is shown, in accordance with an embodiment of the invention. The system exists in an interactive environment in which a user interacts with the interactive application. The system includes a console device 10, which executes an interactive application 12. The console device 10 may be a computer, console gaming system, set-top box, media player, or any other type of computing device that is capable of executing an interactive application. Examples of console gaming systems include the Sony Playstation 3® and the like. The interactive application 12 may be any of various kinds of applications which facilitate interactivity between one or more users and the application itself. Examples of interactive applications include video games, simulators, virtual reality programs, utilities, or any other type of program or application which facilitates interactivity between a user and the application.

The interactive application 12 produces video data based on its current state. This video data is processed by an audio/video processor 26 to generate an application video stream 28. The application video stream 28 is communicated to a main display 30 and rendered on the main display 30. The term “video,” as used herein, shall generally refer to a combination of image data and audio data which are synchronized, though in various embodiments of the invention, the term “video” may refer to image data alone. Thus, by way of example, the application video stream 28 may include both image data and audio data, or just image data alone. The main display may be a television, monitor, LCD display, projector, or any other kind of display device capable of visually rendering a video data stream.

The interactive application 12 also includes a controller input module 24 for communicating with a controller device 32. The controller device 32 may be any of various controllers which may be operated by a user to provide interactive input to an interactive application. Examples of controller devices include the Sony Dualshock 3® wireless controller. The controller device 32 may communicate with the console device 10 via a wireless or wired connection. The controller device 32 includes at least one input mechanism 34 for receiving interactive input from a user 36. Examples of input mechanisms may include a button, joystick, touchpad, touchscreen, directional pad, stylus input, or any other types of input mechanisms which may be included in a controller device for receiving interactive input from a user. As the user 36 operates the input mechanism 34, the controller device 32 generates interactive input data 38 which is communicated to the controller input module 24 of the interactive application 12.

The interactive application 12 includes a session module 14, which initiates a session of the interactive application, the session defining interactivity between the user 36 and the interactive application 12. For example, the initiated session of the interactive application may be a new session or a continuation of a previously saved session of the interactive application 12. As the interactive application 12 is executed and rendered on the main display 30 via the application video stream 28, the user 36 views the rendered application video stream on the main display 30 and provides interactive input data 38 to the interactive application by operating the controller device 32. The interactive input data 38 is processed by the interactive application 12 to affect the state of the interactive application 12. The state of the interactive application 12 is updated, and this updated state is reflected in the application video stream 28 rendered to the main display 30. Thus, the current state of the interactive application 12 is determined based on the interactivity between the user 36 and the interactive application 12 in the interactive environment.

The interactive application 12 also includes a handheld device module 16 for communicating with a handheld device 40. The handheld device 40 may be any of various handheld devices, such as a cellular phone, personal digital assistant (PDA), tablet computer, electronic reader, pocket computer, portable gaming device, or any other type of handheld device capable of displaying video. One example of a handheld device is the Sony Playstation Portable®. The handheld device 40 is operated by a spectator 48, and includes a display 42 for rendering a spectator video stream 52. An input mechanism 44 is provided for enabling the spectator 48 to provide interactive input to the interactive application 12. Also, the handheld device 40 includes position and orientation sensors 46. The sensors 46 detect data which may be utilized to determine the position and orientation of the handheld device 40 within the interactive environment. Examples of position and orientation sensors include an accelerometer, magnetometer, gyroscope, camera, optical sensor, or any other type of sensor or device which may be included in a handheld device to generate data that may be utilized to determine the position or orientation of the handheld device. As the spectator 48 operates the handheld device 40, the position and orientation sensors 46 generate sensor data 50 which is transmitted to the handheld device module 16. The handheld device module 16 includes a position module 18 for determining the position of the handheld device 40 within the interactive environment, and an orientation module 20 for determining the orientation of the handheld device 40 within the interactive environment. The position and orientation of the handheld device 40 are tracked during the interactive session of the interactive application 12.
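The sensor-to-module data flow described above can be sketched in a few lines. The data formats and class names below are not specified by the patent; this is a minimal illustration, under assumed names, of how the handheld device module might consume sensor data 50 to maintain a tracked pose.

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    accel: tuple  # accelerometer reading (m/s^2), hypothetical format
    mag: tuple    # magnetometer reading
    gyro: tuple   # gyroscope angular rates (rad/s)

class HandheldModule:
    """Console-side sketch that tracks the handheld's pose from sensor data."""

    def __init__(self):
        self.position = (0.0, 0.0, 0.0)
        self.orientation = (0.0, 0.0, 0.0)  # e.g. yaw, pitch, roll in radians

    def on_sensor_data(self, sample: SensorSample, dt: float):
        # A real tracker would fuse all of these readings; as a stand-in,
        # integrate the gyro rates over the sample interval to update
        # the orientation estimate.
        self.orientation = tuple(
            o + w * dt for o, w in zip(self.orientation, sample.gyro)
        )
```

In a full system, the position module 18 and orientation module 20 would each run richer estimators over this stream; the sketch only shows where the data lands.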

It will be understood by those skilled in the art that any of various technologies for determining the position and orientation of the handheld device 40 may be applied without departing from the scope of the present invention. For example, in one embodiment, the handheld device includes an image capture device for capturing an image stream of the interactive environment. The position and orientation of the handheld device 40 are tracked based on analyzing the captured image stream. For example, one or more tags may be placed in the interactive environment, and utilized as fiduciary markers for determining the position and orientation of the handheld device based on their perspective distortion when captured by the image capture device of the handheld device 40. The tags may be objects or figures, or part of the image displayed on the main display, that are recognized when present in the captured image stream of the interactive environment. The tags serve as fiduciary markers which enable determination of a location within the interactive environment. Additionally, the perspective distortion of the tag in the captured image stream indicates the position and orientation of the handheld device.
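The simplest instance of the perspective-distortion idea is range from apparent size: under a pinhole camera model, a tag of known physical width appears smaller in the image the farther away it is. The focal length and tag size below are assumed values for illustration, not figures from the patent; full pose recovery would additionally use the tag's shape distortion.

```python
# Pinhole model: observed_px = focal_px * tag_width_m / distance_m,
# so distance can be recovered from the tag's apparent width.

def tag_distance(focal_px: float, tag_width_m: float, observed_px: float) -> float:
    """Estimate distance to a fiduciary tag from its apparent pixel width."""
    return focal_px * tag_width_m / observed_px

# A 0.2 m tag seen 100 px wide through a 500 px focal-length camera
# lies 1.0 m from the handheld device.
print(tag_distance(500.0, 0.2, 100.0))  # 1.0
```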

In other embodiments, any of various methods may be applied for purposes of tracking the location and orientation of the handheld device 40. For example, natural feature tracking methods or simultaneous localization and mapping (SLAM) methods may be applied to determine the position and orientation of the handheld device 40. Natural feature tracking methods generally entail the detection and tracking of “natural” features within a real environment (as opposed to artificially introduced fiducials), such as textures, edges, corners, etc. In other embodiments of the invention, any one or more image analysis methods may be applied in order to track the handheld device 40. For example, a combination of tags and natural feature tracking or SLAM methods might be employed in order to track the position and orientation of the handheld device 40.

Additionally, the movement of the handheld device 40 may be tracked based on information from motion sensitive hardware within the handheld device 40, such as an accelerometer, magnetometer, or gyroscope. In one embodiment, an initial position of the handheld device 40 is determined, and movements of the handheld device 40 in relation to the initial position are determined based on information from an accelerometer, magnetometer, or gyroscope. In other embodiments, information from motion sensitive hardware of the handheld device 40, such as an accelerometer, magnetometer, or gyroscope, may be used in combination with the aforementioned technologies, such as tags, or natural feature tracking technologies, so as to ascertain the position, orientation, and movement of the handheld device 40.
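Tracking movement relative to a known initial position from accelerometer data amounts to dead reckoning: acceleration is integrated once to get velocity and again to get position. The sketch below shows that double integration under the simplest (Euler) scheme; a deployed tracker would fuse this with the tag or natural-feature methods above to correct the drift that pure integration accumulates.

```python
# Dead reckoning from an assumed initial position: each accelerometer
# sample (ax, ay, az) is integrated twice over a fixed timestep dt.

def dead_reckon(initial_pos, accel_samples, dt):
    """Estimate position after a sequence of accelerometer samples."""
    pos = list(initial_pos)
    vel = [0.0, 0.0, 0.0]
    for a in accel_samples:
        for i in range(3):
            vel[i] += a[i] * dt   # integrate acceleration -> velocity
            pos[i] += vel[i] * dt # integrate velocity -> position
    return pos
```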

The interactive application includes a spectator video module 22, which generates the spectator video stream 52 based on the current state of the interactive application 12 and the tracked position and orientation of the handheld device 40 in the interactive environment. The spectator video stream 52 is transmitted to the handheld device 40 and rendered on the display 42 of the handheld device 40.

In various embodiments of the invention, the foregoing system may be applied to enable the spectator 48 to capture video of an interactive session of the interactive application 12. For example, in one embodiment, the interactive application 12 may define a virtual space in which activity of the interactive session takes place. Accordingly, the tracked position and orientation of the handheld device 40 may be utilized to define a position and orientation of a virtual viewpoint within the virtual space that is defined by the interactive application. The spectator video stream 52 will thus include images of the virtual space captured from the perspective of the virtual viewpoint. As the spectator 48 maneuvers the handheld device 40 to different positions and orientations in the interactive environment, so the position and orientation of the virtual viewpoint will change, thereby changing the perspective in the virtual space from which the images of the spectator video stream 52 are captured, and ultimately displayed on the display 42 of the handheld device. In this manner, the spectator 48 can function as the operator of a virtual camera in the virtual space in an intuitive fashion, maneuvering the handheld device 40 so as to maneuver the virtual camera in the virtual space in a similar manner.
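The device-pose-to-virtual-camera mapping can be sketched as a simple transform: the handheld's tracked room-space position is scaled and offset into the virtual space, and its orientation carries over to the camera. The anchor point and scale factor are illustrative assumptions; the patent does not prescribe a particular mapping.

```python
# Map a tracked room-space device pose to a virtual-space camera pose.
# `anchor` and `scale` are hypothetical calibration values.

def virtual_viewpoint(device_pos, device_yaw, anchor=(0.0, 0.0, 0.0), scale=2.0):
    """Place the spectator's virtual camera from the handheld's tracked pose."""
    cam_pos = tuple(a + scale * d for a, d in zip(anchor, device_pos))
    cam_yaw = device_yaw  # orientation carries over directly
    return cam_pos, cam_yaw
```

With this mapping, moving the handheld one meter to the right moves the virtual camera two (virtual) meters to the right, which gives the "virtual cameraman" behavior the paragraph describes.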

In an alternative embodiment, the spectator video stream is generated at the handheld device 40, rather than at the console device 10. In such an embodiment, application state data is first determined based on the current state of the interactive application 12. This application state data is transmitted to the handheld device 40. The handheld device 40 processes the application state data and generates the spectator video stream based on the application state data and the tracked position and orientation of the handheld device.

In another embodiment, an environmental video stream of the interactive environment is captured, and the spectator video stream 52 is generated based on the environmental video stream. The environmental video stream may be captured by a camera included in the handheld device 40. In one embodiment, the spectator video stream 52 is generated by augmenting the environmental video stream with a virtual element. In such an embodiment, wherein the environmental video stream is captured at the handheld device 40, the spectator 48 is able to view an augmented reality scene on the display 42 of the handheld device 40 that is based on the current state of the interactive application 12 as it is affected by the interactivity between the user 36 and the interactive application 12. For example, in one embodiment, the image of the user 36 may be detected within the environmental video stream, and a portion of, or the entirety of, the user's image in the environmental video stream may then be replaced with a virtual element. In one embodiment, the detected user's image in the environmental video stream could be replaced with a character of the interactive application 12 controlled by the user 36. In this manner, when the spectator 48 aims the camera of the handheld device 40 at the user 36, the spectator 48 sees, on the display 42, the character of the interactive application 12 which the user 36 is controlling instead of the user 36.
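The replace-the-user augmentation step reduces to masked compositing: pixels detected as belonging to the user are swapped for the corresponding character pixels. The toy sketch below assumes the detection mask is already computed (detection itself is out of scope) and treats a frame as a flat list of pixel values for clarity.

```python
# Composite a character render over an environmental frame using a
# precomputed boolean mask marking the user's pixels. Hypothetical,
# simplified frame format: a flat sequence of pixel values.

def augment_frame(env_frame, user_mask, character_frame):
    """Replace masked (user) pixels of the environment with character pixels."""
    return [
        char_px if masked else env_px
        for env_px, masked, char_px in zip(env_frame, user_mask, character_frame)
    ]
```

A real renderer would do this per-frame on 2D images (and would render the character from the handheld's tracked viewpoint), but the per-pixel selection logic is the same.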

In another embodiment, the position and orientation of the controller 32 as it is operated by the user 36 may also be determined and tracked. The position and orientation of the controller 32, as well as the position and orientation of the handheld device 40 relative to the location of the controller 32, may be utilized to determine the spectator video stream. Thus, the location of the controller may be utilized as a reference point according to which the spectator video stream is determined based on the relative position and orientation of the handheld device 40 to the location of the controller 32. For example, the location of the controller 32 may correspond to an arbitrary location in a virtual space of the interactive application 12, such as the present location of a character controlled by the user 36 in the virtual space. Thus, as the spectator maneuvers the handheld device 40 in the interactive environment, its position and orientation relative to the location of the controller 32 determine the position and orientation of a virtual viewpoint in the virtual space relative to the location of the character. Images of the virtual space from the perspective of the virtual viewpoint may be utilized to form the spectator video stream. In one embodiment, the correspondence between the relative location of the handheld device 40 to the controller 32 and the relative location of the virtual viewpoint to the character is such that changes in the position or orientation of the handheld device 40 relative to the location of the controller 32 cause similar changes in the position or orientation of the virtual viewpoint relative to the location of the character. As such, the spectator 48 is able to intuitively maneuver the handheld device about the controller 32 to view and capture a video stream of the virtual space from various perspectives that are linked to the location of the character controlled by the user 36.
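The controller-as-reference-point correspondence is plain vector arithmetic: the handheld's offset from the controller in the room becomes the virtual camera's offset from the character in the virtual space. A minimal sketch, with assumed coordinate conventions and no rotation handling:

```python
# The handheld's position relative to the controller is mirrored as the
# virtual viewpoint's position relative to the tracked character.

def viewpoint_from_controller(handheld_pos, controller_pos, character_pos):
    """Offset of handheld from controller becomes offset of camera from character."""
    offset = tuple(h - c for h, c in zip(handheld_pos, controller_pos))
    return tuple(ch + o for ch, o in zip(character_pos, offset))
```

Because the camera is anchored to the character's location, the viewpoint follows the character automatically as it moves through the virtual space.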

The foregoing embodiment has been described with reference to a character controlled by the user 36 in the virtual space of the interactive application 12. However, in other embodiments, the location of the controller 32 may correspond to any location, stationary or mobile, within the virtual space of the interactive application. For example, in some embodiments, the location of the controller may correspond to a character, a vehicle, a weapon, an object, or some other thing which is controlled by the user 36. Or, in some embodiments, the location of the controller may correspond to moving characters or objects which are not controlled by the user 36, such as an artificial intelligence character or a character being controlled by a remote user of the interactive application during the interactive session. Or, in still other embodiments, the location of the controller 32 may correspond to a specific geographic location, the location of an object, or some other stationary article within the virtual space of the interactive application 12.

In another embodiment, an initial position of the user 36 is determined, and the position of the user 36 is tracked during the interactive session. The spectator video stream is then generated based on the position and orientation of the handheld device 40 relative to the position of the user 36. In various embodiments, the position of the user 36 may be determined based on various technologies. For example, an image capture device may be included in the interactive system for capturing images of the interactive environment including the user 36. These images may be analyzed to determine the location and pose of the user 36 in the interactive environment. In one embodiment, a depth camera capable of detecting depth data of the interactive environment is included in the interactive system. The depth data may be analyzed to detect the location and pose of the user 36 in the interactive environment. By determining the position and/or pose of the user in the interactive environment, various features in accordance with embodiments of the invention are enabled.

For example, in one embodiment, an object within a virtual environment of the interactive application 12 is mapped to the position of the user 36. The position and orientation of the handheld device 40 relative to the position of the user 36 are utilized to define a position and orientation of a virtual viewpoint within the virtual space defined by the interactive application 12. The spectator video stream 52 thus includes images of the virtual space captured from the perspective of the virtual viewpoint, the object being included in the images of the spectator video stream when the handheld device 40 is oriented towards the position of the user 36. In one embodiment, the object is controlled by the user. Examples of such objects include a character, vehicle, weapon, or other type of article of the interactive application 12 which the user 36 may control. In one embodiment, the position of the user 36 is determined based on the position of the controller device 32 operated by the user.
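The "object appears when the handheld is oriented towards the user" condition can be sketched as an angle test: draw the mapped object only if the direction from the handheld to the user falls within the virtual camera's field of view. The half-angle and 2D coordinates are illustrative assumptions.

```python
import math

# Is the user's position within the handheld camera's field of view?
# `forward` is the handheld's facing direction; 2D vectors for simplicity.

def object_in_view(handheld_pos, forward, user_pos, half_fov_deg=30.0):
    """True if the handheld is oriented towards the user's position."""
    to_user = tuple(u - h for u, h in zip(user_pos, handheld_pos))
    norm = math.hypot(*to_user) * math.hypot(*forward)
    cos_angle = sum(f * t for f, t in zip(forward, to_user)) / norm
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return angle <= half_fov_deg
```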

In one embodiment, the spectator video stream 52 is stored in a memory. The memory may be included in the handheld device 40, the console device 10, or located remotely. After storing the spectator video stream 52, the spectator 48 will be able to perform operations such as editing the stored spectator video or uploading the stored spectator video to an Internet website for sharing the video.

With reference to FIG. 2, an interactive environment including a user and a spectator interacting with an interactive application is shown, in accordance with an embodiment of the invention. The interactive environment includes a console device 10 which executes an interactive application. As the interactive application executes, it generates an application video stream that is rendered on a main display 30. As shown by way of example only, the rendered application video stream depicts a scene 60 including a character 62 and a stop sign 64. A user 36 views the scene 60 and interacts with the interactive application by operating controller 32. Simultaneously, a spectator 48 views a display 42 of a handheld device 40. The application video stream rendered on the main display 30 depicts a view of a virtual space of the interactive application from the perspective of a primary virtual viewpoint defined with reference to the location of the user 36. The primary virtual viewpoint may be static, presuming the user 36 to be in an approximate location in front of the main display 30, or dynamic, changing based on changes in the position of the user 36. Simultaneously, the position and orientation of the handheld device 40 are determined and tracked, and utilized to define a secondary virtual viewpoint within the virtual space of the interactive application. This secondary virtual viewpoint is dynamically updated based on changes in the position and orientation of the handheld device 40 as it is operated by the spectator 48. A spectator video stream is rendered on the handheld device 40, depicting the virtual space from the perspective of the secondary virtual viewpoint.

In the embodiment shown, the virtual space includes the character 62 and the stop sign 64. The spectator 48 and handheld device 40 are situated to the left of the user 36, the spectator 48 orienting the handheld device 40 towards the apparent position of the scene 60. The position and orientation of the secondary virtual viewpoint, based on the position and orientation of the handheld device 40, are thus similarly situated relative to the primary virtual viewpoint in the virtual space. As such, the spectator 48 views the scene 60 from a different perspective that is based on the position and orientation of the handheld device 40. As shown, the depiction of the scene 60 from the primary virtual viewpoint shows the character 62 having a horizontal separation of x from the stop sign. Whereas in the magnified view 66 of the handheld device 40, the scene 60 is shown depicted from the perspective of the secondary virtual viewpoint, such that the horizontal separation between the character 62 and the stop sign 64 is x+Δx. As can be seen, the horizontal separation of the character 62 and the stop sign 64 is increased in the spectator video stream rendered on the handheld device 40 because of the different location and orientation of the secondary virtual viewpoint relative to the primary virtual viewpoint. In this manner, the spectator 48 views the same scene as the user 36, but shown from a different virtual perspective in an intuitive manner, as one would expect based on the relative positioning of the spectator 48 and the user 36.
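The x versus x+Δx effect is ordinary parallax: objects at different depths project to different screen positions when the camera translates. A pinhole-projection sketch, with all coordinates, depths, and the focal length chosen purely for illustration (the patent gives no numbers):

```python
# Project (x, depth) points onto a screen for cameras at different
# horizontal positions, and compare the on-screen separation of two
# objects at different depths.

def project_x(point, cam_x, focal=500.0):
    """Horizontal screen coordinate of a point for a camera at cam_x."""
    x, depth = point
    return focal * (x - cam_x) / depth

character, stop_sign = (0.0, 6.0), (1.0, 5.0)  # (x, depth), assumed layout

sep_primary = project_x(stop_sign, 0.0) - project_x(character, 0.0)
sep_spectator = project_x(stop_sign, -1.0) - project_x(character, -1.0)
# The viewpoint shifted to the left sees a larger separation (x + Δx),
# because the two objects sit at different depths.
```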

With reference to FIG. 3, an interactive environment including a user 36 and a spectator 48 interacting with an interactive application is shown, in accordance with an embodiment of the invention. The user 36 operates a controller 32 to provide input to the interactive application. As shown, the interactive application generates an application video stream that is rendered on the display 30, based on the current state of the interactive application. As shown, the rendered application video stream depicts a scene 70. The scene 70 may be a view of a virtual space of the interactive application. In one embodiment, the spectator 48 is able to view an expanded area 72 of the scene 70 by utilizing the handheld device 40. For example, in one embodiment, the spectator may maneuver the handheld device so as to be directed towards a region 74 within the expanded area 72. The position and orientation of the handheld device are tracked so as to enable determination of the direction towards which the handheld device 40 is oriented. The region 74 is shown on the handheld device 40 for the spectator to view. In this way, the spectator 48 is able to view other portions of the scene 70 which the user 36 is not able to view because of the limited view which is shown on the display 30.

In another embodiment, the handheld device 40 initially renders the same image as that shown on the display 30. Then, the spectator 48 may view other regions of the expanded area 72, such as region 74, by providing input to the handheld device 40, such as by moving a joystick or touching and dragging a finger across a touchscreen of the handheld device 40 so as to move the view that is shown on the handheld device 40 to a different region. In a cooperative-style game, such a feature could be utilized to enable a spectator 48 to view regions of a virtual space which the user 36 is not presently viewing. The spectator 48 could communicate information based on what was viewed to the user 36 to aid the user 36 in the gameplay.
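The drag-to-pan behavior amounts to moving a viewport rectangle inside the expanded scene area and clamping it to the scene bounds. The dimensions and drag units below are assumptions for illustration.

```python
# Shift the handheld's viewport within the expanded scene area by a
# touch-drag delta, keeping the viewport fully inside the scene.

def pan_viewport(view_x, view_y, drag_dx, drag_dy,
                 view_w=320, view_h=180, scene_w=1280, scene_h=720):
    """Return the viewport's new top-left corner after a drag, clamped."""
    new_x = min(max(view_x + drag_dx, 0), scene_w - view_w)
    new_y = min(max(view_y + drag_dy, 0), scene_h - view_h)
    return new_x, new_y
```

Starting from the same image as the main display simply means initializing the viewport to the region the primary viewpoint is showing.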

With reference to FIG. 4A, a top view of an interactive environment including a user 36 and a spectator 48 interacting with an interactive application is shown, in accordance with an embodiment of the invention. The user 36 operates a controller 32 to provide input to the interactive application. As shown, the spectator 48 operates a handheld device 40, and orients it towards the user 36. The position and orientation of the handheld device 40 are tracked so as to enable determination of the direction towards which the handheld device 40 is directed. Additionally, the position of the user 36 may be identified by tracking the position of the controller 32. FIG. 4B illustrates the interactive environment of FIG. 4A from the perspective of the spectator 48. As shown, as the spectator 48 orients the handheld device 40 towards the user 36 (typically, directing a backside of the handheld device 40 so that it faces the user 36), the spectator views a character 80 of the interactive application in place of the user 36. The character may be set in a virtual space of the interactive application.

In one embodiment, the location of the user 36 is determined relative to the handheld device 40 by capturing an image stream from a rear-facing camera of the handheld device 40. The captured image stream is analyzed to determine the presence of the user 36, and the user 36 is replaced with a character of the interactive application. In one embodiment, selected portions of the user 36 may be augmented with virtual features, so as to form an augmented reality image stream which is displayed on the handheld device. For example, the user 36 could be shown holding a weapon, wearing different clothes, or being outfitted with various other items. In one embodiment, the user 36 might be shown set in a virtual space of the interactive application. In this manner, the spectator 48 is able to view the user 36 as if he or she were directly in the virtual space of the interactive application.

With reference to FIG. 5, an interactive environment including a user 36 interacting with an interactive application is shown from the perspective of a spectator operating a handheld device 40, in accordance with an embodiment of the invention. The user 36 operates a controller 32 to provide input to the interactive application. As shown, the user 36 is playing a chess game, which the user 36 views on the display 30. Simultaneously, the spectator is able to view a virtual space 90 in which the chess game takes place. As the spectator maneuvers the handheld device 40 within the interactive environment, the virtual space 90 is shown from different perspectives on the handheld device 40 based on the location and orientation of the handheld device 40. In this manner, the spectator is able to view the virtual space 90 from different vantage points as if it existed in the interactive environment. The handheld device 40 thus functions as a viewer or virtual window enabling the spectator to view the virtual space of the interactive application.

With reference to FIG. 6, an interactive environment is shown from the perspective of a spectator, in accordance with an embodiment of the invention. As shown, a user 36 operates motion controllers 100a and 100b while viewing an interactive application rendered on a display 30. In the present embodiment, the interactive application is a boxing game, wherein the movements of the motion controllers 100a and 100b as they are maneuvered by the user 36 are detected and processed to enable the interactive application to render similar movements of fists shown on the display 30. In this manner, the user 36 is provided with an experience of fighting in a boxing match, wherein as the user 36 throws punches, the fists shown on the display move in a similar fashion. Thus, the user 36 controls the fists and their movements by moving the motion controllers 100a and 100b. In one embodiment, the interactive system includes an image capture device 106, which captures images of the motion controllers 100a and 100b. These images are analyzed to determine the motion of the controllers 100a and 100b. Furthermore, the captured images may be analyzed to determine the position and orientation of the handheld device 40. In one embodiment, the image capture device 106 includes a depth camera for capturing depth information of the interactive environment. The captured depth information can be processed to aid in determining the location and movements of the user 36 as well as the spectator and the handheld device 40. Examples of motion controllers include the Playstation Move® controller, which is manufactured by Sony Computer Entertainment Inc.

In one embodiment, the user 36 fights against a virtual opponent 102. The spectator operating a handheld device 40 directs the handheld device 40 towards the user 36 (typically, so that the back side of the handheld device 40 faces the user 36). The handheld device 40 displays video of the user 36 fighting with the virtual opponent 102 in real-time. In one embodiment, as the spectator directs the handheld device 40 towards the user 36, a camera included in the handheld device 40 captures video including the user 36 and the motion controllers 100a and 100b. The captured video may be analyzed to determine the location of the user as well as the movements of the motion controllers 100a and 100b. The handheld device 40 may also receive information about the location of the user and movements of the motion controllers 100a and 100b from the interactive application. In one embodiment, the user 36 is shown in the video displayed on the handheld device 40, but with boxing gloves on the hands of the user 36 in place of the motion controllers 100a and 100b. In another embodiment, instead of displaying the user 36, a character of the boxing game that is being controlled by the user 36 is displayed on the handheld device 40.

Additionally, as the spectator maneuvers the handheld device 40 to different positions and orientations relative to the user 36, the video shown on the handheld device 40 will reflect these changes. Thus, if the spectator moves to a location approximately behind the user 36 and directs the handheld device 40 towards the user 36, then the spectator will be able to view on the handheld device 40 the back side of the user 36 or a character controlled by the user 36, and may see the front side of the virtual opponent 102 if not occluded by the user 36 or some other object. Similarly, if the spectator moves to a location approximately in front of the user 36 and directs the handheld device towards the user 36, then the spectator will be able to view the front side of the user 36 if not occluded by, e.g., the virtual opponent 102 (the back side of which may be visible in the view). The spectator may maneuver the handheld device 40 to be closer to the user 36 and so zoom in on the user 36, or maneuver the handheld device 40 to be further away, and so be able to enjoy a more expansive view of the boxing match between the user 36 and the virtual opponent 102. As the spectator moves the handheld device 40 to different positions and orientations, the handheld device 40 displays video from a virtual viewpoint within a virtual space of the interactive application, whose position and orientation are determined based on the position and orientation of the handheld device 40. Thus, as the user 36 interacts with the boxing game, the spectator is able to interact as a virtual cameraman, controlling the viewpoint from which video of the interactive game is displayed on the handheld device 40 by maneuvering the handheld device 40.

In another embodiment, the camera 106 may be a depth camera capable of capturing depth information of objects in the interactive environment. In one embodiment, the user 36 does not require motion controllers, but is able to provide interactive input to the interactive application through movements which are detected by the camera 106. The camera 106 may be configured to capture both images of the interactive environment as well as depth information, which are processed to determine the position, orientation, and movements of the user 36. These are then utilized by the interactive application as input.

With reference to FIG. 7, an interactive environment is shown, including a console device 10 which executes an interactive application that is rendered on a main display 30. A user 36 operates a controller 32 so as to provide interactive input to the interactive application. An image capture device 106 captures images of the interactive environment. Simultaneously, a spectator 48 operates and views a handheld device 40, the handheld device 40 rendering an ancillary video stream which is related to that being rendered on the main display 30. As shown, the position, orientation, and movements of the controller 32 and the handheld device 40 may be tracked in a three-dimensional (x, y, z) coordinate space. In the illustrated embodiment, the controller 32 has coordinates (4, 2, 3) whereas the handheld device has coordinates (8, 3, 2). By tracking the positions and movements of the controller 32 and the handheld device 40, it is possible to determine the relative positioning of the controller 32 and the handheld device 40. This information may be utilized to affect the ancillary video stream rendered on the handheld device 40, for example, determining the location of a virtual viewpoint within a virtual space of the interactive application based on the relative positions of the controller 32 and handheld device 40 in the real-world interactive environment.
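The relative-positioning computation described above can be sketched directly from the FIG. 7 example coordinates. This is a minimal illustration; the function name and layout are assumptions, not the patent's actual implementation.

```python
import math

def relative_position(controller, handheld):
    """Component-wise offset of the handheld device from the controller,
    both given as (x, y, z) coordinates in the interactive environment."""
    return tuple(h - c for c, h in zip(controller, handheld))

# FIG. 7 example: controller 32 at (4, 2, 3), handheld device 40 at (8, 3, 2).
offset = relative_position((4, 2, 3), (8, 3, 2))
separation = math.sqrt(sum(d * d for d in offset))  # straight-line distance
```

The offset here is (4, 1, -1); an interactive application could use such a vector to place the virtual viewpoint relative to the user's character in the virtual space.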

In one embodiment, the interactive application may define a correlation between the virtual space and the aforementioned three-dimensional coordinate space in the real-world interactive environment, such that the coordinate system of the real-world interactive environment is mapped to a coordinate system of the virtual space. In various embodiments, the nature of this mapping may vary. For example, in one embodiment, the mapping may be scaled one-to-one, such that distances in the real-world correspond to equivalent distances in the virtual world. In other embodiments, the mapping may be scaled by a certain factor, such that distances in the virtual-world are greater than or less than corresponding distances in the real-world.

Additionally, the scale of the mapping for each of the corresponding axes of the virtual and real-world coordinate systems may vary. For example, in one embodiment, the y-axis in the real-world may be mapped so as to have an increased scale in the virtual world (e.g. multiplied by a factor greater than one), whereas the x-axis and z-axis in the real-world may be mapped to the corresponding axes in the virtual space in an exact correspondence. One example of such a configuration is illustrated with reference to FIGS. 8A and 8B. FIG. 8A illustrates a coordinate space of a real-world interactive environment. A vector movement 110 is shown from coordinates (1, 1, 0) to (3, 3, 0). FIG. 8B illustrates a corresponding coordinate space of a virtual space. The vector 112 from coordinates (1, 2, 0) to (3, 6, 0) corresponds to the vector movement 110. As can be seen, the vector movement 110 is amplified in the vertical y-axis only, whereas it is not amplified in the x and z-axes. Thus, by selectively amplifying y-axis movements, when the spectator 48 moves the handheld device 40 vertically, the movement of the virtual viewpoint in the virtual space is magnified, which may enable the spectator 48 to more easily view and capture video from high and low positions, whereas movement of the handheld device along horizontal directions (x-axis and z-axis) is not magnified. It will be appreciated by those skilled in the art that in other embodiments of the invention, the scale of the mapping for any given direction or axis may vary from other directions or axes.
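The selective y-axis amplification of FIGS. 8A and 8B amounts to a per-axis scale mapping. The scale factors below reproduce the figure's example (y doubled, x and z one-to-one) and are illustrative assumptions only.

```python
# Per-axis scale factors mapping the real-world coordinate system to the
# virtual space: x and z correspond exactly, y is amplified by a factor of 2.
AXIS_SCALE = (1.0, 2.0, 1.0)

def real_to_virtual(point, scale=AXIS_SCALE):
    """Map a real-world (x, y, z) coordinate into the virtual space."""
    return tuple(c * s for c, s in zip(point, scale))

# The vector movement 110 from (1, 1, 0) to (3, 3, 0) then maps to the
# virtual vector 112 from (1, 2, 0) to (3, 6, 0), amplified in y only.
start, end = real_to_virtual((1, 1, 0)), real_to_virtual((3, 3, 0))
```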

Furthermore, the scale of the mapping may vary along any given axis or direction, or with reference to a particular location, such as the location of the user. For example, in one embodiment, the mapping near the user is scaled in a more exact correspondence, whereas mapping further away from the user is scaled so that movements of the handheld device away from the user are magnified in the virtual space. Thus, when the spectator operates the handheld device near the user, movements in the interactive environment correspond to similar movements in the virtual space, whereas, as the spectator moves the handheld device away from the user, the corresponding movement in the virtual space (e.g. movement away from a character controlled by the user) may become progressively magnified. This enables the spectator to control the location of the virtual viewpoint in a natural manner when close to the user, yet also have the ability to move the virtual viewpoint further away in the virtual space than would be possible if the mapping maintained the same scale of correspondence throughout.

One example of such an embodiment is illustrated with reference to FIGS. 9A and 9B. FIG. 9A illustrates a user 36 whose location in an interactive environment defines the origin of a spherical or cylindrical coordinate system. A vector 120 is shown indicating a movement from a distance of 1 to 2 away from the user 36. Similarly, a vector 122 indicates a movement from a distance of 2 to 3; and a vector 124 indicates a movement from 3 to 4. FIG. 9B illustrates a virtual space to which the interactive environment has been mapped. The location 125 in the virtual space corresponds to the location of the user 36 in the interactive environment. For example, in some embodiments, the location 125 may represent the location of a character, vehicle, or some other object controlled by the user 36. The vector 126 corresponds to the real-world vector 120. As shown, vector 126 has an exact correspondence to vector 120, indicating a change in distance from 1 to 2. The vector 128 corresponds to the vector 122. However, as shown, the mapping of the coordinate space of the interactive environment to that of the virtual space is such that the vector 122 is magnified in scale by a factor of two. Hence the vector 122, indicating a change in distance from 2 to 3 in the real-world interactive environment, is mapped to the vector 128, indicating a change in distance from 2 to 4 in the virtual space. The vector 130 corresponds to the vector 124. As shown, while the vector 124 indicates a change in distance from 3 to 4 in the interactive environment, the corresponding vector 130 indicates a change in distance from 4 to 7, thereby tripling the change in distance in the virtual space.
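The distance-dependent magnification of FIGS. 9A and 9B can be sketched as a piecewise radial mapping: each zone of real-world distance from the user has its own scale factor, and a virtual distance accumulates zone by zone. The zone boundaries and factors below follow the figure's example and are assumptions for illustration.

```python
# (upper bound of zone in real-world distance, scale factor within the zone)
ZONES = [(2.0, 1.0), (3.0, 2.0), (float("inf"), 3.0)]

def virtual_distance(real_distance, zones=ZONES):
    """Map a real-world distance from the user to a virtual-space distance
    by accumulating each zone's scaled contribution."""
    total, lower = 0.0, 0.0
    for upper, scale in zones:
        span = min(real_distance, upper) - lower
        if span <= 0:
            break
        total += span * scale
        lower = upper
    return total
```

With these zones, the real-world movements 120 (distance 1 to 2), 122 (2 to 3), and 124 (3 to 4) map to virtual movements of 1 to 2, 2 to 4, and 4 to 7, matching vectors 126, 128, and 130 in the figure.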

The foregoing illustrated example represents merely one embodiment wherein mapping from the interactive environment to the virtual space is configured to have increasing amplification with increased distance from a reference location in the interactive environment. In other embodiments, the changes in amplification with increasing distance may vary according to any schema, such as linearly, logarithmically, or according to preset zones which may be constant within each zone or vary as well.

In some embodiments, the nature of the mapping may change in accordance with changes in the state of the interactive application. By varying the mapping and correspondence of the real world to the virtual world, it is possible to tailor the use of information regarding real-world movements of the handheld device 40 and controller 32, so as to enable improved interactive functionality with respect to the interactive application.

With reference to FIG. 10A, an overhead view of an interactive environment is shown, in accordance with an embodiment of the invention. A user 36 operates a controller 32 to interact with an interactive application which is rendered on a display 30. Simultaneously, a spectator 48 operates a handheld device 40 in the interactive environment, the handheld device 40 displaying video of a virtual space of the interactive application from a virtual viewpoint in the virtual space which is based on the location of the handheld device 40 relative to the location of the user 36. In an interactive sequence, the spectator 48 moves from a position 140 next to the user 36, to a position 142 further to the side of the user 36, to a position 144 in front of the user 36. With reference to FIG. 10B, scenes 146, 148, and 150 illustrate video which is displayed on the handheld device 40 when the spectator is located at positions 140, 142, and 144, respectively.

In the illustrated embodiment, the user 36 is playing an interactive game in which the user 36 drives a vehicle in the virtual space. The position of the user 36 is mapped to the position of a virtual driver (controlled by the user 36) of the vehicle in the virtual space. Thus, when the spectator 48 is located at position 140 adjacent to the user 36, the virtual viewpoint in the virtual space is located adjacent to the driver of the vehicle. As shown by the scene 146, the video shown on the handheld device 40 illustrates the interior of the vehicle from the perspective of a viewpoint adjacent to the virtual driver of the vehicle. When the spectator 48 moves to the position 142 further to the side of the user 36, the virtual viewpoint is moved in a corresponding fashion in the virtual space. As shown by the corresponding scene 148, the virtual viewpoint has moved to a location outside of the vehicle, as the scene 148 illustrates the side of the vehicle. Similarly, when the spectator 48 moves to the position 144 in front of the user 36, the virtual viewpoint is moved in a corresponding manner within the virtual space, to a position in front of the vehicle. As shown by the scene 150, when located at position 144, the spectator 48 views the front of the vehicle on the handheld device 40.

For ease of description, some of the aforementioned embodiments of the invention have been generally described with reference to one player operating a controller (or providing input through a motion detection system) and/or one user operating a handheld device. It will be understood by those skilled in the art that in other embodiments of the invention, there may be multiple users operating controllers (or providing input through motion detection) and/or multiple users operating handheld devices.

With reference to FIG. 11, a method for enabling a portable device operated by a spectator to display video of an interactive game is shown, in accordance with an embodiment of the invention. At method operation 160, a player initiates gameplay on a console device. At method operation 162, a spectator activates a portable device. At method operation 164, a network connection between the portable device and the console is established. At method operation 166, information about the game is communicated from the console to the portable device. At method operation 168, an initial pose of the portable device is established. In various embodiments, the initial pose may be determined through various mechanisms, such as defining a current state of internal position tracking sensors as the initial pose, or by searching for a marker on the TV screen and defining the relationship to the TV as the initial pose. At method operation 170, an updated game state is transmitted to the portable device as the player moves through the game. At method operation 172, the portable device displays the spectator's private view of the game. At method operation 174, as the spectator moves the portable device, the device senses changes in position relative to the initial pose. In various embodiments, this may be accomplished by, for example, using internal motion sensors, a directional pad, or by tracking the marker on the TV screen and determining position based on its new orientation. At method operation 176, the portable device uses this new information to update the spectator's private view of the display.
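The pose-tracking steps of this method (operations 168 and 174: establish an initial pose, then sense changes relative to it) can be sketched as follows. The data structure and names are illustrative assumptions, not the patent's actual software.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Position and yaw of the portable device; a simplified stand-in for
    whatever the internal sensors or TV-marker tracking report."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    yaw: float = 0.0

def pose_change(initial: Pose, current: Pose) -> Pose:
    """Change in pose relative to the initial pose (cf. operation 174)."""
    return Pose(current.x - initial.x, current.y - initial.y,
                current.z - initial.z, current.yaw - initial.yaw)
```

The device would feed such a delta into the view update of operation 176 to move the spectator's private viewpoint.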

In various embodiments of the invention, various other method operations may be included in the method. For example, in one embodiment, the spectator may press a record button on the portable device to initiate recording of the spectator's private view of the game on the portable device. In another embodiment, the portable device is configured to allow the spectator to pan, zoom, or crop the live video or the prerecorded video. In another embodiment, the portable device provides the spectator with other video editing tools to manipulate the live or prerecorded video, such as chroma keying, color change tools, cut and paste, etc. In yet another embodiment, the spectator can upload the recorded video or images to an Internet site to share with others.

In various embodiments of the invention, different configurations may be utilized to enable the portable device to display video of the game. For example, in one embodiment, the console renders all images from the portable device's perspective and streams these rendered images to the portable device. In this case, the portable device communicates its pose to the console, but is not required to execute or maintain any software or information specifically related to the actual execution of the game itself other than that needed to render the images which it receives from the console. However, in another embodiment, the portable device contains some or all of the levels and models of the game being played. In this embodiment, the console only needs to send the coordinates of the dynamic objects to the portable device so that the portable device can update its scene graph and render the appropriate images on its display. In yet another embodiment, the portable device could run a scaled down version of the interactive application, with the stored levels and models having reduced complexity.

Furthermore, in various embodiments of the invention, the interactive functionality of the portable device thus described may be designed with or without allowing the spectator to affect the outcome of the game. For example, in one embodiment, the spectator's personal view would be limited to a reasonably small deviation from the player's game view. This allows the spectator some level of control without revealing much additional information that the spectator could share with the player. Conversely, in another embodiment, the spectator might be allowed full degrees of motion with the portable device. In this case, the spectator may actually impact the outcome of the game because the spectator can watch action that is occurring outside the view of the player and communicate this to the player (e.g. calling out a warning if an enemy approaches from behind that the player is not aware of).

With reference to FIG. 12, a system for enabling a spectator to utilize a handheld device to affect the state of an interactive application is shown, in accordance with an embodiment of the invention. A console device 10 executes an interactive application 180. An application video stream 188 is generated based on the state of the interactive application 180 and sent to a main display 30 for rendering. A user 36 operates a controller device 32 to provide input to the interactive application 180 via a controller module 186 which handles the raw input from the controller device 32. An application data generator 182 generates application data 190, which is sent to an ancillary application 194 executing on the handheld device 40. The handheld device 40 includes an input mechanism 200 for receiving input from the spectator 48 and a display 198. The ancillary application 194 includes a graphical user interface (GUI) 196 which is displayed on the display 198. The GUI may provide a series of menus and prompts which allow the spectator to affect the state of the interactive application, such as aural and visual aspects of the interactive application. The spectator 48 provides input utilizing the input mechanism 200. Based on the spectator's input, the ancillary application generates modification data 192, which is sent to a modification data handler 184 of the interactive application 180 on the console device 10. Based on the modification data 192, the state of the interactive application 180 is altered, which affects the application video stream 188 which is generated.

With reference to FIG. 13, a method for enabling a spectator to affect a game utilizing a portable device is shown, in accordance with an embodiment of the invention. At method operation 210, a player initiates gameplay on a console device. At method operation 212, a spectator activates a portable device. At method operation 214, the portable device establishes a network connection with the console device. At method operation 216, information about the game is communicated from the console device to the portable device. In various embodiments, the information may include, by way of example only, characters in play, available costumes for a character, a list of songs available to play as a sound track, or other parameters of the game which the spectator may be able to adjust. At method operation 218, the portable device presents the choices provided by the game to the spectator. At method operation 220, the spectator selects one of the choices. For example, the spectator might pick a song to be played or select a costume to be worn by a character. At method operation 222, the portable device communicates the selection to the console via the network connection. At method operation 224, the console invokes the selection. For example, the console may play the selected song or change the costume of a character to the selected costume.

In various embodiments, the parameters which the spectator is allowed to adjust may or may not affect the outcome of the game to varying degrees. For example, in one embodiment, the spectator is able to modify the costumes or sound effects of the game, such that the outcome of the game is unlikely to be affected. In another embodiment, the spectator might control more intrusive elements of the game, such as the flow of time (e.g. speeding it up or slowing it down), or changing the world's gravity to match that of a larger or smaller planet, or perhaps set no gravity at all. Such changes would likely have a significant impact on the outcome of the game.

With reference to FIG. 14, a diagram illustrating components of a portable device 40 is shown, in accordance with an embodiment of the invention. The portable device 40 includes a processor 300 for executing program instructions. A memory 302 is provided for storage purposes, and may include both volatile and non-volatile memory. A display 304 is included which provides a visual interface that a user may view. A battery 306 is provided as a power source for the portable device 40. A motion detection module 308 may include any of various kinds of motion sensitive hardware, such as a magnetometer 310, an accelerometer 312, and a gyroscope 314.

An accelerometer is a device for measuring acceleration and gravity-induced reaction forces. Single and multiple axis models are available to detect magnitude and direction of the acceleration in different directions. The accelerometer is used to sense inclination, vibration, and shock. In one embodiment, three accelerometers 312 are used to provide the direction of gravity, which gives an absolute reference for two angles (world-space pitch and world-space roll).
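The two absolute angles mentioned above follow from the gravity vector by standard trigonometry. This is a common formulation, given here as a sketch; the axis convention (z out of the screen) is an assumption, not a quotation of the device's firmware.

```python
import math

def pitch_roll_from_gravity(ax, ay, az):
    """World-space pitch and roll (radians) from an accelerometer reading
    of the gravity vector with the device at rest."""
    pitch = math.atan2(-ax, math.hypot(ay, az))
    roll = math.atan2(ay, az)
    return pitch, roll
```

With the device lying flat (gravity entirely along z), both angles are zero; yaw remains unobservable from gravity alone, which is why the magnetometer described next is needed.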

A magnetometer measures the strength and direction of the magnetic field in the vicinity of the controller. In one embodiment, three magnetometers 310 are used within the controller, ensuring an absolute reference for the world-space yaw angle. In one embodiment, the magnetometer is designed to span the earth's magnetic field, which is ±80 microtesla. Magnetometers are affected by metal, and provide a yaw measurement that is monotonic with actual yaw. The magnetic field may be warped due to metal in the environment, which causes a warp in the yaw measurement. If necessary, this warp can be calibrated using information from other sensors such as the gyroscope or the camera. In one embodiment, accelerometer 312 is used together with magnetometer 310 to obtain the inclination and azimuth of the portable device 40.
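With the device held level, the azimuth follows from the horizontal magnetic-field components; combining this with the accelerometer's inclination yields the full orientation. The sign and axis conventions below are assumptions for illustration, and a real implementation would first tilt-compensate the reading using the accelerometer.

```python
import math

def yaw_from_magnetometer(mx, my):
    """Azimuth (radians from magnetic north) for a level device, computed
    from the two horizontal magnetometer components."""
    return math.atan2(-my, mx)
```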

A gyroscope is a device for measuring or maintaining orientation, based on the principles of angular momentum. In one embodiment, three gyroscopes 314 provide information about movement across the respective axes (x, y, and z) based on inertial sensing. The gyroscopes help in detecting fast rotations. However, the gyroscopes can drift over time without the existence of an absolute reference. This requires resetting the gyroscopes periodically, which can be done using other available information, such as positional/orientation determination based on visual tracking of an object, accelerometer, magnetometer, etc.
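One standard way to correct the drifting gyroscope estimate against an absolute reference (magnetometer, camera tracking, etc.) is a complementary filter; the blend factor below is an illustrative assumption.

```python
def fuse_yaw(gyro_yaw, reference_yaw, alpha=0.98):
    """Blend the fast-but-drifting gyro yaw with a slow absolute reference.
    An alpha near 1 trusts the gyro over short timescales while the
    reference gradually removes accumulated long-term drift."""
    return alpha * gyro_yaw + (1.0 - alpha) * reference_yaw
```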

A camera 316 is provided for capturing images and image streams of a real environment. More than one camera may be included in the portable device 40, including a camera that is rear-facing (directed away from a user when the user is viewing the display of the portable device), and a camera that is front-facing (directed towards the user when the user is viewing the display of the portable device). Additionally, a depth camera 318 may be included in the portable device for sensing depth information of objects in a real environment.

The portable device 40 includes speakers 320 for providing audio output. Also, a microphone 322 may be included for capturing audio from the real environment, including sounds from the ambient environment, speech made by the user, etc. The portable device 40 includes a tactile feedback module 324 for providing tactile feedback to the user. In one embodiment, the tactile feedback module 324 is capable of causing movement and/or vibration of the portable device 40 so as to provide tactile feedback to the user.

LEDs 326 are provided as visual indicators of statuses of the portable device 40. For example, an LED may indicate battery level, power on, etc. A card reader 328 is provided to enable the portable device 40 to read and write information to and from a memory card. A USB interface 330 is included as one example of an interface for enabling connection of peripheral devices, or connection to other devices, such as other portable devices, computers, etc. In various embodiments of the portable device 40, any of various kinds of interfaces may be included to enable greater connectivity of the portable device 40.

A WiFi module 332 is included for enabling connection to the Internet via wireless networking technologies. Also, the portable device 40 includes a Bluetooth module 334 for enabling wireless connection to other devices. A communications link 336 may also be included for connection to other devices. In one embodiment, the communications link 336 utilizes infrared transmission for wireless communication. In other embodiments, the communications link 336 may utilize any of various wireless or wired transmission protocols for communication with other devices.

Input buttons/sensors 338 are included to provide an input interface for the user. Any of various kinds of input interfaces may be included, such as buttons, a touchpad, a joystick, a trackball, etc. Additionally, bio-sensors may be included to enable detection of physiological data from a user. In one embodiment, the bio-sensors include one or more dry electrodes for detecting bio-electric signals of the user through the user's skin.

An ultra-sonic communication module 340 may be included in the portable device 40 for facilitating communication with other devices via ultra-sonic technologies.

The foregoing components of the portable device 40 have been described as merely exemplary components that may be included in the portable device 40. In various embodiments of the invention, the portable device 40 may or may not include some of the various aforementioned components. Embodiments of the portable device 40 may additionally include other components not presently described, but known in the art, for purposes of facilitating aspects of the present invention as herein described.

FIG. 15 illustrates hardware and user interfaces that may be used to execute and render an interactive application, in accordance with one embodiment of the present invention. More specifically, FIG. 15 schematically illustrates the overall system architecture of the Sony® Playstation 3® entertainment device, a console that may be compatible for interfacing a control device and a handheld device with a computer program executing at a base computing device in accordance with embodiments of the present invention. A system unit 700 is provided, with various peripheral devices connectable to the system unit 700. The system unit 700 comprises: a Cell processor 728; a Rambus® dynamic random access memory (XDRAM) unit 726; a Reality Synthesizer graphics unit 730 with a dedicated video random access memory (VRAM) unit 732; and an I/O bridge 734. The system unit 700 also comprises a Blu Ray® Disk BD-ROM® optical disk reader 740 for reading from a disk 740a and a removable slot-in hard disk drive (HDD) 736, accessible through the I/O bridge 734. Optionally, the system unit 700 also comprises a memory card reader 738 for reading compact flash memory cards, Memory Stick® memory cards and the like, which is similarly accessible through the I/O bridge 734.

The I/O bridge 734 also connects to six Universal Serial Bus (USB) 2.0 ports 724; a gigabit Ethernet port 722; an IEEE 802.11b/g wireless network (Wi-Fi) port 720; and a Bluetooth® wireless link port 718 capable of supporting up to seven Bluetooth connections.

In operation, the I/O bridge 734 handles all wireless, USB and Ethernet data, including data from one or more game controllers 702-703. For example, when a user is playing a game, the I/O bridge 734 receives data from the game controller 702-703 via a Bluetooth link and directs it to the Cell processor 728, which updates the current state of the game accordingly.

The wireless, USB and Ethernet ports also provide connectivity for other peripheral devices in addition to game controllers 702-703, such as: a remote control 704; a keyboard 706; a mouse 708; a portable entertainment device 710 such as a Sony Playstation Portable® entertainment device; a video camera such as an EyeToy® video camera 712; a microphone headset 714; and a microphone 715. Such peripheral devices may therefore in principle be connected to the system unit 700 wirelessly; for example, the portable entertainment device 710 may communicate via a Wi-Fi ad-hoc connection, whilst the microphone headset 714 may communicate via a Bluetooth link.

The provision of these interfaces means that the Playstation 3 device is also potentially compatible with other peripheral devices such as digital video recorders (DVRs), set-top boxes, digital cameras, portable media players, Voice over IP telephones, mobile telephones, printers and scanners.

In addition, a legacy memory card reader 716 may be connected to the system unit via a USB port 724, enabling the reading of memory cards 748 of the kind used by the Playstation® or Playstation 2® devices.

The game controllers 702-703 are operable to communicate wirelessly with the system unit 700 via the Bluetooth link, or to be connected to a USB port, thereby also providing power by which to charge the battery of the game controllers 702-703. Game controllers 702-703 can also include memory, a processor, a memory card reader, permanent memory such as flash memory, light emitters such as an illuminated spherical section, LEDs, or infrared lights, a microphone and speaker for ultrasound communications, an acoustic chamber, a digital camera, an internal clock, a recognizable shape such as the spherical section facing the game console, and wireless communications using protocols such as Bluetooth®, WiFi™, etc.

Game controller 702 is a controller designed to be used with two hands, and game controller 703 is a single-hand controller with an attachment. In addition to one or more analog joysticks and conventional control buttons, the game controller is susceptible to three-dimensional location determination. Consequently, gestures and movements by the user of the game controller may be translated as inputs to a game in addition to or instead of conventional button or joystick commands. Optionally, other wirelessly enabled peripheral devices such as the Playstation™ Portable device may be used as a controller. In the case of the Playstation™ Portable device, additional game or control information (for example, control instructions or number of lives) may be provided on the screen of the device. Other alternative or supplementary control devices may also be used, such as a dance mat (not shown), a light gun (not shown), a steering wheel and pedals (not shown) or bespoke controllers, such as a single or several large buttons for a rapid-response quiz game (also not shown).

The remote control 704 is also operable to communicate wirelessly with the system unit 700 via a Bluetooth link. The remote control 704 comprises controls suitable for the operation of the Blu Ray™ Disk BD-ROM reader 740 and for the navigation of disk content.

The Blu Ray™ Disk BD-ROM reader 740 is operable to read CD-ROMs compatible with the Playstation and PlayStation 2 devices, in addition to conventional pre-recorded and recordable CDs, and so-called Super Audio CDs. The reader 740 is also operable to read DVD-ROMs compatible with the Playstation 2 and PlayStation 3 devices, in addition to conventional pre-recorded and recordable DVDs. The reader 740 is further operable to read BD-ROMs compatible with the Playstation 3 device, as well as conventional pre-recorded and recordable Blu-Ray Disks.

The system unit 700 is operable to supply audio and video, either generated or decoded by the Playstation 3 device via the Reality Synthesizer graphics unit 730, through audio and video connectors to a display and sound output device 742 such as a monitor or television set having a display 744 and one or more loudspeakers 746. The audio connectors 750 may include conventional analogue and digital outputs, whilst the video connectors 752 may variously include component video, S-video, composite video and one or more High Definition Multimedia Interface (HDMI) outputs. Consequently, video output may be in formats such as PAL or NTSC, or in 720p, 1080i or 1080p high definition.

Audio processing (generation, decoding and so on) is performed by the Cell processor 728. The Playstation 3 device's operating system supports Dolby® 5.1 surround sound, Dolby® Theatre Surround (DTS), and the decoding of 7.1 surround sound from Blu-Ray® disks.

In the present embodiment, the video camera 712 comprises a single charge coupled device (CCD), an LED indicator, and hardware-based real-time data compression and encoding apparatus so that compressed video data may be transmitted in an appropriate format such as an intra-image based MPEG (motion picture expert group) standard for decoding by the system unit 700. The camera LED indicator is arranged to illuminate in response to appropriate control data from the system unit 700, for example to signify adverse lighting conditions. Embodiments of the video camera 712 may variously connect to the system unit 700 via a USB, Bluetooth or Wi-Fi communication port. Embodiments of the video camera may include one or more associated microphones and also be capable of transmitting audio data. In embodiments of the video camera, the CCD may have a resolution suitable for high-definition video capture. In use, images captured by the video camera may for example be incorporated within a game or interpreted as game control inputs. In another embodiment the camera is an infrared camera suitable for detecting infrared light.

In general, in order for successful data communication to occur with a peripheral device such as a video camera or remote control via one of the communication ports of the system unit 700, an appropriate piece of software such as a device driver should be provided. Device driver technology is well-known and will not be described in detail here, except to say that the skilled man will be aware that a device driver or similar software interface may be required in the embodiment presently described.

The aforementioned system devices, including a console device and a portable handheld device, constitute means for enabling the handheld device to display and capture video of an interactive session of an interactive application presented on a main display. The console device constitutes means for initiating the interactive session of the interactive application, the interactive session defining interactivity between a user and the interactive application. The system devices constitute means for determining an initial position and orientation of the handheld device operated by a spectator. The console device constitutes means for determining a current state of the interactive application based on the interactivity between the user and the interactive application. The system devices constitute means for tracking the position and orientation of the handheld device during the interactive session. The system devices constitute means for generating a spectator video stream of the interactive session based on the current state of the interactive application and the tracked position and orientation of the handheld device. The handheld device constitutes means for rendering the spectator video stream on a handheld display of the handheld device.
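The means-plus-function recitation above outlines a pipeline: track the handheld's position and orientation, map that pose to a virtual viewpoint within the application's virtual space, and generate the spectator stream from that viewpoint. The following is a minimal sketch of the pose-to-viewpoint mapping only; the class names, the linear scaling, and the placeholder frame structure are assumptions for illustration, not the claimed implementation:

```python
from dataclasses import dataclass


@dataclass
class Pose:
    position: tuple      # (x, y, z) coordinates
    orientation: tuple   # (yaw, pitch, roll) in degrees


def virtual_viewpoint(handheld: Pose, scale: float = 2.0) -> Pose:
    """Map the handheld's tracked physical pose to a virtual-camera pose.

    A simple linear mapping: physical translation is scaled into the
    larger virtual space; orientation is passed through unchanged, so
    turning the handheld turns the virtual viewpoint.
    """
    x, y, z = handheld.position
    return Pose((x * scale, y * scale, z * scale), handheld.orientation)


def render_spectator_frame(app_state: dict, camera: Pose) -> dict:
    """Produce one spectator frame from the current application state
    and the virtual-camera pose (placeholder for actual rendering)."""
    return {"frame_of": app_state["tick"], "camera": camera}
```

Each frame of the spectator stream would be generated by re-evaluating the viewpoint from the latest tracked pose, so a spectator physically moving the handheld effectively moves a camera through the virtual space.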

Embodiments of the present invention may be practiced with various computer system configurations including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers and the like. The invention can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a network.

With the above embodiments in mind, it should be understood that the invention can employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Any of the operations described herein that form part of the invention are useful machine operations. The invention also relates to a device or an apparatus for performing these operations. The apparatus may be specially constructed for the required purpose, such as a special purpose computer. When defined as a special purpose computer, the computer can also perform other processing, program execution or routines that are not part of the special purpose, while still being capable of operating for the special purpose. Alternatively, the operations may be processed by a general purpose computer selectively activated or configured by one or more computer programs stored in the computer memory, cache, or obtained over a network. When data is obtained over a network, the data may be processed by other computers on the network, e.g., a cloud of computing resources.

The embodiments of the present invention can also be defined as a machine that transforms data from one state to another state. The transformed data can be saved to storage and then manipulated by a processor. The processor thus transforms the data from one thing to another. Still further, the methods can be processed by one or more machines or processors that can be connected over a network. Each machine can transform data from one state or thing to another, and can also process data, save data to storage, transmit data over a network, display the result, or communicate the result to another machine.

One or more embodiments of the present invention can also be fabricated as computer readable code on a computer readable medium. The computer readable medium is any data storage device that can store data, which can thereafter be read by a computer system. Examples of the computer readable medium include hard drives, network attached storage (NAS), read-only memory, random-access memory, CD-ROMs, CD-Rs, CD-RWs, magnetic tapes and other optical and non-optical data storage devices. The computer readable medium can include a computer readable tangible medium distributed over a network-coupled computer system so that the computer readable code is stored and executed in a distributed fashion.

Although the method operations were described in a specific order, it should be understood that other housekeeping operations may be performed in between operations, or operations may be adjusted so that they occur at slightly different times, or may be distributed in a system which allows the occurrence of the processing operations at various intervals associated with the processing, as long as the processing of the overlay operations is performed in the desired way.

Although the foregoing invention has been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications can be practiced within the scope of the appended claims. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.

Claims

  1. A handheld device, comprising: a sensor configured to generate sensor data for determining and tracking a position and orientation of the handheld device during an interactive session of an interactive application presented on a main display, the interactive session being defined for interactivity between a user and the interactive application; a communications module configured to send the sensor data to a computing device, the communications module being further configured to receive from the computing device a spectator video stream of the interactive session that is generated based on a state of the interactive application and the tracked position and orientation of the handheld device, the state of the interactive application being determined based on the interactivity between the user and the interactive application; a display configured to render the spectator video stream; wherein the position and orientation of the handheld device define a position and orientation of a virtual viewpoint within a virtual space defined by the interactive application; and wherein the spectator video stream includes images of the virtual space captured from the perspective of the virtual viewpoint.
  2. The handheld device of claim 1, wherein the state of the interactive application is determined from interactive input, the interactive input being received from a controller device during the interactive session.
  3. The handheld device of claim 1, wherein the interactive session defines interactivity between the user and an application video stream rendered on the main display, the application video stream being generated based on the state of the interactive application.
  4. The handheld device of claim 1, wherein the sensor is defined by one or more of an accelerometer, a magnetometer, and/or a gyroscope; and wherein the sensor data includes inertial data generated by the sensor.
  5. The handheld device of claim 1, wherein the sensor is defined by an image capture device; and wherein the sensor data is defined by captured image data of an interactive environment in which the interactivity occurs.
  6. The handheld device of claim 5, wherein determining and tracking the position and orientation of the handheld device is based on identification of one or more markers in the interactive environment.
  7. A handheld device, comprising: a sensor configured to generate sensor data for determining and tracking a position and orientation of the handheld device during an interactive session of an interactive application presented on a main display, the interactive session being defined for interactivity between a user and the interactive application; a communications module configured to send the sensor data to a computing device, the communications module being further configured to receive application state data from the computing device that is generated based on a state of the interactive application, the state of the interactive application being determined based on the interactivity between the user and the interactive application; a spectator video stream generator configured to generate a spectator video stream based on the application state data and the tracked position and orientation of the handheld device; a display configured to render the spectator video stream.
  8. The handheld device of claim 7, further comprising: an image capture device configured to capture an environmental video stream of an interactive environment in which the interactivity between the user and the interactive application occurs; wherein the spectator video stream generator is configured to generate the spectator video stream based on the environmental video stream.
  9. The handheld device of claim 8, wherein the spectator video stream generator is configured to generate the spectator video stream by augmenting the environmental video stream with a virtual element.
  10. The handheld device of claim 8, wherein the spectator video stream generator is configured to generate the spectator video stream by replacing at least a portion of the user that is detected in the environmental video stream with a virtual element.
  11. The handheld device of claim 10, wherein the replacing at least a portion of the user includes replacing the detected user in the environmental video stream with a character of the interactive application controlled by the user.
  12. The handheld device of claim 7, wherein the spectator video stream is generated based on a tracked position and orientation of the handheld device relative to a tracked position of the user.
  13. The handheld device of claim 7, wherein the sensor is defined by one or more of an accelerometer, a magnetometer, and/or a gyroscope; and wherein the sensor data includes inertial data generated by the sensor.
  14. A handheld device, comprising: a sensor configured to generate sensor data for determining and tracking a position and orientation of the handheld device during an interactive session of an interactive application presented on a main display, the interactive session being defined for interactivity between a user and the interactive application; a communications module configured to send the sensor data to a computing device, the communications module being further configured to receive from the computing device a spectator video stream of the interactive session that is generated based on a state of the interactive application and the tracked position and orientation of the handheld device, the state of the interactive application being determined based on the interactivity between the user and the interactive application; a display configured to render the spectator video stream; wherein the spectator video stream is generated based on the position and orientation of the handheld device relative to a tracked position of the user during the interactive session; wherein the position and orientation of the handheld device relative to the position of the user define a position and orientation of a virtual viewpoint within a virtual space defined by the interactive application; and wherein an object within a virtual environment of the interactive application is mapped to the position of the user; wherein the spectator video stream includes images of the virtual space captured from the perspective of the virtual viewpoint, the object being included in the images of the spectator video stream when the handheld device is oriented towards the position of the user.
  15. The handheld device of claim 14, wherein the object is controlled by the user.
  16. The handheld device of claim 15, wherein the object is a character of the interactive application.
  17. The handheld device of claim 14, wherein the position of the user is determined based on the position of a controller device operated by the user.
  18. The handheld device of claim 17, further comprising: an image capture device configured to capture an environmental video stream of an interactive environment in which the interactivity between the user and the interactive application occurs; wherein the position of the user and the position and orientation of the handheld device relative to the position of the user are determined at least in part based on the environmental video stream.
