U.S. Pat. No. 11,209,966

EXTENDED ON-SCREEN GAMEPLAY VIA AUGMENTED REALITY

Assignee: Disney Enterprises, Inc.

Issue Date: January 24, 2019

Abstract

Various embodiments of the invention disclosed herein provide techniques for extending on-screen gameplay via an augmented reality (AR) system. An extended AR application executing on an AR headset system receives, via a game controller, first data associated with a first object associated with a computer-generated game. The extended AR application renders an augmented reality object based on the first data associated with the first object. The extended AR application displays at least a first portion of the augmented reality object via an augmented reality headset system. Further, an image associated with the computer-generated game is simultaneously rendered on a display monitor.

Description

DETAILED DESCRIPTION

In the following description, numerous specific details are set forth to provide a more thorough understanding of the present invention. However, it will be apparent to one of skill in the art that embodiments of the present invention may be practiced without one or more of these specific details.

System Overview

FIG. 1 illustrates a system 100 configured to implement one or more aspects of the present invention. As shown, the system includes, without limitation, a gaming console 102, an AR headset system 104, a display monitor 106, and a game controller 108 in communication with each other via a computer network 120. Computer network 120 may be any suitable environment to enable communications among remote or local computer systems and computing devices, including, without limitation, point-to-point communications channels, Bluetooth, WiFi, infrared communications, wireless and wired LANs (Local Area Networks), and one or more internet-based WANs (Wide Area Networks).

Gaming console 102 includes, without limitation, a computing device that may be a standalone server, a cluster or “farm” of servers, one or more network appliances, or any other device suitable for implementing one or more aspects of the present invention. Illustratively, gaming console 102 communicates over computer network 120 via communications link 112.

In operation, gaming console 102 executes an extended gaming application to control various aspects of gameplay. Gaming console 102 receives control inputs from AR headset system 104 related to tracking data. The tracking data includes the position and/or orientation of the AR headset system 104. Additionally or alternatively, the tracking data includes the position and/or orientation of one or more objects detected by AR headset system 104.

Similarly, gaming console 102 receives control inputs from game controller 108. The control inputs include, without limitation, button presses, trigger activations, and tracking data associated with game controller 108. The tracking data may include the location and/or orientation of the game controller 108. Game controller 108 transmits control inputs to gaming console 102. In this manner, gaming console 102 determines which controls of game controller 108 are active. Further, gaming console 102 tracks the location and orientation of the game controller 108.
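The patent does not specify a wire format for these control inputs. As a minimal sketch, the tracking data and controller inputs described above might be structured as messages along the following lines; all type and field names here are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import FrozenSet, Tuple

@dataclass
class TrackingData:
    # Position in meters within the room; orientation as a quaternion (w, x, y, z).
    position: Tuple[float, float, float]
    orientation: Tuple[float, float, float, float]

@dataclass
class ControllerInput:
    buttons_pressed: FrozenSet[str]   # e.g. frozenset({"A", "trigger"})
    trigger_value: float              # analog trigger, 0.0 to 1.0
    tracking: TrackingData            # location/orientation of game controller 108
```

Gaming console 102 would receive such messages over communications links 114 and 118 and update game state accordingly.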

AR headset system 104 includes, without limitation, a computing device that may be a personal computer, personal digital assistant, mobile phone, mobile device, or any other device suitable for implementing one or more aspects of the present invention. In some embodiments, AR headset system 104 may include an embedded computing system that is integrated within augmented reality goggles, augmented reality glasses, a heads-up display (HUD), a handheld device, or any other technically feasible AR viewing device. Illustratively, AR headset system 104 communicates over network 120 via communications link 114.

In operation, AR headset system 104 transmits tracking data to gaming console 102. The tracking data may include the location and/or orientation of the AR headset system 104. AR headset system 104 transmits the location and orientation of AR headset system 104 to gaming console 102. In this manner, gaming console 102 tracks the location and orientation of the head of the user.

Additionally or alternatively, the tracking data may include location and/or orientation of objects detected by the AR headset system 104. As one example, AR headset system 104 could include a camera and a mechanism for tracking objects visible in images captured by the camera. AR headset system 104 could detect when the hands of the user are included in the image captured by the camera. AR headset system 104 could then determine the location and orientation of the hands of the user. AR headset system 104 would then transmit the location and orientation of the hands of the user to gaming console 102.
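As a concrete but purely hypothetical illustration of this camera-based hand-tracking flow, the sketch below uses OpenCV and MediaPipe as stand-in libraries; the patent does not specify which tracker AR headset system 104 actually uses.

```python
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=2)
cap = cv2.VideoCapture(0)  # stand-in for the headset's forward-facing camera

ok, frame = cap.read()
if ok:
    # MediaPipe expects RGB input; OpenCV captures frames in BGR order.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        for hand in results.multi_hand_landmarks:
            wrist = hand.landmark[0]  # normalized image coordinates
            # Here the headset would transmit the hand pose to gaming console 102.
            print(wrist.x, wrist.y, wrist.z)
cap.release()
```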

AR headset system 104 receives location and/or orientation data from gaming console 102 regarding objects to display as AR objects. In response, AR headset system 104 renders and displays the AR objects. In this manner, the user sees the AR objects in 3D space in addition to seeing the images rendered by gaming console 102 and displayed on display monitor 106.

Display monitor 106 includes, without limitation, any conventional display device such as a cathode ray tube, liquid crystal display, light-emitting diode display, or the like. Display monitor 106 includes a computing device that may be an embedded processor, a personal computer, personal digital assistant, mobile phone, mobile device, or any other device suitable for implementing one or more aspects of the present invention. Illustratively, display monitor 106 communicates over network 120 via communications link 116. In operation, display monitor 106 receives image data from gaming console 102, and displays the image data on the display device. Additionally or alternatively, display monitor 106 receives image data from AR headset system 104, and displays the image data on the display device.

Game controller 108 is a device that includes one or more controls that may be activated by a user. Such controls include, without limitation, buttons, a joystick, a “weapon” that may be aimed and fired, and a steering mechanism. Illustratively, game controller 108 communicates over computer network 120 via communications link 118. In operation, game controller 108 detects when one or more controls are activated, such as when a button is pushed, a joystick is moved, or a weapon is aimed and fired. Game controller 108 converts the detected activations into electronic data and transmits the electronic data to one or both of gaming console 102 and AR headset system 104 over communications link 118.
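A minimal sketch of this controller-side behavior, assuming a simple JSON-over-UDP encoding; the address, port, and payload layout are invented for illustration and not part of the disclosure.

```python
import json
import socket

CONSOLE_ADDR = ("192.168.1.10", 9000)  # assumed network address of gaming console 102

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_activations(state: dict) -> None:
    """Encode detected control activations and transmit them over link 118."""
    payload = json.dumps(state).encode("utf-8")
    sock.sendto(payload, CONSOLE_ADDR)

# Example: a button press plus a joystick deflection, as one possible payload.
send_activations({"button_a": True, "joystick": [0.2, -0.7]})
```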

It will be appreciated that the system shown herein is illustrative and that variations and modifications are possible. In one example, although the system 100 of FIG. 1 is illustrated with one gaming console 102, one AR headset system 104, one display monitor 106, and one game controller 108, the system 100 could include any technically feasible number of gaming consoles 102, AR headset systems 104, display monitors 106, and game controllers 108 within the scope of the present disclosure. In another example, the techniques are disclosed as being executed in part on gaming console 102 and in part on AR headset system 104. However, the disclosed techniques could be performed entirely on gaming console 102 and/or entirely on AR headset system 104 within the scope of the present disclosure.

In yet another example, the techniques are disclosed herein in the context of computer gaming environments. However, the disclosed techniques could be employed in any technically feasible environment within the scope of the present disclosure. The disclosed techniques could be employed in teleconferencing applications where individuals or multi-person groups, such as a group convened in a corporate conference room, communicate and/or collaboratively work with each other. Additionally or alternatively, the disclosed techniques could be employed in scenarios where a user engages in an interactive meeting with a physician or other professional. Additionally or alternatively, the disclosed techniques could be employed in collaborative work scenarios where a single user, multiple users, and/or groups of users review and edit various documents that appear on a display monitor 106 and/or in 3D space as AR objects rendered and displayed by AR headset system 104. Any or all of these embodiments are within the scope of the present disclosure, in any technically feasible combination.

In some embodiments, multiple users access one or more systems, such as system 100. For example, a first user and a second user could each access a different system 100 to execute a remote multiplayer game. A first user could access a first system 100 that scans and analyzes a physical environment associated with the first user. Similarly, a second user could access a second system 100 that scans and analyzes a physical environment associated with the second user. The system 100 associated with the first user could exchange this physical environment data with the system 100 associated with the second user. Consequently, when the first user and/or the second user enter a common area of a virtual map associated with the remote multiplayer game, the first user could view a representation of the physical environment of the second user on the display monitor 106 associated with the first user. Likewise, the second user could view a representation of the physical environment of the first user on the display monitor 106 associated with the second user. In this manner, the first user and the second user each see a virtual viewport into the physical environment of the other user. Further, either user could “enter” the representation of the physical environment of the other user, such that the first user and the second user appear to occupy the same virtual space, even though the first user and the second user are physically remote from one another.

In some embodiments, a user associated with the system 100 may employ a wearable computing device, such as a smartwatch. In such embodiments, the display of the wearable computing device may replace and/or augment the display of display monitor 106. Similarly, the tracking data and/or other controls of the wearable computing device may replace and/or augment the tracking data and/or other controls associated with AR headset system 104 and/or game controller 108. In this manner, such a wearable computing device may be capable of performing any or all of the techniques disclosed herein.

Techniques for rendering and displaying 2D and 3D AR objects as part of an immersive computer gaming experience are now described in greater detail below in conjunction with FIGS. 2-9C.

Extended On-Screen Gameplay Via Augmented Reality

FIG. 2 is a more detailed illustration of the gaming console 102 of FIG. 1, according to various embodiments of the present invention. As shown, gaming console 102 includes, without limitation, a central processing unit (CPU) 202, storage 204, an input/output (I/O) devices interface 206, a network interface 208, an interconnect 210, and a system memory 212.

The processor 202 retrieves and executes programming instructions stored in the system memory 212. Similarly, the processor 202 stores and retrieves application data residing in the system memory 212. The interconnect 210 facilitates transmission, such as of programming instructions and application data, between the processor 202, input/output (I/O) devices interface 206, storage 204, network interface 208, and system memory 212. The I/O devices interface 206 is configured to receive input data from user I/O devices 222. Examples of user I/O devices 222 may include one or more buttons, a keyboard, and a mouse or other pointing device. The I/O devices interface 206 may also include an audio output unit configured to generate an electrical audio output signal, and user I/O devices 222 may further include a speaker configured to generate an acoustic output in response to the electrical audio output signal. Another example of a user I/O device 222 is a display device that generally represents any technically feasible means for generating an image for display. For example, the display device could be a liquid crystal display (LCD) display, CRT display, or DLP display. The display device may be a TV that includes a broadcast or cable tuner for receiving digital or analog television signals.

Processor 202 is included to be representative of a single CPU, multiple CPUs, a single CPU having multiple processing cores, and the like. And the system memory 212 is generally included to be representative of a random access memory. The storage 204 may be a disk drive storage device. Although shown as a single unit, the storage 204 may be a combination of fixed and/or removable storage devices, such as fixed disc drives, floppy disc drives, tape drives, removable memory cards, optical storage, network attached storage (NAS), or a storage area network (SAN). Processor 202 communicates to other computing devices and systems via network interface 208, where network interface 208 is configured to transmit and receive data via a communications network.

The system memory 212 includes, without limitation, an extended gaming application 232 and a data store 242. Extended gaming application 232, when executed by the processor 202, performs one or more operations associated with gaming console 102 of FIG. 1, as further described herein. Data store 242 provides memory storage for various items of 2D and 3D gaming content, as further described herein. Extended gaming application 232 stores data in and retrieves data from data store 242, as further described herein.

In operation, extended gaming application 232 controls various aspects of gameplay for a particular on-screen computer game. Extended gaming application 232 receives control inputs from AR headset system 104 related to tracking data. The tracking data includes the position and/or orientation of the AR headset system 104. Additionally or alternatively, the tracking data includes the position and/or orientation of one or more objects detected by AR headset system 104.

Similarly, extended gaming application 232 receives control inputs from game controller 108. The control inputs include, without limitation, button presses, trigger activations, and tracking data associated with game controller 108. The tracking data may include the location and/or orientation of the game controller 108. Game controller 108 transmits control inputs to extended gaming application 232. In this manner, extended gaming application 232 determines which controls of game controller 108 are active. Further, extended gaming application 232 tracks the location and orientation of the game controller 108.

Extended gaming application 232 proceeds through gameplay based on events within the executing game, control inputs received from AR headset system 104, and control inputs received from game controller 108. Extended gaming application 232 detects when an object being displayed on display monitor 106 is transitioning beyond the bounds of the display monitor 106 and is about to enter the 3D space outside of the display monitor 106. In response, extended gaming application 232 transmits position data, orientation data, and other related information to AR headset system 104. Extended gaming application 232 removes the object from being displayed on display monitor 106. In response to receiving the position data, orientation data, and other related information from extended gaming application 232, AR headset system 104 renders and displays the AR objects. In this manner, the user sees the AR objects in 3D space in addition to seeing the images rendered by extended gaming application 232 and displayed on display monitor 106.

Similarly, extended gaming application 232 detects when an object being displayed as an AR object by AR headset system 104 is transitioning from 3D space into the bounds of the display monitor 106. In response, extended gaming application 232 transmits position data, orientation data, and other related information to AR headset system 104. In response, AR headset system 104 stops rendering and displaying the AR object. Extended gaming application 232 then renders and displays the object on display monitor 106.

When exiting or entering the bounds of the display monitor 106, an object may undergo a transition phase, where a portion of the object is rendered and displayed on display monitor 106 and a portion of the object is rendered and displayed as an AR object in 3D space. During the transition phase, extended gaming application 232 determines a first portion of the object that is to be displayed on display monitor 106. Extended gaming application 232 renders and displays the first portion of the object on display monitor 106. Similarly, extended gaming application 232 determines a second portion of the object that is to be displayed as an AR object. Extended gaming application 232 transmits position data, orientation data, and other information related to the second portion of the object to AR headset system 104. In response to receiving the position data, orientation data, and other related information from extended gaming application 232, AR headset system 104 renders and displays the second portion of the object as an AR object.
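The transition-phase split can be illustrated with a simple clipping computation. The sketch below assumes the object and the visible screen area can both be approximated as axis-aligned rectangles in a shared screen-plane coordinate system; a real implementation would clip arbitrary 2D/3D geometry, but the bookkeeping is the same: the intersection goes to display monitor 106 and the remainder is handed to AR headset system 104.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass(frozen=True)
class Rect:
    x: float
    y: float
    w: float
    h: float

def split_at_monitor(obj: Rect, monitor: Rect) -> Tuple[Optional[Rect], bool]:
    """Return the on-screen portion of obj and whether any part lies off screen."""
    ix = max(obj.x, monitor.x)
    iy = max(obj.y, monitor.y)
    iw = min(obj.x + obj.w, monitor.x + monitor.w) - ix
    ih = min(obj.y + obj.h, monitor.y + monitor.h) - iy
    # The intersection is the first portion, rendered on the display monitor.
    on_screen = Rect(ix, iy, iw, ih) if iw > 0 and ih > 0 else None
    # Any remainder is the second portion, sent to the AR headset as pose data.
    has_ar_portion = on_screen != obj
    return on_screen, has_ar_portion
```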

FIG. 3 is a more detailed illustration of the AR headset system 104 of FIG. 1, according to various embodiments of the present invention. As shown, AR headset system 104 includes, without limitation, a central processing unit (CPU) 302, storage 304, an input/output (I/O) devices interface 306, a network interface 308, an interconnect 310, a system memory 312, a vision capture device 314, a display 316, and sensors 318.

The processor 302 retrieves and executes programming instructions stored in the system memory 312. Similarly, the processor 302 stores and retrieves application data residing in the system memory 312. The interconnect 310 facilitates transmission, such as of programming instructions and application data, between the processor 302, input/output (I/O) devices interface 306, storage 304, network interface 308, system memory 312, vision capture device 314, display 316, and sensors 318. The I/O devices interface 306 is configured to receive input data from user I/O devices 322. Examples of user I/O devices 322 may include one or more buttons, a keyboard, and a mouse or other pointing device. The I/O devices interface 306 may also include an audio output unit configured to generate an electrical audio output signal, and user I/O devices 322 may further include a speaker configured to generate an acoustic output in response to the electrical audio output signal. Another example of a user I/O device 322 is a display device that generally represents any technically feasible means for generating an image for display. For example, the display device could be a liquid crystal display (LCD), light emitting diode (LED) display, CRT display, or DLP display. The display device may be a TV that includes a broadcast or cable tuner for receiving digital or analog television signals.

The vision capture device 314 includes one or more cameras to capture images from the physical environment for analysis, processing, and display. In operation, the vision capture device 314 captures and transmits vision information to any one or more other elements included in the AR headset system 104. In some embodiments, the vision capture device 314 provides support for various vision-related functions, including, without limitation, image recognition, visual inertial odometry, and simultaneous localization and mapping.

The display 316 includes one or more display devices for displaying AR objects and other AR content. The display may be embedded into a head-mounted display (HMD) system that is integrated into the AR headset system 104. The display 316 reflects, overlays, and/or generates an image including one or more AR objects into or onto the physical environment via an LCD display, LED display, projector, or any other technically feasible display technology. The display 316 may employ any technically feasible approach to integrate AR objects into the physical environment, including, without limitation, pass-through, waveguide, and screen-mirror optics approaches.

The sensors 318 include one or more devices to acquire location and orientation data associated with the AR headset system 104. The sensors 318 may employ any technically feasible approach to acquire location and orientation data, including, without limitation, gravity-sensing approaches and magnetic-field-sensing approaches. In that regard, the sensors 318 may include any one or more accelerometers, gyroscopes, magnetometers, and/or any other technically feasible devices for acquiring location and orientation data. The location and orientation data acquired by sensors 318 may be supplemental or an alternative to camera orientation data, e.g., yaw, pitch, and roll data, generated by the vision capture device 314.

Processor 302 is included to be representative of a single CPU, multiple CPUs, a single CPU having multiple processing cores, and the like. And the system memory 312 is generally included to be representative of a random access memory. The storage 304 may be a disk drive storage device. Although shown as a single unit, the storage 304 may be a combination of fixed and/or removable storage devices, such as fixed disc drives, floppy disc drives, tape drives, removable memory cards, optical storage, network attached storage (NAS), or a storage area network (SAN). Processor 302 communicates to other computing devices and systems via network interface 308, where network interface 308 is configured to transmit and receive data via a communications network.

The system memory 312 includes, without limitation, an extended AR application 332 and a data store 342. Extended AR application 332, when executed by the processor 302, performs one or more operations associated with AR headset system 104 of FIG. 1, as further described herein. Data store 342 provides memory storage for various items of 2D and 3D gaming content, as further described herein. Extended AR application 332 stores data in and retrieves data from data store 342, as further described herein.

In operation, extended AR application 332 transmits tracking data to gaming console 102. The tracking data may include the location and/or orientation of the AR headset system 104. Extended AR application 332 transmits the location and orientation of AR headset system 104 to gaming console 102. In this manner, gaming console 102 tracks the location and orientation of the head of the user.

Additionally or alternatively, extended AR application 332 transmits the location and/or orientation of objects detected by the extended AR application 332. As one example, AR headset system 104 could include a camera and a mechanism for tracking objects visible in images captured by the camera. Extended AR application 332 could detect when the hands of the user are included in the image captured by the camera. Extended AR application 332 could then determine the location and orientation of the hands of the user. Extended AR application 332 would then transmit the location and orientation of the hands of the user to gaming console 102.

Further, extended AR application 332 receives position data, orientation data, and other related information from gaming console 102 regarding objects transitioning from being displayed on display monitor 106 to being displayed as an AR object. In response to receiving such position data, orientation data, and other related information, extended AR application 332 renders and displays the AR objects. Similarly, extended AR application 332 receives position data, orientation data, and other related information from gaming console 102 regarding objects transitioning from being displayed as an AR object to being displayed on display monitor 106. In response to receiving such position data, orientation data, and other related information, extended AR application 332 stops rendering and displaying the AR object. In this manner, the user sees the AR objects rendered by extended AR application 332 in 3D space in addition to seeing the images rendered by extended gaming application 232 and displayed on display monitor 106.

When exiting or entering the bounds of the display monitor 106, an object may undergo a transition phase, where a portion of the object is rendered and displayed on display monitor 106 and a portion of the object is rendered and displayed as an AR object in 3D space. During the transition phase, extended AR application 332 receives position data, orientation data, and other related information from gaming console 102 regarding the portion of the object that is to be rendered and displayed as an AR object. In response to receiving such position data, orientation data, and other related information, extended AR application 332 renders and displays the portion of the object to be displayed as an AR object.

FIG. 4 illustrates an exemplary calibration process for the system 100 of FIG. 1, according to various embodiments of the present invention. As shown, a user 402 views a display monitor 408 while wearing an AR headset system 404 and holding a game controller 406. The display monitor 408 displays two calibration glyphs 410 and 412.

Gaming console 102 displays calibration glyphs 410 and 412 prior to gameplay to perform initial calibration of the system 100. The user views calibration glyphs 410 and 412 via AR headset system 404. AR headset system 404 transmits position and/or orientation data of calibration glyphs 410 and 412 to gaming console 102. Gaming console 102 accesses data regarding the physical screen size of display monitor 408 as well as the original size of calibration glyphs 410 and 412. In some embodiments, the physical screen size of display monitor 408, the original size of calibration glyphs 410 and 412, and other related information may be encoded as data in calibration glyphs 410 and 412. The data could be included in calibration glyphs 410 and 412 as QR codes (as shown) or in any other technically feasible encoding format. Based on this data, and on the position and/or orientation data of calibration glyphs 410 and 412 received from AR headset system 404, gaming console 102 determines the location, size, and orientation of display monitor 408 with respect to the user 402. After initial calibration, gaming console 102 may continually update calibration data. To perform continual calibration, calibration glyphs 410 and 412 or other suitable glyphs may be encoded into the images displayed on display monitor 408 at regular intervals. Calibration glyphs 410 and 412 could be encoded as watermarks in the game images rendered by gaming console 102 so that the user 402 is unable to detect that calibration glyphs 410 and 412 are included in the display image.
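The geometry behind this glyph-based calibration can be sketched with the standard pinhole-camera relation. Assuming the glyph's true physical size is decoded from its QR payload and its apparent size is measured in the headset camera image, the distance to the glyph follows directly; two glyphs at known screen positions then constrain the monitor's extent and pose. The focal length below is an assumed example value, not a figure from the patent.

```python
def glyph_distance_m(true_size_m: float, size_px: float, focal_px: float) -> float:
    """Pinhole relation: size_px = focal_px * true_size_m / distance."""
    return focal_px * true_size_m / size_px

# Example: a 5 cm glyph spanning 80 px with an assumed 600 px focal length
# sits roughly 0.375 m from the headset camera.
print(glyph_distance_m(0.05, 80.0, 600.0))
```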

Additionally or alternatively, gaming console 102 may perform initial calibration by tracking the location and/or orientation of game controller 406. Gaming console 102 displays an image prompting the user to place game controller 406 at a first fixed location on display monitor 408, such as the upper left corner of display monitor 408, and to activate a button on game controller 406. Game controller 406 then transmits the location and/or orientation of game controller 406 to gaming console 102. Gaming console 102 then displays an image prompting the user to place game controller 406 at a second fixed location on display monitor 408, such as the lower right corner of display monitor 408, and to activate a button on game controller 406. Game controller 406 then transmits the location and/or orientation of game controller 406 to gaming console 102. Based on these two locations and/or orientations, gaming console 102 determines the location, size, and orientation of display monitor 408.
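A sketch of this two-point calibration, under the simplifying assumption that the monitor is upright and axis-aligned in the headset's world frame; a real system would also use the reported orientations to recover tilt.

```python
def monitor_from_corners(upper_left, lower_right):
    """Derive monitor center and size from controller poses at two corners."""
    (x0, y0, z0), (x1, y1, z1) = upper_left, lower_right
    width = abs(x1 - x0)    # horizontal extent, in meters
    height = abs(y1 - y0)   # vertical extent, in meters
    center = ((x0 + x1) / 2, (y0 + y1) / 2, (z0 + z1) / 2)
    return center, width, height

# Example: a ~1.1 m x 0.62 m monitor centered 2 m in front of the user.
print(monitor_from_corners((-0.55, 1.61, 2.0), (0.55, 0.99, 2.0)))
```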

Additionally or alternatively, AR headset system 404 and/or gaming console 102 may include simultaneous localization and mapping (SLAM) capability. Via SLAM technology, gaming console 102 continually receives updated information from AR headset system 404 regarding the location, size, and orientation of display monitor 408 with respect to the user 402.

Additionally or alternatively, gaming console 102 may determine the location, size, and orientation of display monitor 408 via a beacon (not explicitly shown). A beacon establishes a fixed position in 3D space as an anchor for the AR objects displayed in 3D space. In general, a beacon is a device that provides an anchor point to a particular point within the physical environment of the user. The beacon transmits location information to gaming console 102 and/or AR headset system 404 over any technically feasible wired or wireless communications link. Via this location information, gaming console 102 and/or AR headset system 404 track and anchor computer-generated virtual objects relative to the location of the beacon.

FIG. 5 illustrates an exemplary computer game generated by the system 100 of FIG. 1, according to various embodiments of the present invention. As shown, a user 502 views a display monitor 508 while wearing an AR headset system 504 and holding a game controller 506. The display monitor 508 displays an image 510 of an on-screen video game.

The user 502 manipulates game controller 506 to move a character from a first position 512 within the bounds of display monitor 508 to a second position 514 above display monitor 508. In response, AR headset system 504 renders and displays the character 512 as an AR object that appears above display monitor 508. Additionally, AR headset system 504 renders and displays additional game tiles 520, 522, and 524 as AR objects that appear above display monitor 508. Additionally or alternatively, AR headset system 504 may render and display additional AR objects to the left or right of display monitor 508 and/or below display monitor 508. Additionally or alternatively, AR headset system 504 may render and display additional AR objects on other walls within the physical environment, such as walls to the left or right of the user 502 and/or behind the user 502. In this manner, the user 502 plays the game via virtual objects displayed on various walls in the physical environment in addition to playing the game via display monitor 508.

In some embodiments, gaming console 102 renders and displays other additional elements that lead to hidden objects outside of the bounds of display monitor 508. In one example, display monitor 508 could show a portion of a vine that appears to grow vertically out of the top of display monitor 508. The user 502 could manipulate game controller 506 and/or AR headset system 504 to move character 512 in position to climb the vine. As the character 512 exits the top of display screen 508, AR headset system 504 could render the character 512 and a portion of the vine as AR objects that the user 502 sees above display monitor 508.

In another example, display monitor 508 could show a portion of a pipe that appears to end at the left or right of display monitor 508. The user 502 could manipulate game controller 506 and/or AR headset system 504 to move character 512 in position to crawl along the pipe. As the character 512 exits the left or right of display screen 508, AR headset system 504 could render the character 512 and a portion of the pipe as AR objects that the user 502 sees to the left or right of display monitor 508.

FIG. 6 illustrates another exemplary computer game generated by the system 100 of FIG. 1, according to various embodiments of the present invention. As shown, a user 602 views a display monitor 608 while wearing an AR headset system 604 and holding a game controller 606. The display monitor 608 displays an image of an on-screen video game.

The user 602 manipulates game controller 606 to guide starship 610 during gameplay. Enemy starships 620, 622, and 624 fire laser weapons directed at the user 602 and/or at starship 610. The user 602 manipulates game controller 606 so that starship 610 avoids being hit by the laser beams from the laser weapons that enter the bounds of display monitor 608. Further, the user 602 may physically move within the physical environment to avoid being hit by the laser weapons.

AR headset system 604 renders and displays enemy starships 620, 622, and 624. AR headset system 604 further renders and displays AR objects related to the portion of the laser beams fired by the laser weapons of enemy starships 620, 622, and 624 that appear in 3D space. Similarly, gaming console 102 renders and displays the portion of the laser beams fired by the laser weapons of enemy starships 620, 622, and 624 that enter the bounds of display monitor 608. More specifically, the laser beam 630 fired by enemy starship 620 does not enter the bounds of display monitor 608. Therefore, AR headset system 604 renders the entirety of laser beam 630 as an AR object. By contrast, the laser beam fired by enemy starship 622 enters the bounds of display monitor 608. Therefore, AR headset system 604 renders the portion of laser beam 632 that appears in 3D space as an AR object. Gaming console 102 renders and displays the portion of laser beam 634 that enters the bounds of display monitor 608 on display monitor 608. Similarly, the laser beam fired by enemy starship 624 enters the bounds of display monitor 608. Therefore, AR headset system 604 renders the portion of laser beam 636 that appears in 3D space as an AR object. Gaming console 102 renders and displays the portion of laser beam 638 that enters the bounds of display monitor 608 on display monitor 608.

FIG. 7 illustrates yet another exemplary computer game generated by the system 100 of FIG. 1, according to various other embodiments of the present invention. As shown, a user 702 views a display monitor 708 while wearing an AR headset system 704 and holding a game controller 706. The display monitor 708 displays an image of an on-screen video game.

Via an event within the executing game, control inputs received from AR headset system 704, and/or control inputs received from game controller 706, gaming console 102 determines that a character 710 is about to leave the bounds of display monitor 708 and lunge at the user 702. After the character 710 leaves the bounds of display monitor 708 completely, gaming console 102 terminates operations related to rendering and displaying the character 710, and AR headset system 704 renders and displays the character 710 as an AR object. During the transition phase as the character 710 is in the process of leaving the bounds of display monitor 708, gaming console 102 determines the portion 712 of the character 710 that is still within the bounds of display monitor 708 and the portion 714 of the character 710 that appears in 3D space. Gaming console 102 renders and displays the portion 712 of the character 710 that is still within the bounds of display monitor 708 as part of the image on display monitor 708. Gaming console 102 transmits position data, orientation data, and other information related to the portion 714 of the character 710 that appears in 3D space to AR headset system 704. In response, AR headset system 704 renders and displays the portion 714 of the character 710 that appears in 3D space as an AR object. In some embodiments, gaming console 102 and/or AR headset system 704 renders other objects, such as flying particles of glass or other material, so that the character 710 appears to exit a portal or window defined by display monitor 708 into the physical environment of the user 702.

FIGS. 8A-8C illustrate another exemplary computer game generated by the system 100 of FIG. 1, according to various other embodiments of the present invention.

As shown in FIG. 8A, a user 802 views a display monitor 808 while wearing an AR headset system 804 and holding a game controller 806. The display monitor 808 displays an image of an on-screen video game. The on-screen video game includes a scale puzzle where the user 802 is tasked with placing a key 810 into a keyhole 812 to complete the current phase of the on-screen video game. As shown in the image displayed on display monitor 808, the key 810 is too large and is in the wrong orientation to fit into the keyhole 812.

As shown in FIG. 8B, the user 802 “grabs” the key 814 from display monitor 808 via game controller 806 and “pulls” the key 814 into 3D space. Gaming console 102 terminates operations related to rendering and displaying an image of the key 814 on display monitor 808. Gaming console 102 transmits location data, orientation data, and other information related to the key 814 to AR headset system 804. In response, AR headset system 804 displays the key 814 as an AR object in 3D space.

As shown in FIG. 8C, the user 802 manipulates his or her body, and/or manipulates the controls of game controller 806, to change the size, location, and orientation of the key 816. During this process, gaming console 102 continually transmits updated location data, orientation data, and other information related to the key 816 to AR headset system 804. In response, AR headset system 804 displays updated versions of the key 816 as an AR object in 3D space. When the key 816 is at the correct size, location, and orientation, the user 802 manipulates his or her body, and/or manipulates the controls of game controller 806, to insert the key 816 into the keyhole 812. Once gaming console 102 determines that the user 802 has inserted the key 816 into the keyhole 812 with the correct size, location, and orientation, the gaming console 102 determines that the user 802 has completed the puzzle.
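One plausible (assumed, not disclosed) way gaming console 102 could decide that the key has been inserted correctly is a tolerance check on position, scale, and orientation, as sketched below; the tolerance values and the single-axis angle metric are illustrative choices.

```python
import math

def key_fits(key_pos, key_scale, key_angle_rad,
             hole_pos, hole_scale, hole_angle_rad,
             pos_tol=0.02, scale_tol=0.05, angle_tol=math.radians(5)) -> bool:
    """True when the key's pose and scale are all within tolerance of the keyhole's."""
    dist = math.dist(key_pos, hole_pos)  # positional error, in meters
    # Wrap the angular difference into [-pi, pi] before comparing.
    d_angle = abs((key_angle_rad - hole_angle_rad + math.pi) % (2 * math.pi) - math.pi)
    return (dist <= pos_tol
            and abs(key_scale - hole_scale) <= scale_tol
            and d_angle <= angle_tol)

# Example: a key 1 cm off, 2% too large, and rotated 3 degrees still fits.
print(key_fits((0.01, 0.0, 0.0), 1.02, math.radians(3), (0.0, 0.0, 0.0), 1.0, 0.0))
```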

FIGS. 9A-9C set forth a flow diagram of method steps for extending on-screen gameplay via an augmented reality system, according to various embodiments of the present invention. Although the method steps are described in conjunction with the systems of FIGS. 1-8C, persons of ordinary skill in the art will understand that any system configured to perform the method steps, in any order, is within the scope of the present invention.

As shown, a method 900 begins at step 902, where extended gaming application 232 executing on gaming console 102 determines that an object is transitioning between the bounds of a display monitor and a location in 3D space in a physical environment. The object could be transitioning from being displayed on the display monitor to being displayed as an AR object in 3D space. Alternatively, the object could be transitioning from being displayed as an AR object in 3D space to being displayed on the display monitor.

At step 904, extended gaming application 232 determines a first portion of the object that is within the bounds of the display monitor and a second portion of the object that is not within the bounds of the display monitor. At step 906, extended gaming application 232 renders and displays the first portion of the object that is within the bounds of the display monitor on the display monitor. At step 908, extended gaming application 232 transmits position data, orientation data, and other information related to the second portion of the object to extended AR application 332.

At step 910, extended AR application 332 executing on AR headset system 104 receives the position data, orientation data, and other information related to the second portion of the object from extended gaming application 232. At step 912, extended AR application 332 renders and displays the second portion of the object as an AR object.

At step 914, extended gaming application 232 determines whether the transition of the object between the bounds of the display monitor and the location in 3D space in the physical environment is complete. If the transition of the object is not complete, then the method proceeds to step 904, described above. If, however, the transition of the object is complete, then the method proceeds to step 916, where extended gaming application 232 determines whether the object is transitioning from within the bounds of the display monitor to outside the bounds of the display monitor. If the object is transitioning from within the bounds of the display monitor to outside the bounds of the display monitor, then the method 900 proceeds to step 918, where extended gaming application 232 terminates operations related to rendering and displaying the object on the image displayed on the display monitor. At step 920, extended gaming application 232 transmits position data, orientation data, and other information related to the object to extended AR application 332. At step 922, extended AR application 332 receives the position data, orientation data, and other information related to the object from extended gaming application 232. At step 924, extended AR application 332 renders the object as an AR object. The method 900 then terminates.

Returning to step 916, if the object is not transitioning from within the bounds of the display monitor to outside the bounds of the display monitor, then the object is transitioning from outside the bounds of the display monitor to within the bounds of the display monitor. In such cases, the method 900 proceeds to step 926, where extended gaming application 232 renders and displays the object on the image displayed on the display monitor. At step 928, extended gaming application 232 transmits position data, orientation data, and other information related to the object to extended AR application 332. At step 930, extended AR application 332 receives the position data, orientation data, and other information related to the object from extended gaming application 232. At step 932, extended AR application 332 terminates operations related to rendering the object as an AR object. The method 900 then terminates.
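The overall control flow of method 900 can be condensed into the following Python sketch. The `console` and `ar_app` objects and their methods are hypothetical stand-ins for the behaviors performed in steps 902-932; the sketch only mirrors the branching structure described above.

```python
def run_transition(obj, console, ar_app, monitor_bounds):
    # Steps 904-914: while the object straddles the monitor edge, split it and
    # render each portion on the appropriate device.
    while not console.transition_complete(obj):                     # step 914
        on_screen, off_screen = console.split(obj, monitor_bounds)  # step 904
        console.render(on_screen)                                   # step 906
        ar_app.render(console.send_pose(off_screen))                # steps 908-912
    # Steps 916-932: hand the finished object fully to one device.
    if console.exited_monitor(obj):                                 # step 916
        console.stop_rendering(obj)                                 # step 918
        ar_app.render(console.send_pose(obj))                       # steps 920-924
    else:
        console.render(obj)                                         # step 926
        ar_app.stop_rendering(console.send_pose(obj))               # steps 928-932
```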

In sum, techniques are disclosed for generating an extended on-screen gameplay experience via augmented reality. The disclosed techniques merge conventional on-screen-based gameplay with AR to generate new immersive forms of gameplay. More specifically, the disclosed techniques combine gameplay within a conventional display monitor with an augmented reality headset system that displays AR objects via an AR headset, a projection system, a mobile device such as a smartphone or tablet, or any other AR headset system. The user plays a game on a 2D display using a game controller while wearing an augmented reality headset system, or while employing any other type of AR system. At times, the game is constrained to the bounds of the display monitor. At various points in the game, one or more objects transition between being displayed on the display monitor and being displayed as an AR object by the AR headset system. During the transition phase, the gaming console determines the portion of the object that is within the bounds of the display monitor and the portion of the object that is outside the bounds of the display monitor. The gaming console renders and displays the portion of the object that is within the bounds of the display monitor on the display monitor. The gaming console transmits position data, orientation data, and other information related to the portion of the object that is outside the bounds of the display monitor to the AR headset system. The AR headset system renders and displays the portion of the object that is outside the bounds of the display monitor as an AR object. In this manner, the AR headset system tracks 2D objects that transition from being displayed on the display monitor to being displayed in 3D physical space. The AR headset system continuously retrieves game state information related to gameplay from the gaming console and determines a physical coordinate for various game objects based on the size, aspect ratio, and position of the display screen.
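The final sentence above implies a mapping from on-screen coordinates to physical coordinates. A minimal sketch of that mapping, assuming the monitor's center, physical dimensions, and an upright, axis-aligned pose have already been obtained from the calibration described earlier:

```python
def screen_to_world(u: float, v: float, center, width_m: float, height_m: float):
    """Map normalized screen coordinates (u, v) in [0, 1]^2, with (0.5, 0.5) at
    the monitor's center, to a 3D point on the monitor's plane in world space."""
    cx, cy, cz = center
    x = cx + (u - 0.5) * width_m
    y = cy + (v - 0.5) * height_m
    # AR objects start on this plane and then move off it as they transition
    # into the physical environment.
    return (x, y, cz)
```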

At least one advantage of the disclosed techniques relative to the prior art is that a user experiences more realistic computer-generated gameplay because objects are displayed via a traditional display monitor and additional 2D and 3D AR objects are displayed via a companion AR system. As a result, the user can have a more immersive overall experience with enhanced effects, such as the effect of objects displayed via the traditional display monitor appearing to come out of the display monitor into the physical environment of the user. Similarly, the AR objects displayed via the AR system and in the physical environment of the user can appear to enter into the display monitor. In effect, the user is able to play the computer game on the traditional display monitor as well as in the physical environment where the user is physically present, thereby creating a more fully immersive gameplay experience relative to the prior art. These advantages represent one or more technological improvements over prior art approaches.

1. In some embodiments, a computer-implemented method for extending on-screen gameplay via an augmented reality system includes receiving, via a game controller, first data associated with a first object associated with a computer-generated game; rendering an augmented reality object based on the first data associated with the first object; and displaying at least a first portion of the augmented reality object via an augmented reality headset system, wherein an image associated with the computer-generated game is simultaneously rendered on a display monitor.

2. The computer-implemented method according to clause 1, wherein the image associated with the computer-generated game includes at least a second portion of the first object that is different than the first portion of the augmented reality object.

3. The computer-implemented method according to clause 1 or clause 2, further comprising: receiving an indication that the first object is displayed fully within the display monitor; and in response, terminating operations related to rendering and displaying the first portion of the augmented reality object via the augmented reality headset system.

4. The computer-implemented method according to any of clauses 1-3, further comprising: receiving an indication that no portion of the first object is being displayed within the display monitor; and displaying fully the augmented reality object via the augmented reality headset system.

5. The computer-implemented method according to any of clauses 1-4, further comprising: receiving, via the game controller, second data associated with the first object; rendering the augmented reality object based on the second data associated with the first object; and displaying at least a second portion of the augmented reality object via the augmented reality headset system.

6. The computer-implemented method according to any of clauses 1-5, wherein the data associated with the first object is based on data received from the augmented reality headset system.

7. The computer-implemented method according to any of clauses 1-6, wherein the data associated with the first object is based on data received from the game controller.

8. The computer-implemented method according to any of clauses 1-7, further comprising: tracking at least one of a location and an orientation of the augmented reality headset system; and transmitting the at least one of the location and the orientation to the gaming console.

9. The computer-implemented method according to any of clauses 1-8, further comprising: tracking at least one of a location and an orientation of a second object included in an image captured by a camera associated with the augmented reality headset system; and transmitting the at least one of the location and the orientation to the gaming console.

10. In some embodiments, a non-transitory computer-readable medium includes instructions that, when executed by a processor, cause the processor to perform the steps of: receiving first data associated with a first object associated with a computer-generated game; rendering an augmented reality object based on the first data associated with the first object; and displaying at least a first portion of the augmented reality object via an augmented reality headset system, wherein a first image associated with the computer-generated game is simultaneously rendered on a display monitor.

11. The non-transitory computer-readable medium according to clause 10, wherein the first image associated with the computer-generated game includes at least a second portion of the first object that is different than the first portion of the augmented reality object.

12. The non-transitory computer-readable medium according to clause 10 or clause 11, further comprising: receiving an indication that the first object is displayed fully within the display monitor; and in response, terminating operations related to rendering and displaying the first portion of the augmented reality object via the augmented reality headset system.

13. The non-transitory computer-readable medium according to any of clauses 10-12, further comprising: receiving an indication that no portion of the first object is being displayed within the display monitor; and displaying fully the augmented reality object via the augmented reality headset system.

14. The non-transitory computer-readable medium according to any of clauses 10-13, further comprising: receiving, via the game controller, second data associated with the first object; rendering the augmented reality object based on the second data associated with the first object; and displaying at least a second portion of the augmented reality object via the augmented reality headset system.

15. The non-transitory computer-readable medium according to any of clauses 10-14, further comprising: detecting a glyph included in a second image rendered on the display monitor; determining one or more of a size and a location of the glyph based on the second image; and in response, determining one or more of a size and a location of the display monitor.

16. The non-transitory computer-readable medium according to any of clauses 10-15, wherein a physical screen size of the display monitor is included as data encoded in the glyph, and further comprising decoding the data encoded in the glyph to determine the physical screen size of the display monitor.

17. The non-transitory computer-readable medium according to any of clauses 10-16, wherein an original size of the glyph is included as data encoded in the glyph, and further comprising decoding the data encoded in the glyph to determine the original size of the glyph.

18. In some embodiments, a system includes: an augmented reality headset system, comprising: a memory that includes instructions, and a processor that is coupled to the memory and, when executing the instructions, is configured to: receive first data associated with a first object associated with a computer-generated game, render an augmented reality object based on the first data associated with the first object, and display at least a first portion of the augmented reality object via an augmented reality headset system; and a display monitor configured to simultaneously render an image associated with the computer-generated game.

19. The system according to clause 18, further comprising: a game controller configured to: detect one or more of a location and an orientation associated with the game controller; and transmit the first data associated with the first object to the augmented reality headset system, wherein the first data includes the one or more of a location and an orientation.

20. The system according to clause 18 or clause 19, further comprising: a game controller configured to: detect an activation of a control associated with the game controller; and transmit the first data associated with the first object to the augmented reality headset system, wherein the first data includes an indication of the activation of the control.

Any and all combinations of any of the claim elements recited in any of the claims and/or any elements described in this application, in any fashion, fall within the contemplated scope of the present invention and protection.

The descriptions of the various embodiments have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments.

Aspects of the present embodiments may be embodied as a system, method or computer program product. Accordingly, aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “module” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.

Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.

Aspects of the present disclosure are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, enable the implementation of the functions/acts specified in the flowchart and/or block diagram block or blocks. Such processors may be, without limitation, general purpose processors, special-purpose processors, application-specific processors, or field-programmable gate arrays.

The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

While the preceding is directed to embodiments of the present disclosure, other and further embodiments of the disclosure may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.

Claims

  1. A computer-implemented method for extending on-screen gameplay via an augmented reality system, the method comprising: receiving, via a game controller, first data associated with a first object associated with a computer-generated game; rendering an augmented reality object based on the first data associated with the first object; and displaying at least a first portion of the augmented reality object via an augmented reality headset system, wherein an image associated with the computer-generated game is simultaneously rendered on a display monitor in response to the first object transitioning from being located at a first location within the display of the augmented reality headset system to being located at a second location within the display of the augmented reality headset system, wherein the first location is outside a visible area of the display monitor, and the second location is inside the visible area of the display monitor, and wherein the image depicts at least a portion of the first object.
  2. The computer-implemented method of claim 1, wherein the portion of the first object is different than the first portion of the augmented reality object.
  3. The computer-implemented method of claim 2, further comprising: receiving an indication that the first object is displayed fully within the display monitor; and in response, terminating operations related to rendering and displaying the first portion of the augmented reality object via the augmented reality headset system.
  4. The computer-implemented method of claim 2, further comprising: receiving an indication that no portion of the first object is being displayed within the display monitor; and displaying the augmented reality object fully via the augmented reality headset system.
  5. The computer-implemented method of claim 1, further comprising: receiving, via the game controller, second data associated with the first object; rendering the augmented reality object based on the second data associated with the first object; and displaying at least a second portion of the augmented reality object via the augmented reality headset system.
  6. The computer-implemented method of claim 1, wherein the first data associated with the first object is based on data received from the augmented reality headset system.
  7. The computer-implemented method of claim 1, wherein the first data associated with the first object is based on data received from the game controller.
  8. The computer-implemented method of claim 1, further comprising: tracking at least one of a location and an orientation of the augmented reality headset system; and transmitting the at least one of the location and the orientation to a gaming console.
  9. The computer-implemented method of claim 1, further comprising: tracking at least one of a location and an orientation of a second object included in an image captured by a camera associated with the augmented reality headset system; and transmitting the at least one of the location and the orientation to a gaming console.
  10. A non-transitory computer-readable medium including instructions that, when executed by a processor, cause the processor to perform the steps of: receiving first data associated with a first object associated with a computer-generated game; rendering an augmented reality object based on the first data associated with the first object; and displaying at least a first portion of the augmented reality object via an augmented reality headset system, wherein a first image associated with the computer-generated game is simultaneously rendered on a display monitor in response to the first object transitioning from being located at a first location within the display of the augmented reality headset system to being located at a second location within the display of the augmented reality headset system, wherein the first location is outside a visible area of the display monitor, and the second location is inside the visible area of the display monitor, and wherein the first image depicts at least a portion of the first object.
  11. The non-transitory computer-readable medium of claim 10, wherein the portion of the first object is different than the first portion of the augmented reality object.
  12. The non-transitory computer-readable medium of claim 11, further comprising: receiving an indication that the first object is displayed fully within the display monitor; and in response, terminating operations related to rendering and displaying the first portion of the augmented reality object via the augmented reality headset system.
  13. The non-transitory computer-readable medium of claim 11, further comprising: receiving an indication that no portion of the first object is being displayed within the display monitor; and displaying the augmented reality object fully via the augmented reality headset system.
  14. The non-transitory computer-readable medium of claim 10, further comprising: receiving, via a game controller, second data associated with the first object; rendering the augmented reality object based on the second data associated with the first object; and displaying at least a second portion of the augmented reality object via the augmented reality headset system.
  15. The non-transitory computer-readable medium of claim 10, further comprising: detecting a glyph included in a second image rendered on the display monitor; determining a size or a location of the glyph based on the second image; and in response, determining a size or a location of the display monitor.
  16. The non-transitory computer-readable medium of claim 15, wherein a physical screen size of the display monitor is included as data encoded in the glyph, and further comprising decoding the data encoded in the glyph to determine the physical screen size of the display monitor.
  17. The non-transitory computer-readable medium of claim 15, wherein an original size of the glyph is included as data encoded in the glyph, and further comprising decoding the data encoded in the glyph to determine the original size of the glyph.
  18. A system, comprising: an augmented reality headset system, comprising: a memory that includes instructions, and a processor that is coupled to the memory and, when executing the instructions, is configured to: receive first data associated with a first object associated with a computer-generated game, render an augmented reality object based on the first data associated with the first object, and display at least a first portion of the augmented reality object via the augmented reality headset system; and a display monitor configured to simultaneously render an image associated with the computer-generated game in response to the first object transitioning from being located at a first location within the display of the augmented reality headset system to being located at a second location within the display of the augmented reality headset system, wherein the first location is outside a visible area of the display monitor, and the second location is inside the visible area of the display monitor, and wherein the image depicts at least a portion of the first object.
  19. The system of claim 18, further comprising: a game controller configured to: detect a location or an orientation associated with the game controller; and transmit the first data associated with the first object to the augmented reality headset system, wherein the first data includes the location or the orientation associated with the game controller.
  20. The system of claim 18, further comprising: a game controller configured to: detect an activation of a control associated with the game controller; and transmit the first data associated with the first object to the augmented reality headset system, wherein the first data includes an indication of the activation of the control.
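
Purely as a non-authoritative illustration of claims 1, 3, and 4, the following Python sketch shows one way the per-frame handoff between the headset and the display monitor could be decided once the monitor's visible area is known in the headset display's coordinate space. The Rect and plan_rendering names, the axis-aligned geometry, and the coordinate convention are assumptions made for illustration only.

    from dataclasses import dataclass

    @dataclass
    class Rect:
        """Axis-aligned rectangle in headset display coordinates (hypothetical)."""
        left: float
        top: float
        right: float
        bottom: float

        def contains_point(self, x: float, y: float) -> bool:
            return self.left <= x <= self.right and self.top <= y <= self.bottom

        def overlaps(self, other: "Rect") -> bool:
            return not (other.right < self.left or other.left > self.right
                        or other.bottom < self.top or other.top > self.bottom)

        def contains_rect(self, other: "Rect") -> bool:
            return (self.contains_point(other.left, other.top)
                    and self.contains_point(other.right, other.bottom))

    def plan_rendering(object_bounds: Rect, monitor_bounds: Rect) -> dict:
        """Decide which device(s) draw the object this frame."""
        if monitor_bounds.contains_rect(object_bounds):
            return {"headset": False, "monitor": True}   # fully on-screen (claim 3)
        if monitor_bounds.overlaps(object_bounds):
            return {"headset": True, "monitor": True}    # straddles the bezel (claim 1)
        return {"headset": True, "monitor": False}       # fully off-screen (claim 4)

    # Example: an object straddling the monitor's right bezel is split
    # between the two devices.
    monitor = Rect(left=0.0, top=0.0, right=1920.0, bottom=1080.0)
    obj = Rect(left=1800.0, top=400.0, right=2100.0, bottom=700.0)
    assert plan_rendering(obj, monitor) == {"headset": True, "monitor": True}

In the same hedged spirit, the monitor bounds used above could be estimated from the glyph mechanism of claims 15 through 17. The sketch below uses a deliberately simplified, fronto-parallel proportional model expressed in monitor pixels rather than physical units; the function name, parameters, and model are assumptions, not the disclosed method.

    def estimate_monitor_bounds(glyph_detected: Rect,
                                glyph_original_px: float,
                                screen_w_px: float,
                                screen_h_px: float,
                                glyph_left_px: float,
                                glyph_top_px: float) -> Rect:
        """Scale the monitor's native geometry by the glyph's apparent size.

        glyph_detected: glyph footprint in headset display coordinates.
        glyph_original_px: glyph edge length in monitor pixels, decoded from
            the glyph itself (in the spirit of claim 17).
        screen_w_px / screen_h_px: the monitor's screen dimensions, decoded
            from the glyph (in the spirit of claim 16).
        glyph_left_px / glyph_top_px: where the game renders the glyph.
        """
        scale = (glyph_detected.right - glyph_detected.left) / glyph_original_px
        left = glyph_detected.left - glyph_left_px * scale
        top = glyph_detected.top - glyph_top_px * scale
        return Rect(left=left, top=top,
                    right=left + screen_w_px * scale,
                    bottom=top + screen_h_px * scale)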
