U.S. Pat. No. 9,089,775
INTERACTIVE GAME SYSTEM AND METHODS FOR A TELEVISION AUDIENCE MEMBER TO MIMIC PHYSICAL MOVEMENTS OCCURRING IN TELEVISION BROADCAST CONTENT
Assignee: ISAAC DANIEL INVENTORSHIP GROUP, LLC
Issue Date: June 24, 2011
Abstract
A system comprising at least one processor; and computer executable instructions readable by the at least one processor and operative to: use at least one sensor to sense at least one first physical movement of at least one first object; and compare the at least one first physical movement of at least one first object with at least one second physical movement of at least one second object in at least one piece of content.
Description
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
System Level Overview
With reference to the drawings, FIG. 1A shows an embodiment of system 100, wherein system 100 comprises at least one processor 102, and computer executable instructions (not shown) readable by the at least one processor and operative to use at least one sensor (shown as 104 in FIGS. 1B-1D) to sense at least one first physical movement of at least one first object (shown as 106 in FIGS. 1C and 1D) and compare the at least one first physical movement of the at least one first object 106 to at least one second physical movement of at least one second physical object (shown as 108 in FIGS. 1C and 1D) in at least one piece of content (shown as 110 in FIGS. 1C and 1D).
In some embodiments, at least one processor 102 may be any type of processor, including, but not limited to, a single core processor, a multi core processor, a video processor, and the like.
In some embodiments, at least one sensor 104 may be any type of sensor, including, but not limited to, a camera, an infrared camera, a panoramic sensor, a stereo sensor, a three dimensional sensor and/or camera (“3D sensor”), a thermal imaging camera, a video sensor, a digital camera, and the like. At least one sensor 104 may include a light source, such as an infrared light source, a laser, or a flash, which may be used to illuminate the objects in the sensor's field of view. In preferred embodiments, at least one sensor 104 may include a field of view that encompasses the same field of view as, or larger than, a device (shown as 112 in FIGS. 1C and 1D), such as a display device. In a further embodiment, at least one sensor 104 may include a one hundred and eighty degree field of view, such as by including an ultra wide angle lens. In embodiments where at least one sensor 104 comprises a 3D sensor and/or camera, at least one sensor 104 may be any type of 3D sensor and/or camera, such as a time of flight camera, a structured light camera, a modulated light camera, a triangulation camera, and the like, including, but not limited to, those cameras developed and manufactured by PMDTechnologies, GmbH, Am Eichenhang 50, D-57076 Siegen, Germany; Canesta, Inc., 1156 Sonora Court, Sunnyvale, Calif., 94086, USA; Optrima, NV, Witherenstraat 4, 1040 Brussels, Belgium; Primesense of Israel; and the Bidirectional Screen developed by the Massachusetts Institute of Technology. In an alternate embodiment, at least one sensor 104 may comprise an accelerometer or gyroscopic sensor.
The computer executable instructions may be loaded directly on at least one processor 102, or may be stored in a storage means, such as, but not limited to, computer readable media, such as, but not limited to, a hard drive, a solid state drive, a flash memory, random access memory, CD-ROM, CD-R, CD-RW, DVD-ROM, DVD-R, DVD-RW, and the like. The computer executable instructions may be any type of computer executable instructions, which may be in the form of a computer program, the program being composed in any suitable programming language or source code, such as C++, C, JAVA, JavaScript, HTML, XML, and other programming languages.
In a further embodiment, at least one processor 102 may be electronically connected to at least one sensor 104, as shown in FIG. 1B. The terms “electronically connected,” “electronic connection,” and the like, as used throughout the present disclosure, are intended to describe any type of electronic connection or electronic communication, such as, but not limited to, a physically connected or wired electronic connection and/or a wireless electronic connection, such as a direct connection, or a connection through a network, such as the internet.
In some embodiments, at least one piece of content 110 may be any type of content, such as, but not limited to, sports content, movie content, live content, music content, arts content, news content, television content, game content, and the like. At least one piece of content 110 may be played on a device 112, such as a display device. The display device may be any type of display device, such as, but not limited to, a television, a computer monitor, a laptop screen, a mobile phone screen, a projector, a projection screen, or any other type of screen and/or display device.
In yet a further embodiment, at least one sensor 104 may be positioned near device 112, such as, but not limited to, on top of or above device 112, in front of device 112, behind device 112, below device 112, to the side of device 112, or integrated with device 112, such as being built into device 112 (as shown in FIG. 1E). In an alternate embodiment, at least one sensor 104 may be positioned near at least one first object 106, such as, but not limited to, on top of or above at least one first object 106, in front of at least one first object 106, behind at least one first object 106, below at least one first object 106, or to the side of at least one first object 106. In yet another embodiment, at least one sensor 104 may be integrated, connected, or attached to at least one first object 106.
In some embodiments, at least one first object 106 may be an animate object, such as, but not limited to, a person (as shown in FIG. 1C) or an animal. In an alternate embodiment, at least one first object 106 may be an inanimate object, such as a game controller, or a user instrument, such as, but not limited to, musical instruments, tools, household objects, and sports equipment, such as, but not limited to, a stick, a baseball bat, a ball, a racket, a club, a paddle, a ring, and the like. In a further embodiment, at least one first object 106 may comprise both animate and inanimate objects, such as, but not limited to, a person holding an instrument, such as a piece of sports equipment (as shown in FIG. 1D).
Accordingly, in various embodiments, at least one first physical movement of at least one first object 106 may be any type of movement, including but not limited to, a dance movement, sports movement, such as, but not limited to, a throwing movement, a swinging movement, a jumping movement, a running movement, a diving movement, a flying movement, a shooting movement, a driving movement, a dancing movement, and the like, or any arbitrary movement.
In some embodiments, using at least one sensor to sense at least one first physical movement of at least one first object 106 comprises tracking at least one first physical movement of at least one first object 106. In a further embodiment, the at least one first movement may be stored and/or transmitted to a central station or a remote station.
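The disclosure does not specify a data representation for a tracked movement. As a minimal sketch of how the sampled movement above might be stored before being transmitted to a central or remote station (all names here, such as `MovementSample` and `Movement`, are hypothetical and not part of the patent):

```python
from dataclasses import dataclass, field

@dataclass
class MovementSample:
    """One sensor reading: a timestamp (seconds) and a 3D position."""
    t: float
    x: float
    y: float
    z: float

@dataclass
class Movement:
    """A tracked physical movement: an ordered series of samples."""
    samples: list = field(default_factory=list)

    def record(self, t, x, y, z):
        """Append one reading taken from the sensor."""
        self.samples.append(MovementSample(t, x, y, z))

    def duration(self):
        """Elapsed time between the first and last samples."""
        if len(self.samples) < 2:
            return 0.0
        return self.samples[-1].t - self.samples[0].t

# Illustrative readings from a short swinging movement:
swing = Movement()
swing.record(0.0, 0.0, 1.0, 0.0)
swing.record(0.1, 0.2, 1.1, 0.0)
swing.record(0.2, 0.5, 1.2, 0.1)
```

A structure like this could then be serialized for storage or for transmission to the central or remote station mentioned above.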
In some embodiments, at least one second object 108 may be an animate object, such as, but not limited to, a person (as shown in FIG. 1C) or an animal. In an alternate embodiment, at least one second object 108 may be an inanimate object, such as a game controller, or a user instrument, such as, but not limited to, musical instruments, tools, household objects, and sports equipment, such as, but not limited to, a stick, a baseball bat, a ball, a racket, a club, a paddle, a ring, and the like. In a further embodiment, at least one second object 108 may comprise both animate and inanimate objects, such as, but not limited to, a person holding an instrument, such as a piece of sports equipment (as shown in FIG. 1D).
Accordingly, in various embodiments, at least one second physical movement of at least one second object 108 may be any type of movement, including but not limited to, a dance movement, sports movement, such as, but not limited to, a throwing movement, a swinging movement, a jumping movement, a flying movement, a running movement, a diving movement, a shooting movement, a driving movement, a dancing movement, and the like, or any arbitrary movement.
In some embodiments, the second physical movement may be the physical movement of at least one second object 108 in at least one piece of content 110. Accordingly, it may be the movement of a person, such as a sports participant, or the object of a sport, such as a ball.
In some embodiments, using at least one sensor to sense at least one second physical movement of at least one second object 108 comprises tracking at least one second physical movement of at least one second object 108. In a further embodiment, the at least one second movement may be stored and/or transmitted to a central station or a remote station.
In yet another embodiment, the computer executable instructions may be further operative to determine whether the at least one first physical movement of at least one first object 106 and the at least one second physical movement of at least one second object 108 in at least one piece of content 110 coincide. In one embodiment, determining whether the at least one first physical movement and the at least one second physical movement coincide comprises determining whether the at least one first physical movement and the at least one second physical movement are similar.
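The disclosure leaves the comparison method open. One hedged way to decide whether two movements "are similar" is to compare corresponding sampled positions and apply a distance threshold; the metric and threshold below are illustrative assumptions, not taken from the patent:

```python
import math

def movements_coincide(first, second, threshold=0.5):
    """Return True when the mean point-to-point distance between two
    movements, each a list of (x, y, z) positions sampled at the same
    rate, falls below the threshold.  Metric and threshold are
    illustrative choices, not taken from the disclosure."""
    n = min(len(first), len(second))
    if n == 0:
        return False
    mean_dist = sum(math.dist(a, b) for a, b in zip(first[:n], second[:n])) / n
    return mean_dist <= threshold

# A viewer's swing compared against the swing shown in the content:
broadcast = [(0.0, 1.0, 0.0), (0.2, 1.1, 0.0), (0.5, 1.2, 0.1)]
viewer    = [(0.1, 1.0, 0.0), (0.3, 1.1, 0.0), (0.5, 1.3, 0.1)]
```

A per-joint or dynamic-time-warping comparison would be a natural refinement when the two movements are sampled at different rates.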
The term “coincide,” as used herein, refers to the relationship of the at least one first physical movement and the at least one second physical movement, and may imply that the two movements match, correspond, or are related in a way so as to be complementary to each other or dovetail with each other. One such example is the physical movement of a ball in a sport, which coincides with the physical movement of a player or piece of equipment attempting to touch, strike, or block the ball. Another example is the physical movement of a player acting in a sport, such as by throwing a ball, running, and the like, which coincides with the physical movement of an audience member attempting to mimic, imitate, follow, or counteract the player's movements. The present disclosure should not be limited to physical movements in sports, but may be applied to other fields as well, such as the arts, including dance, and the like, and exercise, movies, or television programs.
In embodiments where at least one first object 106 is a person, determining whether the two physical movements coincide may include comparing the movement of a person watching a sports event to the movement of a person or object playing in the sports event. In some embodiments, this may include comparing whether a person watching the content reacts appropriately to the activity going on in the content, such as in the case of a baseball game, where the person watching the baseball game may be attempting to imitate the swing of a baseball player playing in the baseball game, or may be attempting to make a movement so as to virtually “hit” a ball that is being pitched in the game that is being played as content 110, or to throw a ball to a receiver in a football game. In other embodiments, the content 110 may be non-sport content, such as a movie, whereby a person watching the movie may be attempting to react in a certain way to the movie, such as by jumping up and down or making a running movement when such movement is called for in the movie.
In a further embodiment, the computer executable instructions may be operative to conduct a game based on whether the at least one first physical movement of at least one first object 106 and the at least one second physical movement of the at least one second object 108 in at least one piece of content 110 coincide. In some embodiments, the game may include determining a score based on whether the at least one first physical movement of at least one first object 106 and the at least one second physical movement of at least one second object 108 in at least one piece of content coincide. In an exemplary embodiment, when it is determined that the at least one first physical movement of at least one first object 106 and the at least one second physical movement of the at least one second object 108 in at least one piece of content 110 coincide, a user may be allotted a certain amount of points. The amount of points may vary based on how well the movements coincide.
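The text says only that "the amount of points may vary based on how well the movements coincide." One way this could be realized, under assumed values (the 100-point scale and the tolerance are invented for illustration), is a score that falls off linearly with the mean deviation between the two movements:

```python
import math

def score_attempt(first, second, max_points=100, tolerance=1.0):
    """Allot points for how well the viewer's movement (first) matches
    the movement in the content (second): full points for a perfect
    match, falling linearly to zero as the mean deviation approaches
    the tolerance.  Scale and tolerance are illustrative assumptions."""
    n = min(len(first), len(second))
    if n == 0:
        return 0
    mean_dev = sum(math.dist(a, b) for a, b in zip(first[:n], second[:n])) / n
    closeness = max(0.0, 1.0 - mean_dev / tolerance)
    return round(max_points * closeness)

# Illustrative: a perfect imitation of a two-sample swing.
perfect = [(0.0, 1.0, 0.0), (0.5, 1.2, 0.1)]
```

The same function could run at a central station, ranking many audience members against the same broadcast movement.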
In yet another embodiment, the computer executable instructions may be further operative to transmit or receive information. The information may include information about content 110, such as information relating to the physical movements within content 110, such as the at least one second physical movement of at least one second object 108 in content 110. In some embodiments, the information may include information relating to at least one first object 106, such as, but not limited to, the physical movements of at least one first object 106. In another embodiment, the information may include game information, such as the score of the game being played by at least one object 106.
In some embodiments, the computer executable instructions may be operative to transmit or receive information to or from a central station, such as a server or a television station, which may comprise a system similar to system 100. In another embodiment, the computer executable instructions may be operative to transmit or receive information to or from a remote station, such as a user system, which may comprise a system similar to system 100.
In another embodiment, the computer executable instructions may be operative to use a communications means for transmitting or receiving information, wherein the communications means may comprise a wired communications means, such as a land line modem, cable modem, DSL modem, and the like, or a wireless communications means, such as a wireless modem, a GSM modem, a satellite modem, a Wi-Fi adapter, and the like.
In some embodiments, system 100 may be integrated with or may be a component of a user system or a central station, such as a computer, a gaming console, a set top box, a display device, or a server.
In another embodiment, at least one processor 102 may be electronically connected to at least one sensor 104. In yet another embodiment, at least one processor 102 may be positioned near, such as in the same location as, at least one of at least one sensor 104 and at least one object 106. In yet another embodiment, at least one processor 102 may be integrated with at least one sensor 104. In yet another embodiment, at least one processor 102 may be positioned at a location remote from at least one of at least one sensor 104 and at least one object 106, such as, but not limited to, by being located at a central station, such as a television station or a server, such as a game server.
Referring now to FIGS. 2A and 2B, an embodiment of system 200 is shown, wherein system 200 comprises at least one processor 202, and computer executable instructions (not shown) readable by the at least one processor and operative to transmit or receive information about at least one physical movement of at least one object 204 in at least one piece of content 206.
In some embodiments, at least one processor 202 may be any type of processor, including, but not limited to, a single core processor, a multi core processor, a video processor, and the like.
The computer executable instructions may be loaded directly on at least one processor 202, or may be stored in a storage means, such as, but not limited to, computer readable media, such as, but not limited to, a hard drive, a solid state drive, a flash memory, random access memory, CD-ROM, CD-R, CD-RW, DVD-ROM, DVD-R, DVD-RW, and the like. The computer executable instructions may be any type of computer executable instructions, which may be in the form of a computer program, the program being composed in any suitable programming language or source code, such as C++, C, JAVA, JavaScript, HTML, XML, and other programming languages.
Accordingly, in various embodiments, the at least one physical movement of at least one object 204 may be any type of movement, such as the movement of a person or an inanimate object, including but not limited to, a dance movement, sports movement, such as, but not limited to, a throwing movement, a swinging movement, a jumping movement, a flying movement, a running movement, a diving movement, a shooting movement, a driving movement, a dancing movement, and the like, or any arbitrary movement.
In some embodiments, at least one piece of content 206 may be any type of content, such as, but not limited to, sports content, movie content, live content, music content, arts content, news content, television content, game content, and the like. At least one piece of content 206 may be played on a device, such as a display device. The display device may be any type of display device, such as, but not limited to, a television, a computer monitor, a laptop screen, a mobile phone screen, a projector, a projection screen, or any other type of screen and/or display device.
In some embodiments, the information about the at least one physical movement of at least one object 204 may be extracted from content 206 using object recognition software and/or gesture control software, such as those developed by Softkinetic S.A., 24 Avenue L. Mommaerts, Brussels, B-1140, Belgium, and Microsoft Corp. In another embodiment, information about the at least one physical movement of at least one object 204 may be provided by a user. In some embodiments, the at least one physical movement of at least one object 204 may be captured using at least one sensor, which may be similar to at least one sensor 104 described above with reference to FIGS. 1A through 1E.
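The object recognition and gesture control software named above is proprietary. As a crude, hedged stand-in for how movement information might be extracted from content, the sketch below tracks the centroid of the nonzero pixels across binary video frames; all of it is illustrative and none of it comes from the disclosure:

```python
def frame_centroid(frame):
    """Centroid (row, col) of the nonzero pixels in one binary frame,
    or None when the frame is empty."""
    pts = [(r, c)
           for r, row in enumerate(frame)
           for c, v in enumerate(row) if v]
    if not pts:
        return None
    n = len(pts)
    return (sum(r for r, _ in pts) / n, sum(c for _, c in pts) / n)

def extract_movement(frames):
    """Per-frame centroids: a rough trajectory of the moving object."""
    return [frame_centroid(f) for f in frames]

# A single-pixel "object" moving one column to the right per frame:
frames = [
    [[1, 0, 0], [0, 0, 0], [0, 0, 0]],
    [[0, 1, 0], [0, 0, 0], [0, 0, 0]],
    [[0, 0, 1], [0, 0, 0], [0, 0, 0]],
]
```

Real extraction would operate on color video and segment the object first; the centroid step shown here only illustrates turning frames into a trajectory.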
In another embodiment, the computer executable instructions may be operative to transmit or receive information about at least one physical movement of at least one object 204 in at least one piece of content 206 to or from at least one central station, such as a television station, a server, or a user monitoring station, or at least one remote station, such as a user system, which may comprise a system similar to system 100, described above with reference to FIGS. 1A through 1E.
Accordingly, in one embodiment, the computer executable instructions are operative to transmit or receive information to or from at least one remote station or at least one central station about at least one physical movement of at least one object sensed by at least one sensor. In yet a further embodiment, the computer executable instructions may be operative to compare the information transmitted or received from the at least one remote station or at least one central station with information about at least one second physical movement of at least one second object. In yet another embodiment, the computer executable instructions may be operative to determine a score based on the information transmitted or received from the at least one remote station or at least one central station.
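The wire format for the transmitted information is not specified in the text. Assuming a JSON payload (an invented convention, purely for illustration, as are the function and field names), the exchange with a central or remote station might look like:

```python
import json

def encode_movement_info(station_id, samples):
    """Package movement information for transmission; the field names
    are an assumed convention, not taken from the disclosure."""
    return json.dumps({"station": station_id,
                       "samples": [list(s) for s in samples]})

def decode_movement_info(payload):
    """Recover the station identifier and the movement samples."""
    msg = json.loads(payload)
    return msg["station"], [tuple(s) for s in msg["samples"]]

# Round trip, as a remote station might receive it:
payload = encode_movement_info("viewer-1", [(0.0, 1.0, 0.0), (0.2, 1.1, 0.0)])
station, samples = decode_movement_info(payload)
```

Once decoded, the samples could be fed into whatever comparison and scoring logic the receiving station runs.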
In another embodiment, the computer executable instructions may be operative to use a communications means for transmitting or receiving information, wherein the communications means may comprise a wired communications means, such as a land line modem, cable modem, DSL modem, and the like, or a wireless communications means, such as a wireless modem, a GSM modem, a satellite modem, a Wi-Fi adapter, and the like.
Referring now to FIG. 3, an embodiment of system 300 is shown, wherein system 300 comprises at least one processor 302 and computer executable instructions (not shown) readable by at least one processor 302 and operative to use at least one first sensor 304 to sense at least one first physical movement of at least one first object 306, use at least one second sensor 308 to sense at least one second physical movement of at least one second object 310, and compare the at least one first physical movement of at least one first object 306 with the at least one second physical movement of at least one second object 310.
In some embodiments, at least one processor 302 may be any type of processor, including, but not limited to, a single core processor, a multi core processor, a video processor, and the like.
The computer executable instructions may be loaded directly on at least one processor 302, or may be stored in a storage means, such as, but not limited to, computer readable media, such as, but not limited to, a hard drive, a solid state drive, a flash memory, random access memory, CD-ROM, CD-R, CD-RW, DVD-ROM, DVD-R, DVD-RW, and the like. The computer executable instructions may be any type of computer executable instructions, which may be in the form of a computer program, the program being composed in any suitable programming language or source code, such as C++, C, JAVA, JavaScript, HTML, XML, and other programming languages.
In some embodiments, sensors 304 and 308 may be any type of sensor, including, but not limited to, a camera, an infrared camera, a panoramic sensor, a stereo sensor, a three dimensional sensor and/or camera (“3D sensor”), a thermal imaging camera, a video sensor, a digital camera, and the like. Sensors 304 and 308 may include a light source, such as an infrared light source, a laser, or a flash, which may be used to illuminate the objects in the sensor's field of view. In preferred embodiments, sensors 304 and 308 may include a field of view that encompasses the same field of view as, or larger than, a device (shown as 112 in FIGS. 1C and 1D), such as a display device. In a further embodiment, sensors 304 and 308 may include a one hundred and eighty degree field of view, such as by including an ultra wide angle lens. In embodiments where sensors 304 and 308 comprise a 3D sensor and/or camera, sensors 304 and 308 may be any type of 3D sensor and/or camera, such as a time of flight camera, a structured light camera, a modulated light camera, a triangulation camera, and the like, including, but not limited to, those cameras developed and manufactured by PMDTechnologies, GmbH, Am Eichenhang 50, D-57076 Siegen, Germany; Canesta, Inc., 1156 Sonora Court, Sunnyvale, Calif., 94086, USA; Primesense of Israel; Optrima, NV, Witherenstraat 4, 1040 Brussels, Belgium; and the Bidirectional Screen developed by the Massachusetts Institute of Technology. In an alternate embodiment, sensors 304 and 308 may comprise an accelerometer or gyroscopic sensor.
In one embodiment, at least one processor 302 may be electronically connected to at least one of at least one first sensor 304 or at least one second sensor 308.
In yet a further embodiment, at least one first sensor 304 may be positioned near a device 312 (e.g., a display device), such as, but not limited to, on top of or above device 312, in front of device 312, behind device 312, below device 312, to the side of device 312, or integrated with device 312, such as being built into device 312 (as shown in FIG. 3). In an alternate embodiment, at least one first sensor 304 may be positioned near at least one first object 306, such as, but not limited to, on top of or above at least one first object 306, in front of at least one first object 306, behind at least one first object 306, below at least one first object 306, or to the side of at least one first object 306. In yet another embodiment, at least one first sensor 304 may be integrated, connected, or attached to at least one first object 306.
In yet a further embodiment, at least one second sensor 308 may be positioned near at least one second object 310, such as, but not limited to, on top of or above at least one second object 310, in front of at least one second object 310, behind at least one second object 310, below at least one second object 310, or to the side of at least one second object 310. In yet another embodiment, at least one sensor 308 may be integrated, connected, or attached to at least one second object 310.
In some embodiments, at least one first object 306 may be an animate object, such as, but not limited to, a person (as shown in FIG. 3) or an animal. In an alternate embodiment, at least one first object 306 may be an inanimate object, such as a game controller, or a user instrument, such as, but not limited to, musical instruments, tools, household objects, and sports equipment, such as, but not limited to, a stick, a baseball bat, a ball, a racket, a club, a paddle, a ring, and the like. In a further embodiment, at least one first object 306 may comprise both animate and inanimate objects, such as, but not limited to, a person holding an instrument, such as a piece of sports equipment (as shown in FIG. 3).
Accordingly, in various embodiments, at least one first physical movement of at least one first object 306 may be any type of movement, including but not limited to, a dance movement, sports movement, such as, but not limited to, a throwing movement, a swinging movement, a jumping movement, a running movement, a diving movement, a flying movement, a shooting movement, a driving movement, a dancing movement, and the like, or any arbitrary movement.
In some embodiments, using at least one sensor to sense at least one first physical movement of at least one first object 306 comprises tracking at least one first physical movement of at least one first object 306. In a further embodiment, the at least one first movement may be stored and/or transmitted to a central station or a remote station, of which at least one processor 302 may be a part.
In some embodiments, at least one second object 310 may be an animate object, such as, but not limited to, a person (as shown in FIG. 3) or an animal. In an alternate embodiment, at least one second object 310 may be an inanimate object, such as a game controller, or a user instrument, such as, but not limited to, musical instruments, tools, household objects, and sports equipment, such as, but not limited to, a stick, a baseball bat, a ball, a racket, a club, a paddle, a ring, and the like. In a further embodiment, at least one second object 310 may comprise both animate and inanimate objects, such as, but not limited to, a person holding an instrument, such as a piece of sports equipment (as shown in FIG. 3).
Accordingly, in various embodiments, at least one second physical movement of at least one second object 310 may be any type of movement, including but not limited to, a dance movement, sports movement, such as, but not limited to, a throwing movement, a swinging movement, a jumping movement, a flying movement, a running movement, a diving movement, a shooting movement, a driving movement, a dancing movement, and the like, or any arbitrary movement.
In some embodiments, the second physical movement may be the physical movement of at least one second object 310 in at least one piece of content (not shown). Accordingly, it may be the movement of a person, such as a sports participant, or the object of a sport, such as a ball.
In some embodiments, using at least one sensor to sense at least one second physical movement of at least one second object 310 comprises tracking at least one second physical movement of at least one second object 310. In a further embodiment, the at least one second movement may be stored and/or transmitted to a central station or a remote station, of which at least one processor 302 may be a part.
In another embodiment, the computer executable instructions may be operative to determine whether the at least one first physical movement and the at least one second physical movement coincide, or, in other words, whether the physical movements of at least one first object 306 and at least one second object 310 coincide. In one embodiment, determining whether the at least one first physical movement and the at least one second physical movement coincide comprises determining whether the at least one first physical movement and the at least one second physical movement are similar.
The term “coincide,” as used herein, refers to the relationship of the at least one first physical movement and the at least one second physical movement, and may imply that the two movements match, correspond, or are related in a way so as to be complementary to each other or dovetail with each other. One such example is the physical movement of a ball in a sport, which coincides with the physical movement of a player or piece of equipment attempting to touch, strike, or block the ball. Another example is the physical movement of a player acting in a sport, such as by throwing a ball, running, and the like, which coincides with the physical movement of an audience member attempting to mimic, imitate, follow, or counteract the player's movements. The present disclosure should not be limited to physical movements in sports, but may be applied to other fields as well, such as the arts, including dance, and the like, and exercise, movies, or television programs.
In embodiments where at least one first object 306 is a person, determining whether the two physical movements coincide may include comparing the movement of a person watching a sports event to the movement of a person or object playing in the sports event (shown as object 310 in FIG. 3). In some embodiments, this may include comparing whether a person watching the content reacts appropriately to the activity going on in the content, such as in the case of a baseball game, where the person watching the baseball game may be attempting to imitate the swing of a baseball player playing in the baseball game, may be attempting to make a movement so as to virtually “hit” a ball that is being pitched in the game, or may be attempting to throw a ball to a receiver in a football game. In other embodiments, the activity in which at least one second object 310 is participating may be non-sport content, such as a movie, whereby a person watching the movie may be attempting to react in a certain way to the movie, such as by jumping up and down or making a running movement when such movement is called for in the movie.
In a further embodiment, the computer executable instructions may be operative to conduct a game based on whether the at least one first physical movement of at least one first object 306 and the at least one second physical movement of the at least one second object 310 coincide. In some embodiments, the game may include determining a score based on whether the at least one first physical movement of at least one first object 306 and the at least one second physical movement of at least one second object 310 coincide. In an exemplary embodiment, when it is determined that the at least one first physical movement of at least one first object 306 and the at least one second physical movement of the at least one second object 310 coincide, a user may be allotted a certain number of points. The number of points may vary based on how well the movements coincide.
In another embodiment, at least one processor 302 may be electronically connected to at least one of at least one sensor 304 and at least one sensor 308. In yet another embodiment, at least one processor may be positioned near, such as in the same location as, at least one of at least one sensor 304 and at least one sensor 308. In yet another embodiment, at least one processor 302 may be integrated with at least one of at least one sensor 304 and at least one sensor 308. In yet another embodiment, at least one processor 302 may be positioned at a location remote from at least one of at least one sensor 304 and at least one sensor 308, such as, but not limited to, by being located at a central station, such as a television station, or a server, such as a game server.
Referring now to FIG. 4, at least one embodiment of system 400 is shown, wherein system 400 comprises at least one processor 402, and computer executable instructions (not shown) readable by at least one processor 402 and operative to analyze at least one piece of content 404 to determine at least one first physical movement of at least one first object 406, use at least one second sensor 408 to sense at least one second physical movement of at least one second object 410, and compare the at least one first physical movement of at least one first object 406 with the at least one second physical movement of at least one second object 410.
In some embodiments, at least one processor 402 may be any type of processor, including, but not limited to, a single core processor, a multi core processor, a video processor, and the like.
The computer executable instructions may be loaded directly on at least one processor 402, or may be stored in a storage means, such as, but not limited to, computer readable media, such as, but not limited to, a hard drive, a solid state drive, a flash memory, random access memory, CD-ROM, CD-R, CD-RW, DVD-ROM, DVD-R, DVD-RW, and the like. The computer executable instructions may be any type of computer executable instructions, which may be in the form of a computer program, the program being composed in any suitable programming language or source code, such as C++, C, JAVA, JavaScript, HTML, XML, and other programming languages.
In some embodiments, at least one piece of content 404 may be any type of content, such as, but not limited to, sports content, movie content, live content, music content, arts content, news content, television content, game content, and the like. At least one piece of content 404 may be played on a device 412, such as a display device. The display device may be any type of display device, such as, but not limited to, a television, a computer monitor, a laptop screen, a mobile phone screen, a projector, a projection screen, or any other type of screen and/or display device.
In one embodiment, analyzing at least one piece of content 404 to determine at least one first physical movement of at least one first object 406 comprises using object and/or image recognition software, and/or gesture control software, such as those developed by Softkinetic S.A., 24 Avenue L. Mommaerts, Brussels, B-1140, Belgium, and Microsoft Corp. In another embodiment, analyzing at least one piece of content 404 to determine at least one first physical movement of at least one first object 406 comprises using at least one person to analyze at least one piece of content 404 and input information about the at least one first physical movement of at least one first object 406, or such information may be provided by a user. In some embodiments, the at least one first physical movement of at least one first object 406 may be captured using at least one sensor, which may be similar to at least one sensor 408.
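The content-analysis step described above can be illustrated with a toy sketch. This sketch is an assumption for illustration only, and is not the recognition software named in this disclosure: it locates a bright object's centroid in two small synthetic frames and reports the object's displacement between them; a real system would apply object and/or image recognition software to decoded broadcast video.

```python
# Illustrative sketch: estimate the movement of a bright object between
# two video frames by locating its intensity centroid in each frame.
# Frames are plain 2-D lists of pixel intensities (0-255).

def centroid(frame, threshold=128):
    """Return the (row, col) centroid of pixels above `threshold`."""
    rows, cols, count = 0.0, 0.0, 0
    for r, line in enumerate(frame):
        for c, px in enumerate(line):
            if px > threshold:
                rows += r
                cols += c
                count += 1
    if count == 0:
        return None  # no object visible in this frame
    return (rows / count, cols / count)

def movement_between(frame_a, frame_b):
    """Displacement (d_row, d_col) of the tracked object between frames."""
    a, b = centroid(frame_a), centroid(frame_b)
    if a is None or b is None:
        return None
    return (b[0] - a[0], b[1] - a[1])

# A single bright "ball" pixel moving one column to the right:
f1 = [[0, 255, 0, 0],
      [0, 0, 0, 0]]
f2 = [[0, 0, 255, 0],
      [0, 0, 0, 0]]
print(movement_between(f1, f2))  # (0.0, 1.0)
```

A tracked sequence of such displacements is one simple way a "physical movement" could be represented for later comparison.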
In some embodiments, at least one sensor 408 may be any type of sensor, including, but not limited to, a camera, an infrared camera, a panoramic sensor, a stereo sensor, a three dimensional sensor and/or camera (“3D sensor”), a thermal imaging camera, a video sensor, a digital camera, and the like. At least one sensor 408 may include a light source, such as an infrared light source, a laser, or a flash, which may be used to illuminate the objects in the sensor's field of view. In preferred embodiments, at least one sensor 408 may include a field of view that encompasses the same field of view as, or larger than, a device (shown as 412 in FIG. 4), such as a display device. In a further embodiment, at least one sensor 408 may include a one hundred and eighty degree field of view, such as by including an ultra wide angle lens. In embodiments where at least one sensor 408 comprises a 3D sensor and/or camera, at least one sensor 408 may be any type of 3D sensor and/or camera, such as a time of flight camera, a structured light camera, a modulated light camera, a triangulation camera, and the like, including, but not limited to, those cameras developed and manufactured by PMDTechnologies, GmbH, Am Eichenhang 50, D-57076 Siegen, Germany; Canesta, Inc., 1156 Sonora Court, Sunnyvale, Calif., 94086, USA; Primesense of Israel; Optrima, NV, Witherenstraat 4, 1040 Brussels, Belgium; and the Bidirectional Screen developed by the Massachusetts Institute of Technology. In an alternate embodiment, at least one sensor 408 may comprise an accelerometer or gyroscopic sensor.
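Of the 3D sensor types listed above, a time of flight camera estimates depth from the round-trip travel time of emitted light: distance equals the speed of light multiplied by the round-trip time, divided by two. The arithmetic below is standard physics; the function name is an illustrative assumption, not part of any named vendor's API.

```python
# Time-of-flight depth sketch: distance = (speed of light * round trip) / 2.

C = 299_792_458.0  # speed of light in vacuum, meters per second

def tof_distance_m(round_trip_seconds):
    """Distance to a reflecting object given the light round-trip time."""
    return C * round_trip_seconds / 2.0

# A return delay of ~13.34 ns corresponds to roughly 2 meters:
print(round(tof_distance_m(13.34e-9), 2))  # 2.0
```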
In one embodiment, at least one processor 402 may be electronically connected to, or in electronic communication with, at least one sensor 408.
In yet a further embodiment, at least one sensor 408 may be positioned near a device 412 (e.g., a display device), such as, but not limited to, on top of or above device 412, in front of device 412, behind device 412, below device 412, to the side of device 412, or integrated with device 412, such as being built into device 412 (as shown in FIG. 1E). In an alternate embodiment, at least one sensor 408 may be positioned near at least one second object 410, such as, but not limited to, on top of or above at least one second object 410, in front of at least one second object 410, behind at least one second object 410, below at least one second object 410, or to the side of at least one second object 410. In yet another embodiment, at least one sensor 408 may be integrated with, connected to, or attached to at least one second object 410.
In some embodiments, at least one first object 406 may be an animate object, such as, but not limited to, a person (as shown in FIG. 4) or an animal. In an alternate embodiment, at least one first object 406 may be an inanimate object, such as a game controller, or a user instrument, such as, but not limited to, musical instruments, tools, household objects, and sports equipment, such as, but not limited to, a stick, a baseball bat, a ball, a racket, a club, a paddle, a ring, and the like. In a further embodiment, at least one first object 406 may comprise both animate and inanimate objects, such as, but not limited to, a person holding an instrument, such as a piece of sports equipment (as shown in FIG. 4).
Accordingly, in various embodiments, at least one first physical movement of at least one first object 406 may be any type of movement, including, but not limited to, a dance movement or a sports movement, such as, but not limited to, a throwing movement, a swinging movement, a jumping movement, a running movement, a diving movement, a flying movement, a shooting movement, a driving movement, a dancing movement, and the like, or any arbitrary movement.
In some embodiments, using at least one sensor 408 to sense at least one second physical movement of at least one second object 410 comprises tracking at least one second physical movement of at least one second object 410.
In some embodiments, at least one second object 410 may be an animate object, such as, but not limited to, a person (as shown in FIG. 4) or an animal. In an alternate embodiment, at least one second object 410 may be an inanimate object, such as a game controller, or a user instrument, such as, but not limited to, musical instruments, tools, household objects, and sports equipment, such as, but not limited to, a stick, a baseball bat, a ball, a racket, a club, a paddle, a ring, and the like. In a further embodiment, at least one second object 410 may comprise both animate and inanimate objects, such as, but not limited to, a person holding an instrument, such as a piece of sports equipment (as shown in FIG. 4).
Accordingly, in various embodiments, at least one second physical movement of at least one second object 410 may be any type of movement, including, but not limited to, a dance movement or a sports movement, such as, but not limited to, a throwing movement, a swinging movement, a jumping movement, a flying movement, a running movement, a diving movement, a shooting movement, a driving movement, a dancing movement, and the like, or any arbitrary movement.
In some embodiments, the second physical movement may be the physical movement of at least one second object 410 in at least one piece of content (not shown). Accordingly, it may be the movement of a person, such as a sports participant, or of an object used in the sport, such as a ball.
In some embodiments, using at least one sensor to sense at least one second physical movement of at least one second object 410 comprises tracking at least one second physical movement of at least one second object 410. In a further embodiment, the at least one second physical movement may be stored and/or transmitted to a central station or a remote station, of which at least one processor 402 may be a part.
In another embodiment, the computer executable instructions may be operative to determine whether the at least one first physical movement and the at least one second physical movement coincide, or, in other words, whether the physical movements of at least one first object 406 and at least one second object 410 coincide. In one embodiment, determining whether the at least one first physical movement and the at least one second physical movement coincide comprises determining whether the at least one first physical movement and the at least one second physical movement are similar.
The term “coincide,” as used herein, refers to the relationship of the at least one first physical movement and the at least one second physical movement, and may imply that the two movements match, correspond, or are otherwise related so as to be complementary to each other or to dovetail with each other. One such example is the physical movement of a ball in a sport, which coincides with the physical movement of a player or piece of equipment attempting to touch, strike, or block the ball. Another example is the physical movement of a player acting in a sport, such as by throwing a ball, running, and the like, which coincides with the physical movement of an audience member attempting to mimic, imitate, follow, or counteract the player's movements. The present disclosure should not be limited to physical movements in sports, but may be applied to other fields as well, such as the arts, including dance and the like, as well as exercise, movies, or television programs.
In embodiments where at least one second object 410 is a person, determining whether the two physical movements coincide may include comparing the movement of a person watching a sports event to the movement of a person or object playing in the sports event (shown as object 406 in FIG. 4). In some embodiments, this may include comparing whether a person watching the content reacts appropriately to the activity going on in the content, such as in the case of a baseball game, where the person watching the baseball game may be attempting to imitate the swing of a baseball player playing in the baseball game, may be attempting to make a movement so as to virtually “hit” a ball that is being pitched in the game, or may be attempting to throw a ball to a receiver in a football game. In other embodiments, the activity in which at least one first object 406 is participating may be non-sport content, such as a movie, whereby a person watching the movie may be attempting to react in a certain way to the movie, such as by jumping up and down or making a running movement when such movement is called for in the movie.
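As a hedged sketch of one way such a coincidence determination might be computed (an illustrative assumption; this disclosure does not prescribe a particular algorithm), each physical movement can be represented as a sequence of displacement samples and the two sequences compared by cosine similarity against a chosen threshold:

```python
# Illustrative coincidence test: two movements "coincide" when their
# flattened displacement samples point in sufficiently similar directions.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length numeric vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def movements_coincide(move1, move2, threshold=0.9):
    """Flatten each movement's (x, y) samples and compare direction."""
    flat1 = [v for step in move1 for v in step]
    flat2 = [v for step in move2 for v in step]
    return cosine_similarity(flat1, flat2) >= threshold

swing_player = [(1.0, 0.2), (2.0, 0.1), (1.5, -0.3)]  # movement in the content
swing_viewer = [(0.9, 0.3), (2.1, 0.0), (1.4, -0.2)]  # audience member's movement
print(movements_coincide(swing_player, swing_viewer))  # True
```

The threshold value and the displacement representation are assumptions; a deployed system might instead use dynamic time warping or skeleton-pose matching supplied by the recognition software mentioned above.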
In a further embodiment, the computer executable instructions may be operative to conduct a game based on whether the at least one first physical movement of at least one first object 406 and the at least one second physical movement of the at least one second object 410 coincide. In some embodiments, the game may include determining a score based on whether the at least one first physical movement of at least one first object 406 and the at least one second physical movement of at least one second object 410 coincide. In an exemplary embodiment, when it is determined that the at least one first physical movement of at least one first object 406 and the at least one second physical movement of the at least one second object 410 coincide, a user may be allotted a certain number of points. The number of points may vary based on how well the movements coincide.
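The point-allotment idea above can be sketched as follows; the similarity tiers and point values are illustrative assumptions, not values taken from this disclosure:

```python
# Illustrative scoring: the better the two movements coincide (here a
# similarity value in [0, 1]), the more points the user is allotted.

def allot_points(similarity, max_points=100):
    """Map a movement-similarity score in [0, 1] to game points."""
    if similarity >= 0.95:
        return max_points        # near-perfect mimicry
    if similarity >= 0.75:
        return max_points // 2   # good partial match
    if similarity >= 0.50:
        return max_points // 4   # loose match
    return 0                     # movements do not coincide

print(allot_points(0.97), allot_points(0.8), allot_points(0.3))  # 100 50 0
```

A continuous scheme (e.g., points proportional to similarity) would serve equally well; the tiers merely show how the score can "vary based on how well the movements coincide."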
In another embodiment, at least one processor 402 may be electronically connected to at least one sensor 408. In yet another embodiment, at least one processor may be positioned near, such as in the same location as, at least one sensor 408. In yet another embodiment, at least one processor 402 may be integrated with at least one sensor 408. In yet another embodiment, at least one processor 402 may be positioned at a location remote from at least one sensor 408, such as, but not limited to, by being located at a central station, such as a television station, or a server, such as a game server.
Referring now to FIG. 5, one embodiment of system 500 is shown, comprising at least one sensor module 504 to sense at least one first physical movement of at least one first object and at least one processor module 502 to compare the at least one first physical movement of at least one first object with at least one second physical movement of at least one second object in at least one piece of content.
In one embodiment, at least one processor module 502 may comprise a software aspect, such as, but not limited to, processing software, such as a processing computer program, an operating system, and the like. In another embodiment, at least one processor module 502 may comprise a hardware aspect, such as a computer processor. In some embodiments, at least one processor module 502 may be any type of processor, including, but not limited to, a single core processor, a multi core processor, a video processor, and the like. In yet another embodiment, at least one processor module 502 may comprise both a software aspect and a hardware aspect.
In some embodiments, at least one sensor module 504 may comprise a hardware aspect, such as, but not limited to, a sensor. In such embodiments, at least one sensor module 504 may be any type of sensor module, including, but not limited to, a camera, an infrared camera, a panoramic sensor, a stereo sensor, a three dimensional sensor and/or camera (“3D sensor”), a thermal imaging camera, a video sensor, a digital camera, and the like. At least one sensor module 504 may include a light source, such as an infrared light source, a laser, or a flash, which may be used to illuminate the objects in the sensor's field of view. In preferred embodiments, at least one sensor module 504 may include a field of view that encompasses the same field of view as, or larger than, a device, such as a display device. In a further embodiment, at least one sensor module 504 may include a one hundred and eighty degree field of view, such as by including an ultra wide angle lens. In embodiments where at least one sensor module 504 comprises a 3D sensor and/or camera, at least one sensor module 504 may be any type of 3D sensor and/or camera, such as a time of flight camera, a structured light camera, a modulated light camera, a triangulation camera, and the like, including, but not limited to, those cameras developed and manufactured by PMDTechnologies, GmbH, Am Eichenhang 50, D-57076 Siegen, Germany; Canesta, Inc., 1156 Sonora Court, Sunnyvale, Calif., 94086, USA; Primesense of Israel; Optrima, NV, Witherenstraat 4, 1040 Brussels, Belgium; and the Bidirectional Screen developed by the Massachusetts Institute of Technology. In an alternate embodiment, at least one sensor module 504 may comprise an accelerometer or gyroscopic sensor.
In yet another embodiment, at least one sensor module 504 may comprise a software aspect, such as a computer program. In such an embodiment, at least one sensor module 504 may comprise sensor and/or camera firmware, object or image recognition software, and/or gesture control software, such as those developed by Softkinetic S.A., 24 Avenue L. Mommaerts, Brussels, B-1140, Belgium, and Microsoft Corp.
In yet another embodiment, at least one sensor module 504 may comprise both a software aspect and a hardware aspect.
In one embodiment, at least one processor module 502 may be electronically connected to, in electronic communication with, or in software communication with at least one sensor module 504.
In embodiments where at least one first object is a person, at least one processor module 502 may be operative to determine whether the two physical movements coincide, which may include comparing the movement of a person watching a sports event to the movement of a person or object playing in the sports event. In some embodiments, this may include comparing whether a person watching the at least one piece of content reacts appropriately to the activity going on in the content, such as in the case of a baseball game, where the person watching the baseball game may be attempting to imitate the swing of a baseball player playing in the baseball game, may be attempting to make a movement so as to virtually “hit” a ball that is being pitched in the game being played as the at least one piece of content, or may be attempting to throw a ball to a receiver in a football game. In other embodiments, the at least one piece of content may be non-sport content, such as a movie, whereby a person watching the movie may be attempting to react in a certain way to the movie, such as by jumping up and down or making a running movement when such movement is called for in the movie.
In a further embodiment, system 500 comprises at least one display module to display the at least one piece of content. The at least one display module may comprise a hardware aspect, such as a display device, which may include, but is not limited to, a television, a projector and/or projector screen, a computer monitor, and the like. In another embodiment, the at least one display module may comprise a software aspect, such as a display computer program or software, such as, but not limited to, a media player, display drivers, and the like. In yet another embodiment, the at least one display module may comprise both a hardware aspect and a software aspect.
In yet another embodiment, system 500 may comprise at least one communications module to transmit or receive information. The information may comprise any type of information, such as, but not limited to, game information based on the physical movements of the objects, or information on the physical movements of the at least one first object and the at least one second object, such as those types of information described above with reference to FIGS. 1 through 4.
In one embodiment, the at least one communications module may comprise a hardware aspect, such as, but not limited to, wireless communications hardware, such as, but not limited to, a wireless modem, a GSM modem, a Wi-Fi modem, an antenna, a satellite modem, a Bluetooth modem, and the like, or wired hardware, such as a DSL modem, a cable modem, a network card, a telephone modem, and the like. In yet another embodiment, the at least one communications module may comprise a software aspect, such as, but not limited to, a computer program or software, such as, but not limited to, a communications program, a communications protocol, and the like. In yet another embodiment, the at least one communications module may comprise both a hardware aspect and a software aspect.
Referring now to FIG. 6, at least one embodiment of system 600 is shown, wherein system 600 comprises at least one processor module 602 to compare at least one first physical movement of at least one first object with at least one second physical movement of at least one second object, and at least one communications module 604 to transmit or receive information.
In one embodiment, at least one processor module 602 may comprise a software aspect, such as, but not limited to, processing software, such as a processing computer program, an operating system, and the like. In another embodiment, at least one processor module 602 may comprise a hardware aspect, such as a computer processor. In some embodiments, at least one processor module 602 may be any type of processor, including, but not limited to, a single core processor, a multi core processor, a video processor, and the like. In yet another embodiment, at least one processor module 602 may comprise both a software aspect and a hardware aspect.
The information may comprise any type of information, such as, but not limited to, game information based on the physical movements of the objects, or information on the physical movements of the at least one first object and the at least one second object, such as those types of information described above with reference to FIGS. 1 through 4.
In one embodiment, at least one communications module 604 may comprise a hardware aspect, such as, but not limited to, wireless communications hardware, such as, but not limited to, a wireless modem, a GSM modem, a Wi-Fi modem, an antenna, a satellite modem, a Bluetooth modem, and the like, or wired hardware, such as a DSL modem, a cable modem, a network card, a telephone modem, and the like. In yet another embodiment, at least one communications module 604 may comprise a software aspect, such as, but not limited to, a computer program or software, such as, but not limited to, a communications program, a communications protocol, and the like. In yet another embodiment, at least one communications module 604 may comprise both a hardware aspect and a software aspect.
In yet another embodiment, system 600 may comprise at least one server module to host a game. In some embodiments, the game may be a game based on the at least one first physical movement of the at least one first object and the at least one second physical movement of the at least one second object. In yet another embodiment, the game may include determining whether the at least one first physical movement of the at least one first object and the at least one second physical movement of the at least one second object coincide.
In some embodiments, the at least one server module may comprise a hardware aspect, such as, but not limited to, server hardware, such as, but not limited to, a server processor, a server computer, and the like. In yet another embodiment, the at least one server module may comprise a software aspect, such as, but not limited to, a computer program and/or software, such as, but not limited to, server software, game hosting software, and the like. In a further embodiment, the at least one server module may comprise both a software aspect and a hardware aspect.
Referring now to FIG. 7A, at least one embodiment of system 700 is shown, wherein system 700 comprises at least one display module 702 to display at least one piece of content, and at least one input module 704 to receive information about at least one physical movement of at least one object in the at least one piece of content.
In one embodiment, at least one display module 702 may comprise a hardware aspect, such as a display device, which may include, but is not limited to, a television, a projector and/or projector screen, a computer monitor, and the like. In another embodiment, at least one display module 702 may comprise a software aspect, such as, but not limited to, a display computer program or software, such as, but not limited to, a media player, display drivers, and the like. In yet another embodiment, at least one display module 702 may comprise both a hardware aspect and a software aspect.
In yet another embodiment, at least one input module 704 may comprise a hardware aspect, such as input hardware and/or a device, which may include, but is not limited to, a mouse, a keyboard, buttons, a touch screen, or a gesture control system. In another embodiment, at least one input module 704 may comprise a software aspect, such as, but not limited to, input software or an input computer program, which may include, but is not limited to, a user interface, a gesture user interface, input software, keystroke or input interpretation software, and the like. In yet another embodiment, at least one input module 704 may comprise both a hardware aspect and a software aspect.
In some embodiments, at least one piece of content may be any type of content, such as, but not limited to, sports content, movie content, live content, music content, arts content, news content, television content, game content, and the like.
In other embodiments, the information about at least one physical movement of at least one object in the at least one piece of content may be any type of information, such as, but not limited to, the timing of the physical movement, the type of physical movement, the effectiveness of the physical movement, the result of the physical movement, the path of the physical movement, and the like.
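One way the movement information enumerated above (timing, type, effectiveness, result, path, and so on) might be organized when received via the input module is sketched below; the structure and field names are purely illustrative assumptions, not part of this disclosure:

```python
# Illustrative record for information about a physical movement in content.
from dataclasses import dataclass, field

@dataclass
class MovementInfo:
    timestamp_s: float   # timing of the physical movement within the content
    movement_type: str   # type of movement, e.g. "swing", "throw", "jump"
    effective: bool      # effectiveness of the movement
    result: str          # result of the movement, e.g. "home run", "strike"
    path: list = field(default_factory=list)  # sampled (x, y) positions

swing = MovementInfo(
    timestamp_s=3721.4,
    movement_type="swing",
    effective=True,
    result="home run",
    path=[(0.0, 1.2), (0.4, 1.5), (0.9, 1.7)],
)
print(swing.movement_type, swing.effective)  # swing True
```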
In some embodiments, the at least one object may be an animate object, such as, but not limited to, a person (as shown in FIG. 7B) or an animal. In an alternate embodiment, the at least one object may be an inanimate object, such as a game controller, or a user instrument, such as, but not limited to, musical instruments, tools, household objects, and sports equipment, such as, but not limited to, a stick, a baseball bat, a ball, a racket, a club, a paddle, a ring, and the like. In a further embodiment, the at least one object may comprise both animate and inanimate objects, such as, but not limited to, a person holding an instrument, such as a piece of sports equipment (as shown in FIG. 7B).
Accordingly, in various embodiments, at least one first physical movement of the at least one object may be any type of movement, including, but not limited to, a dance movement or a sports movement, such as, but not limited to, a throwing movement, a swinging movement, a jumping movement, a running movement, a diving movement, a flying movement, a shooting movement, a driving movement, a dancing movement, and the like, or any arbitrary movement.
In a further embodiment, system 700 may comprise at least one communications module to transmit or receive information. The information may include the information that is being received via the input module. In one embodiment, the at least one communications module may comprise a hardware aspect, such as, but not limited to, wireless communications hardware, such as, but not limited to, a wireless modem, a GSM modem, a Wi-Fi modem, an antenna, a satellite modem, a Bluetooth modem, and the like, or wired hardware, such as a DSL modem, a cable modem, a network card, a telephone modem, and the like. In yet another embodiment, the at least one communications module may comprise a software aspect, such as, but not limited to, a computer program or software, such as, but not limited to, a communications program, a communications protocol, and the like. In yet another embodiment, the at least one communications module may comprise both a hardware aspect and a software aspect.
Referring now to FIG. 7B, one embodiment of system 700 is shown wherein at least one display module 702 comprises at least one display device and/or display software, at least one input module 704 comprises user interface hardware and/or software, such as a keyboard, at least one piece of content 706 comprises sports content, and at least one physical object 708 comprises a person and/or a piece of sports equipment.
In one embodiment, system 700 may comprise at least one set of, or a plurality of, display modules 702 and input modules 704, wherein said modules may be located at a central monitoring station, such as at a television station or studio, or an office building.
Overview of Method Embodiments
Referring now to FIG. 8, a method 800 is shown, wherein method 800 comprises using at least one processor to perform at least one of the following: tracking at least one first physical movement of at least one first object (block 802), and comparing the at least one first physical movement of the at least one first object with at least one second physical movement of at least one second object in at least one piece of content (block 804).
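The two blocks of method 800 can be sketched in simplified form: derive the first movement from successive sensor position samples, then compare it element-wise against the second movement taken from the content. The position-sampling representation and the tolerance value are assumptions for illustration only:

```python
# Illustrative sketch of method 800's two blocks.

def track_movement(sensor_samples):
    """Block 802: derive displacements from successive (x, y) positions."""
    return [(b[0] - a[0], b[1] - a[1])
            for a, b in zip(sensor_samples, sensor_samples[1:])]

def compare_movements(first, second, tolerance=0.5):
    """Block 804: element-wise comparison of two displacement sequences."""
    if len(first) != len(second):
        return False
    return all(abs(f[0] - s[0]) <= tolerance and abs(f[1] - s[1]) <= tolerance
               for f, s in zip(first, second))

viewer_positions = [(0.0, 0.0), (1.0, 0.1), (2.1, 0.1)]  # sensed positions
content_movement = [(1.0, 0.0), (1.0, 0.0)]  # movement taken from the content
first_movement = track_movement(viewer_positions)
print(compare_movements(first_movement, content_movement))  # True
```

A full implementation would obtain the position samples from one of the sensors described above rather than from hard-coded values.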
In one embodiment, the at least one processor may be any type of processor, such as those processors and processor modules described above with reference to FIGS. 1A through 7B.
In another embodiment, tracking at least one first physical movement of at least one first object may include using at least one sensor to track the at least one first physical movement of at least one first object. The at least one sensor may be any type of sensor, such as a 3D sensor, and those other embodiments of sensors described above with reference to FIGS. 1A through 7B.
In yet another embodiment, the at least one first physical movement and/or the at least one second physical movement may be any type of movement, such as those various physical movements described above with reference to FIGS. 1A through 7B.
In yet a further embodiment, at least one first object and/or at least one second object may be any type of object, such as those various embodiments of objects described above with reference to FIGS. 1A through 7B.
In yet a further embodiment, at least one piece of content may be any type of content, such as those various embodiments of content described above with reference to FIGS. 1A through 7B.
In yet a further embodiment, method 800 comprises using at least one processor to determine whether the at least one first physical movement of the at least one first object and the at least one second physical movement of the at least one second object coincide. In some embodiments, determining whether the two physical movements coincide may include using object and/or image recognition software (such as those described above with reference to FIGS. 1A through 7B) to analyze and/or extract the physical movements from an image captured using a sensor, such as those sensors described above with reference to FIGS. 1A through 7B.
In yet another embodiment, method 800 may further comprise using at least one processor to conduct a game based on whether the at least one first physical movement of the at least one first object and the at least one second physical movement of the at least one second object coincide. In some embodiments, conducting a game may comprise allotting points based on how well the two physical movements coincide.
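The disclosure does not give code for the comparison and point-allotment steps. A minimal sketch of one possible approach follows, under the assumptions that each tracked movement is represented as a time-ordered sequence of (x, y, z) positions, that a simple mean-distance metric stands in for the object/image recognition software, and that the 1-metre error falloff is an arbitrary illustrative constant; the function names are hypothetical:

```python
import math

def movement_similarity(first, second):
    """Compare two physical movements, each a list of (x, y, z) positions
    sampled over time, and return a 0.0-1.0 score; 1.0 means the
    movements coincide exactly."""
    if len(first) != len(second) or not first:
        return 0.0
    total = sum(math.dist(p, q) for p, q in zip(first, second))
    mean_error = total / len(first)
    # Map mean positional error (metres) to a score; the linear 1-metre
    # falloff is an illustrative choice, not taken from the disclosure.
    return max(0.0, 1.0 - mean_error)

def allot_points(first, second, max_points=100):
    """Allot game points in proportion to how well the movements coincide."""
    return round(movement_similarity(first, second) * max_points)
```

In practice a system of this kind would also need to time-align and scale-normalize the two movement sequences before comparison, which this sketch omits.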
Referring now to FIG. 9, one embodiment of method 900 is shown, wherein method 900 comprises using at least one processor to perform at least one of the following: comparing at least one first physical movement of at least one first object with at least one second physical movement of at least one second object (block 902), and determining whether the at least one first physical movement of the at least one first object and the at least one second physical movement of the at least one second object coincide (block 904).
In some embodiments, the at least one processor may be any type of processor and/or processing software, such as, but not limited to, those embodiments described above with reference to FIGS. 1A through 8.
In other embodiments, the at least one first physical movement and the at least one second physical movement may be any type of movement, such as, but not limited to, those types of physical movement described above with reference to FIGS. 1A through 8.
In yet other embodiments, at least one first object and at least one second object may be any type of object, such as those described above with reference to FIGS. 1A through 8.
In one embodiment, method 900 may comprise using at least one processor to transmit or receive information about at least one of the at least one first physical movement of the at least one first object and the at least one second physical movement of the at least one second object.
In other embodiments, the information about at least one physical movement of at least one object in the at least one piece of content may be any type of information, such as, but not limited to, the timing of the physical movement, the type of physical movement, the effectiveness of the physical movement, the result of the physical movement, the path of the physical movement, and the like.
In a further embodiment, using at least one processor to transmit or receive information may comprise using at least one communications means to transmit or receive information, wherein the communications means may comprise a wired communications means, such as a land line modem, cable modem, DSL modem, and the like, or a wireless communications means, such as a wireless modem, a GSM modem, a satellite modem, a Wi-Fi adapter, and the like.
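As one illustration of transmitting or receiving such movement information over any of the communications means listed above, the kinds of information enumerated earlier (timing, type, result, path) could be serialized before being handed to a modem or Wi-Fi adapter. The field names and the JSON encoding below are assumptions made for the sketch, not part of the disclosure:

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class MovementInfo:
    # Illustrative fields mirroring the kinds of information described
    # above: the timing, type, result, and path of a physical movement.
    timestamp: float
    movement_type: str
    result: str
    path: list = field(default_factory=list)  # [x, y, z] samples over time

def encode_for_transmission(info: MovementInfo) -> bytes:
    """Serialize movement information so it can be sent over any
    communications means (wired or wireless modem, Wi-Fi adapter, etc.)."""
    return json.dumps(asdict(info)).encode("utf-8")

def decode_received(payload: bytes) -> MovementInfo:
    """Reconstruct movement information on the receiving side."""
    return MovementInfo(**json.loads(payload.decode("utf-8")))
```

Any wire format (binary, protocol buffers, etc.) would serve equally well; JSON is used here only because it keeps the example self-contained.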
In another embodiment, method 900 may comprise using at least one processor to analyze at least one piece of content to determine information about at least one of the at least one first physical movement of the at least one first object and the at least one second physical movement of the at least one second object. In one embodiment, using at least one processor to analyze at least one piece of content may comprise using image and/or object recognition software to analyze the at least one piece of content, such as those types of image and/or object recognition software described above with reference to FIGS. 1A through 8. In some embodiments, the at least one piece of content may be any type of content, such as those described above with reference to FIGS. 1A through 8.
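The frame-by-frame analysis described here can be sketched generically. In the sketch below, the `recognize_object` callable is a hypothetical stand-in for the image/object recognition software, since the disclosure does not name a particular library; it takes one frame of content and returns the detected object's position, or `None` when the object is not found:

```python
def extract_movement(frames, recognize_object):
    """Extract a physical movement from a piece of content by running an
    object-recognition routine on each frame and collecting the detected
    object's position over time, yielding the path of the movement."""
    path = []
    for frame in frames:
        position = recognize_object(frame)
        if position is not None:  # skip frames where the object is occluded
            path.append(position)
    return path
```

The resulting path is exactly the kind of per-movement information (timing, trajectory) that the preceding embodiments transmit, receive, and compare.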
In a further embodiment, which may also be applied to the various embodiments of objects described throughout the present disclosure, at least one of the at least one first object and the at least one second object may be a subject of at least one piece of content. In another embodiment, which may also be applied to the various embodiments of objects described throughout the present disclosure, at least one of the at least one first object and the at least one second object may be a spectator of at least one piece of content.
Overview of Computer Readable Medium Embodiments
Another embodiment may comprise a computer readable medium having computer executable instructions operative to use at least one sensor to sense at least one first physical movement of at least one first object, and compare the at least one first physical movement of at least one first object with at least one second physical movement of at least one second object in at least one piece of content.
The computer readable medium may be any type of computer readable medium, such as, but not limited to, a memory chip, a hard drive, a CD-ROM, a DVD-ROM, a CD-RW, a USB memory stick, flash memory, random access memory, and the like.
Another embodiment may comprise a computer readable medium having computer executable instructions operative to perform any of the operations previously described with reference to FIGS. 1A through 9.
Overview of Hardware and Operating Environment
This section provides an overview of example hardware and operating environments in conjunction with which embodiments of the inventive subject matter can be implemented.
A software program may be launched from a computer readable medium in a computer-based system to execute functions defined in the software program. Various programming languages may be employed to create software programs designed to implement and perform the methods disclosed herein. The programs may be structured in an object-oriented format using an object-oriented language such as Java or C++. Alternatively, the programs may be structured in a procedure-oriented format using a procedural language, such as assembly or C. The software components may communicate using a number of mechanisms, such as application program interfaces or inter-process communication techniques, including remote procedure calls. The teachings of various embodiments are not limited to any particular programming language or environment. Thus, other embodiments may be realized, as discussed regarding FIG. 10 below.
FIG. 10 is a block diagram representing an article according to various embodiments. Such embodiments may comprise a computer, a memory system, a magnetic or optical disk, some other storage device, or any type of electronic device or system. The article 1000 may include one or more processor(s) 1002 coupled to a machine-accessible medium such as a memory 1004 (e.g., a memory including electrical, optical, or electromagnetic elements). The medium may contain associated information 1006 (e.g., computer program instructions, data, or both) which, when accessed, results in a machine (e.g., the processor(s) 1002) performing the activities previously described herein.
The principles of the present disclosure may be applied to all types of computers, systems, and the like, including desktop computers, servers, notebook computers, personal digital assistants, and the like. However, the present disclosure is not limited to personal computers.
While the principles of the disclosure have been described herein, it is to be understood by those skilled in the art that this description is made only by way of example and not as a limitation as to the scope of the disclosure. Other embodiments are contemplated within the scope of the present disclosure in addition to the exemplary embodiments shown and described herein. Modifications and substitutions by one of ordinary skill in the art are considered to be within the scope of the present disclosure.
Claims
- A system comprising: i) at least one processor; and ii) computer executable instructions readable by the at least one processor and operative to: a) use a display device to display at least one live televised sporting event to at least one television audience member of the at least one live televised sporting event, wherein the at least one live televised sporting event is a television broadcast of at least one live sporting event occurring in a first physical location, and wherein the at least one television audience member is a person located at a second physical location, different and remote from the first physical location, and observing the television broadcast of the at least one live sporting event; b) use at least one first 3D camera to track at least one first physical movement of at least one first object or person participating in the at least one live sporting event, wherein the at least one first 3D camera is physically positioned in the first physical location; c) use at least one second 3D camera to track at least one second physical movement of the at least one television audience member or at least one second object used by the at least one television audience member, while the at least one television audience member is attempting to mimic the at least one first physical movement, wherein the at least one second 3D camera is physically positioned in the second physical location; d) compare the at least one first physical movement with at least one second physical movement of the at least one television audience member of the at least one live televised sporting event or the at least one object used by the at least one television audience member; and e) based on the comparing, determine, in a video game scenario, if the tracked second physical movement of the at least one television audience member or the at least one object used by the at least one television audience member mimics the tracked at least one first physical movement of the at least one first object or person participating in the at least one live televised sporting event.
- The system of claim 1, wherein the computer executable instructions are further operative to: transmit or receive information to or from at least one central or remote station.
- The system of claim 2, wherein the information comprises: i) information related to the at least one live televised sporting event; ii) information related to the at least one first physical movement or the at least one second physical movement; iii) information related to the video game based on the at least one first physical movement or the at least one second physical movement; iv) information related to the at least one television audience member or object used by the at least one television audience member; or v) information related to the at least one first object or person participating in the at least one live televised sporting event.
- The system of claim 1, wherein the video game is conducted during the television broadcast of the at least one live sporting event.
- The system of claim 1, wherein the computer executable instructions are further operative to determine a score based on whether the at least one television audience member or the at least one object used by the at least one television audience member mimics the at least one first physical movement of the at least one first object or person participating in the at least one live televised sporting event.
- A system comprising: i) at least one first 3D camera module physically positioned in a first physical location to track at least one first physical movement of at least one audience member of at least one piece of televised content, or at least one object used by the at least one audience member located at the first physical location observing the at least one piece of televised content, wherein the first physical location is different and remote from a second physical location where the at least one piece of televised content is broadcasted, wherein the tracking occurs while the at least one audience member is attempting to mimic at least one second physical movement of at least one participant or subject of the at least one piece of televised content or at least one object used by the at least one participant or subject, and wherein the at least one second physical movement is tracked using at least one second 3D camera physically positioned at the second physical location; and ii) at least one processor module to: a) compare the at least one first physical movement with at least one second physical movement of at least one object or person participating in the at least one piece of televised content; and b) based on the comparing, determine, in a video game scenario, if the tracked first physical movement of the at least one audience member or the at least one object used by the at least one audience member mimics the at least one second physical movement of the at least one object or person participating in the at least one piece of televised content.
- The system of claim 6, wherein the video game is conducted during the broadcast of the at least one piece of televised content.
- The system of claim 6, further comprising at least one display module to display the at least one piece of televised content to the at least one audience member.
- The system of claim 6, further comprising at least one communications module to transmit or receive information to or from a central station or remote station.
- A method comprising: a) using at least one processor configured to perform: i) using at least one first 3D camera physically positioned in a first physical location to track at least one first physical movement of at least one television content audience member of at least one televised content, or at least one object used by the at least one television content audience member located at the first physical location observing the at least one televised content, wherein the first physical location is different and remote from a second physical location where the at least one televised content is broadcasted, wherein the tracking occurs while the at least one television content audience member is attempting to mimic at least one second physical movement of at least one participant or subject of the at least one televised content or at least one object used by the at least one participant or subject, and wherein the at least one second physical movement is tracked using at least one second 3D camera physically positioned at the second physical location; ii) comparing the at least one first physical movement of the at least one television content audience member of the at least one televised content, or the at least one object used by the at least one television content audience member with the at least one second physical movement of at least one participant or subject of the at least one televised content or at least one object used by the at least one participant or subject; iii) based on the comparing, determining, in a video game scenario, whether the tracked first physical movement of the at least one television content audience member mimics the tracked at least one second physical movement of the at least one participant or subject of the at least one televised content or at least one object used by the at least one participant or subject.
- The method of claim 10, wherein the at least one processor is further configured to transmit or receive information about at least one of the at least one first physical movement or the at least one second physical movement.
- The method of claim 10, wherein the at least one processor is further configured to analyze the at least one televised content, or a recording of the at least one televised content, to determine information about at least one of the at least one first physical movement or the at least one second physical movement.
- The method of claim 10, wherein the at least one televised content comprises at least one live televised sporting event.
- The method of claim 10, wherein the at least one participant or subject of the at least one televised content or the at least one object used by the at least one participant or subject comprises at least one player or at least one piece of sports equipment used by the at least one player.