U.S. Pat. No. 9,873,038

INTERACTIVE ELECTRONIC GAMES BASED ON CHEWING MOTION

Assignee: Perfetti Van Melle Benelux BV

Issue Date: December 11, 2015

Illustrative Figure

Abstract

An interactive game in which one or more game characters are controlled by chewing, or other mouth-related movements of one or more players. The game may be used in conjunction with marketing of a chewable product, and the game may include a feature in which data associated with a particular product may be detected by a device associated with the game to affect play of the game. The data may be in the form of graphics such as a QR code, bar code or other code on the package interior or exterior that may be read by a camera, or may be in other forms, e.g., a code provided on a website, in an e-mail or text message, or in another way, to be entered manually by the player. The game may be played on a mobile electronic device. The game play may comprise one or more actions of the character(s) that are caused by one or more different mouth movements of the player(s). The actions of the characters caused by the mouth movements of the users may comprise actions that are different from and normally unrelated to mouth movements, e.g., walking, running, jumping, swinging, spinning, falling, flying, climbing, or other movements involving travel of the character.

Description

DETAILED DESCRIPTION

An interactive electronic game as described herein may be controlled by the chewing motion of a user. In particular, the electronic mobile device may include one or more cameras positioned to face one or more users and to observe and detect chewing motion by the user(s), which chewing motion is then translated into movement of one or more in-game characters on a display screen of the mobile device. The interactive electronic games described herein provide intuitive game play, requiring little or no instruction and continuous game flow, which may facilitate longer game play sessions by users.

FIG. 1 illustrates a graphical representation of an exemplary electronic mobile device 10 for running interactive games and implementing methods of controlling interactive games as described herein. The mobile device 10 of FIG. 1 includes a processor 12 electrically coupled to a power supply 14, a memory 16, inputs/outputs 18, a display screen 20, a camera 22, and manual controls 24. The processor 12 may be programmable to run an operating system and/or software applications on the mobile device 10, as well as to execute instructions to cause software applications to perform various functions, such as during play of an interactive game stored on the mobile device 10. The power supply 14 may be a battery or battery pack that provides power to the mobile device 10. The memory 16 may be, for example, volatile or non-volatile memory (e.g., a non-transitory storage medium such as a hard drive) capable of storing an operating system and software programs including interactive electronic games. The inputs/outputs 18 may include audio inputs/outputs such as an audio receiver, a microphone, and a speaker usable, for example, during phone calls using the mobile device 10. The inputs/outputs 18 may also include a wireless transmitter and receiver for sending and receiving wireless signals during phone calls and Internet access by the mobile device 10. The manual controls 24 may include one or more keys that may be depressed by the user to control one or more functions of the mobile device 10, such as activating and closing applications, switching between applications, and rebooting the mobile device 10.

The display screen 20 may be, for example, an LCD, TFT-LCD, and/or LED electronic visual display that displays content running on the mobile device 10 to a user. The display 20 may be a touch-screen display or a display that relies solely on input keys. The camera 22 may include one or more cameras incorporated into the mobile device 10. For example, the mobile device may include one "rear-facing" camera 22 positioned such that a user's face is visible to the camera when the mobile device 10 is oriented such that the user faces the display screen 20, and one front-facing camera 22 positioned such that the camera 22 points away from the user's face when the mobile device 10 is oriented such that the display screen 20 is facing away from the user. In a mobile device 10 having a rear-facing camera 22 and a front-facing camera 22, the interactive game 100 may be alternatively or simultaneously played by two users: a first user facing the rear-facing camera 22 and a second user facing the front-facing camera 22.

FIG. 2 illustrates the mobile device 10 displaying on its display screen 20 a screenshot of an interactive game 100 according to one exemplary embodiment. The interactive game 100 generates a window 102 overlaying a portion of the interactive game 100 on the display screen 20. A real-time video feed of the user as captured by the camera 22 is displayed in the window 102, as shown in FIG. 2. Portions of the user's head, face, mouth, lips, and other facial features may be visible in the window 102. Because this example interactive game 100 is controlled by the chewing motion of the user, in one embodiment, only the mouth of the user may be displayed in the window 102.

The interactive game 100, using the camera 22, permits a user to calibrate the user's mouth movement to provide precise control of the interactive game 100 via the user's chewing motion. In the embodiment shown in FIG. 2, prior to initiating play of the interactive game 100, the chewing motion of the user is calibrated via a virtual grid 104 positioned at least in part over a mouth of the user. The virtual grid 104 is movable in real time in response to the chewing motion of the user. The interactive game 100 may include a calibration indicator 105 that is visible to the user and indicates to the user whether the calibration is successful. In the embodiment illustrated in FIG. 2, the calibration indicator 105 may include an area (e.g., "Good") correlating to a successful calibration of the chewing motion of the user relative to the virtual grid 104. It is to be appreciated that while the virtual grid 104 is depicted as being visible to the user in FIG. 2, the virtual grid 104 may optionally be invisible to the user during calibration and/or game play.

For purposes of this application, "chewing motion" is defined as movement of one or more of the lips of a user when the mouth of the user is closed and/or open, and/or movement of areas of the face of the user surrounding the lips, chin, and jaw of the user, generated when chewing or imitating chewing of a chewable product such as food, gum, or the like. For example, game play of an interactive game may comprise one or more actions of the in-game character(s) that are caused by one or more different mouth movements (during actual chewing or during emulation of chewing) of the user(s) playing the game. The actions of the game characters caused by the mouth movements of the users may comprise actions that are different from and normally unrelated to mouth movements, for example, walking, running, jumping, swinging, spinning, falling, flying, climbing, or other movements involving travel of the character.

In the illustrated form, the interactive game 100 detects the chewing motion of the user by determining, via the camera 22, the position of one or more of the mouth, lips, chin, or jaw of the user relative to a position of the virtual grid 104 overlaying at least the mouth of the user. For example, when the mouth of the user is more closed as shown in FIG. 3, the virtual grid 104 has a first height, and when the mouth of the user is more open as shown in FIG. 5, the virtual grid 104 has a second height that is greater than the first height.
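
The grid-height comparison described above can be sketched in a few lines. This is a hypothetical illustration, not the appendix software referenced later in this patent: the `ChewDetector` class, the calibrated closed height, and the 1.25 open/closed threshold are all invented for the example.

```python
# Hypothetical sketch of grid-based chew detection. The threshold value and
# the open/close state machine are assumptions; the patent does not publish
# its detection code.

OPEN_THRESHOLD = 1.25  # grid height, relative to the calibrated closed height

class ChewDetector:
    """Tracks the virtual grid's height per video frame and reports one chew
    for each full open-then-close cycle of the mouth."""

    def __init__(self, closed_height):
        self.closed_height = closed_height  # grid height measured at calibration
        self.mouth_open = False
        self.chew_count = 0

    def update(self, grid_height):
        """Feed one frame's grid height; returns True on a 'chew down' event."""
        ratio = grid_height / self.closed_height
        if not self.mouth_open and ratio >= OPEN_THRESHOLD:
            self.mouth_open = True       # mouth opened (grid at its taller height)
        elif self.mouth_open and ratio < OPEN_THRESHOLD:
            self.mouth_open = False      # mouth closed again: count one chew
            self.chew_count += 1
            return True
        return False

detector = ChewDetector(closed_height=40.0)
events = [detector.update(h) for h in [40, 42, 55, 58, 44, 40, 52, 41]]
# The sequence above contains two open-close cycles, i.e., two chew events.
```

Comparing grid heights against a calibrated baseline, rather than absolute pixel positions, keeps the detection independent of how far the user sits from the camera.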

As discussed above, the interactive game 100 detects chewing motion of the user to control game play. Preferably, the interactive game 100 includes facial recognition features that provide for minimal latency between the chewing motion of the user and the game response. The interactive game 100 may include face tracking software, as described in more detail in reference to FIG. 11, that provides smooth game play in spite of slight occlusion, various light levels, and variable source imagery. In one approach, in a condition where ambient conditions prevent or restrict play of the interactive game 100, the interactive game 100 may pause or shut down and inform the user of one or more issues preventing or restricting game play and a solution to the issue.

FIG. 3 depicts a screenshot of the exemplary interactive game 100 during game play. The interactive game 100 is illustrated in an environment of a 2D vertically scrolling world 106, where the objective is to progress movement of a game or player-controlled character 108 upwards while avoiding "jeopardy" such as a monster 109 or an environmental hazard such as a hole, fire, or water that permits the player character 108 to fall in and perish at the base of the display screen 20. During game play of the interactive game 100, the player character 108 is visible to the user on the display screen 20 of the mobile device 10 as shown in FIGS. 3-5.

The vertically scrolling world 106 of the interactive game 100 includes swing points 110 that allow the game character 108 to move upwards away from the monster 109 by swinging about one swing point 110 and jumping to another swing point 110, as shown in FIGS. 3 and 4. The vertically scrolling world 106 may also include a plurality of collectible items 112 such as coins, food items, gems, or other collectibles that may be acquired by the game character 108 when the game character 108 comes into contact with the collectible items 112 while traveling between the swing points 110. The interactive game 100 may also display a score 114 accumulated by the user during a game play session and a timer 116 indicating the duration of the game play session. The game world 106 may expand as the player character 108 progresses through the interactive game 100, providing a continuous game play flow, which may advantageously encourage users to participate in longer game play sessions and provide motivation to the users to play the interactive game 100 often.

The exemplary interactive game 100 detects, using the camera 22, the chewing motion of the user and causes the player character 108 to perform one or more actions in response to the detected chewing motion of the user. For example, the user may use chewing motion to control swing speed around a swing point 110 and open his or her mouth to disconnect from the swing point 110 to move to the next swing point 110, optionally collecting one or more collectible items 112 on the way. In one embodiment, detection of chewing increases swing speed, i.e., rotation speed, by a predetermined amount, regardless of the rate of chewing. In another embodiment, the interactive game 100 may detect an increase in the speed of the chewing motion of the user and correspondingly increase the speed of rotation of the player character 108 about a swing point 110. Similarly, the interactive game 100 may detect a decrease in the speed of the chewing motion of the user and correspondingly decrease the speed of rotation of the player character 108 about a swing point 110. The chewing motion of the user may also be used to control other in-game features of the interactive game 100, including, for example, navigating grind rails and alerting non-player characters such as additional monsters and the like.
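
The rate-sensitive embodiment, in which swing speed rises and falls with the speed of the chewing motion, might be implemented along these lines. The base speed, the per-chew increment, and the two-second measuring window are illustrative assumptions only:

```python
# Illustrative sketch (not the patented implementation) of mapping the user's
# chew rate to the player character's rotation speed about a swing point.

BASE_SPEED = 90.0         # degrees/second with no chewing (assumed value)
SPEED_PER_CHEW_HZ = 60.0  # extra degrees/second per chew-per-second (assumed)

def swing_speed(chew_timestamps, window=2.0, now=None):
    """Estimate chews-per-second over the last `window` seconds and
    translate that rate into a rotation speed."""
    if now is None:
        now = chew_timestamps[-1] if chew_timestamps else 0.0
    recent = [t for t in chew_timestamps if now - t <= window]
    chew_rate = len(recent) / window  # chews per second
    return BASE_SPEED + SPEED_PER_CHEW_HZ * chew_rate

# Faster chewing yields faster rotation, slower chewing slower rotation:
slow = swing_speed([0.0, 1.0], now=2.0)                  # 1 chew/s
fast = swing_speed([0.0, 0.5, 1.0, 1.5, 2.0], now=2.0)   # 2.5 chews/s
```

The fixed-increment embodiment described first would instead add a constant amount per detected chew, ignoring `chew_rate` entirely.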

Each game play session of the interactive game 100 may include a point scoring system combined with point multipliers to advantageously provide for point goal setting by the users, which may extend the long-term playability of the interactive game 100. For example, game play scores may be based on accumulating the collectibles 112 located throughout each level of the game world 106 and/or based on the distance progressed through a level or levels of the interactive game 100. Multipliers may allow the users to gain higher scores over a number of gaming sessions, thereby permitting the users to progress further in the interactive game 100.
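
One way to combine collectible points, distance, and a multiplier into a session score is sketched below; the point values and the multiplier rule are invented for illustration, as the patent leaves them unspecified:

```python
# Hypothetical scoring sketch. The per-collectible values, the distance rate,
# and the "multiply the whole base" rule are assumptions for illustration.

COLLECTIBLE_POINTS = {"coin": 10, "gem": 50, "food": 25}
POINTS_PER_METER = 1

def session_score(collected, distance_m, multiplier=1.0):
    """Score = (collectible points + distance points) * session multiplier."""
    base = sum(COLLECTIBLE_POINTS.get(item, 0) for item in collected)
    base += POINTS_PER_METER * distance_m
    return int(base * multiplier)

# A session with two coins, one gem, 120 m climbed, and a 2x multiplier:
score = session_score(["coin", "coin", "gem"], distance_m=120, multiplier=2.0)
```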

User scores beyond a certain predetermined number, for example, above 100,000, may be displayed in association with an alias or nickname of the user as "badges" and/or "medals." The "badges" and/or "medals" may be stored on the mobile device 10, or on a remote server including a social game center (e.g., Apple Game Center), where the "badges" and "medals" may be displayed to other users of the interactive game 100. A multiplayer game center may include a leader board indicating the highest scores achieved by the users when playing the interactive game 100 and/or the largest distances traveled by player characters 108 controlled by the users of the interactive game 100. Such display of scores and accomplishments of the users in a social game center or other social media may promote multiplayer competition.

At the end of each game play session, when the user loses by permitting the player character 108 to fall into the mouth of the monster 109 pursuing the player character 108, the interactive game 100 permits the user, using the camera 22 of the mobile device 10, to snap a photographic image 118 of at least the face of the user. In the embodiment shown in FIG. 6, the image 118 is combined with artwork from the interactive game 100 to produce a personalized graphic 120, such as an image of the face and/or head of the user inside the mouth of the monster 109, optionally including the score 114 accumulated by the user during the game session. The interactive game 100 may also permit the mobile device 10 to save this graphic image 120 to the memory 16 of the mobile device, for example, to a "camera roll" on a hard drive as provided by mobile devices 10 such as iPhones and iPads.

As shown in FIG. 7, the interactive game 100 permits the user to share the graphic 120 with one or more recipients via email 122, text messaging, and/or one or more social media websites, such as Facebook, Twitter, Instagram, or the like. The sharing of the graphic 120 by the user with other recipients via email, text messaging, and/or social media websites may attract an increasing number of users to play the interactive game 100 while promoting a consumer product, such as chewing gum, associated with the interactive game 100 to more and more users.

As referenced in the preceding paragraph, the interactive game 100 may be associated with one or more consumer products such as chewing gum to advertise, market, and promote sales of such consumer products. For example, the interactive game 100 may be used in conjunction with marketing of a chewable product such as gum, candy, or the like. As described in more detail below, the interactive game 100 may include a feature in which data associated with a particular consumer product may be detected by a device associated with the interactive game 100 to affect play of the game. The data may be in the form of graphics such as a QR code, bar code, or other code displayed on an exterior or interior of a package containing the consumer product, or on a wrapper of the consumer product itself, such that a camera of a hand-held electronic device (e.g., the camera 22 of the mobile device 10) having the interactive game 100 installed may be used to read the QR code, bar code, or other code. Alternatively, coded data (e.g., a number, or a combination of letters and numbers) may be provided to a user of the hand-held electronic device on a website, in an e-mail or text message, or in another way, to be entered manually into the interactive game 100 by the user.

For example, with reference to FIGS. 8 and 9, a chewing gum product may be associated with the interactive game 100 by including a coded image 202 on its exterior or interior packaging 200. The coded image 202 may include words recognizable by and visible to a user, as shown in FIGS. 8 and 9. Alternatively, the coded image 202 may include symbols that may not be read by a user (e.g., a bar code), or symbols that are not visible to the user. The coded image 202, when photographed by the camera 22 of the mobile device 10 having the interactive game 100 installed thereon, and uploaded into the interactive game 100, for example, as a patch, provides new game content 204, as shown in FIG. 10, that may enhance game play of the interactive game 100.

For example, with the interactive game 100 running on the mobile device 10, the user may select an option to photograph and/or scan a coded image 202 on an exterior of the package 200 using the camera 22 of the mobile device 10, as shown in FIG. 8. The coded image 202 photographed and/or scanned by the camera 22 is then uploaded to the mobile device 10 and incorporated into the interactive game 100, as shown in FIG. 9, as data that provides additional content such as, for example, game upgrades or patches as shown in FIG. 10. The interactive game 100 may be used to facilitate marketing of one or more identical or different chewable consumer products. For example, one coded image representing a patch for the interactive game 100 may be sold on one package of one type of a chewable consumer product (e.g., one type of gum), another coded image representing a different patch for the interactive game 100 may be sold on another package of the same consumer product, and yet another coded image representing yet another patch for the interactive game 100 may be sold on packaging of a different consumer product (e.g., another type of gum or a chewable product other than gum). This way, scanning different coded images on different consumer product packages using the camera 22 of the mobile device 10 provides users with different game patches affecting game play of the interactive game 100 in different ways depending on which product package is scanned and which coded image is detected thereon.

An upgrade or patch acquired as a result of buying and scanning a pack of gum associated with the interactive game 100 may, for example, increase points and/or score multipliers associated with certain in-game actions or achievements, increase the speed of the player character 108 or the monster 109, create a wind that assists the player character 108 in moving upwards away from the monster 109, slow down the monster 109, provide the user with a second chance after the player character 108 is swallowed by the monster 109, or provide any other suitable upgrades that may enhance the game play. The incorporation into gum packs 200 of coded images and/or symbols decodable by the mobile device 10 as patches and/or upgrades for the interactive game 100 may facilitate users' interest in the interactive game 100 and promote sales of the gum packs 200.
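
A minimal sketch of how scanned package codes could select among such patches is shown below. The code strings, the table format, and the patch effects are all hypothetical; the patent does not define a data format for the coded images:

```python
# Hypothetical mapping from decoded package codes to game patches. The codes
# and effects below are invented for illustration only.

PATCH_TABLE = {
    "GUM-SPEARMINT-01": {"score_multiplier": 2.0},   # doubles scoring
    "GUM-BERRY-02":     {"monster_slowdown": 0.5},   # halves monster speed
    "CANDY-SOUR-03":    {"second_chance": True},     # revive after being swallowed
}

def apply_patch(game_state, scanned_code):
    """Look up a scanned code and merge its patch into the game state.
    Unknown codes leave the state unchanged."""
    patch = PATCH_TABLE.get(scanned_code)
    if patch is None:
        return game_state
    return {**game_state, **patch}

state = {"score_multiplier": 1.0}
state = apply_patch(state, "GUM-SPEARMINT-01")
```

Keeping each product's patch as plain data makes it straightforward to ship different coded images on different packages, as the paragraph above describes.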

The mobile device 10 may be a cellular phone (e.g., an iPhone or the like), a tablet computer (e.g., an iPad or the like), a hand-held video game console, or the like. The mobile device 10 may be a smart phone, such as a touch-screen cellular phone, or a conventional cellular phone that relies solely on input keys. The mobile device 10 may run on any operating system, for example, Apple iOS, Android, or the like. While a specific interactive electronic game 100 has been described above with reference to FIGS. 2-10, it will be appreciated that the interactive game 100 may be in the form of any interactive electronic game appropriate for control by chewing motion of a user, such as an interactive electronic game in which a player character or any other character moves or otherwise performs an action responsive to chewing motion by the user of the interactive game.

An exemplary method 300 of detecting chewing motion of the user while calibrating and/or playing the interactive game 100 is illustrated in FIG. 11. In Step 302 of the method 300, a face detector component of the interactive game 100 according to one embodiment searches for a human face in a video feed generated by the camera 22 of the mobile device 10 when facing a user. Such a video feed may be displayed in the window 102 of the interactive game as shown in FIG. 2.

In Step 304, a point tracker component of the interactive game 100 selects one or more locations of the face of the user within a video frame and chooses key points around the face of the user to track. As the face of the user moves, the points follow the face and adjust the region of interest used to detect a chew. Part of the process indicated in Step 302 may be performed on a graphics processing unit (GPU) of the mobile device 10.

In Step 306, a chew detector component of the interactive game 100 may take several strips of pixels between the bottom of the nose and the chin of the user, based on region-of-interest data provided by the point tracker. The values of these pixels are stored, for example, in the memory 16 of the mobile device 10, and compared to the values in subsequent video frames. By comparing the color values of the pixels, it may be possible for the interactive game 100 to detect a trend in the movement of the pixels and, consequently, a trend in the movement of the mouth of the user. For example, when the bottom jaw of the user is detected by the camera 22 to be moving upwards, such as during a motion to close the mouth of the user, a 'chew down' event is triggered. Similarly to Step 302, part of Step 304 may be performed on the GPU of the mobile device 10.
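
The strip comparison in Step 306 can be illustrated with a simple shift search over one strip of row intensities. This is an assumption about one possible technique (the actual appendix software is not reproduced in this text): treat the strip as a 1-D profile, find the vertical offset that best aligns consecutive frames, and trigger a 'chew down' event on an upward shift.

```python
# Hypothetical illustration of comparing a pixel strip between frames to
# detect jaw motion. The alignment-by-shift-search approach is an assumption.

def estimate_vertical_shift(prev_strip, curr_strip, max_shift=5):
    """Estimate how many rows the content of a vertical strip (a list of row
    intensities between nose and chin; index 0 = top) moved between frames.
    A negative result means the content moved up (jaw closing)."""
    n = len(prev_strip)
    best_shift, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, s), min(n, n + s)
        if hi - lo < n // 2:
            continue  # too little overlap to compare reliably
        err = sum(abs(curr_strip[i] - prev_strip[i - s])
                  for i in range(lo, hi)) / (hi - lo)
        if err < best_err:
            best_err, best_shift = err, s
    return best_shift

def chew_down(prev_strip, curr_strip):
    """Trigger a 'chew down' event when the strip content trends upward."""
    return estimate_vertical_shift(prev_strip, curr_strip) < 0

# Row intensities with a bright jaw edge at row 4, then the same edge two rows
# higher in the next frame (as when the bottom jaw moves up to close the mouth):
jaw = [0, 0, 0, 10, 50, 90, 90, 90, 90, 90]
raised = jaw[2:] + [90, 90]
```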

In Step 308, a re-track trigger component of the interactive game 100 may, when a 'chew down' action is detected, restart the sequence all over again to ensure that the incoming data is as accurate as possible. A re-track may also be triggered if a face of the user cannot be found after a predetermined short period of time (e.g., 1 second, 2 seconds, 3 seconds, 5 seconds, 10 seconds, or longer) since the last 'chew down' action was seen by the camera 22 of the mobile device 10.
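
The re-track logic of Step 308 reduces to a small state machine. The 3-second timeout below is one of the example periods listed above; the rest of the structure is an assumed sketch:

```python
# Assumed sketch of the Step 308 re-track trigger: restart tracking after each
# chew-down, or when the face has been lost for longer than a timeout.

RETRACK_TIMEOUT = 3.0  # seconds without a chew before forcing a re-track

class RetrackTrigger:
    def __init__(self):
        self.last_chew_time = 0.0

    def should_retrack(self, now, chew_down_detected, face_found):
        if chew_down_detected:
            self.last_chew_time = now
            return True   # restart the pipeline to keep incoming data accurate
        if not face_found and now - self.last_chew_time > RETRACK_TIMEOUT:
            return True   # face lost too long: re-run face detection (Step 302)
        return False

trigger = RetrackTrigger()
a = trigger.should_retrack(1.0, chew_down_detected=True, face_found=True)
b = trigger.should_retrack(2.0, chew_down_detected=False, face_found=False)
c = trigger.should_retrack(5.0, chew_down_detected=False, face_found=False)
```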

The method may include displaying various promotional, instructional or other messages such as an “Insert Gum” message prior to initiation of game play. The method may also include providing an outline of a face in a window displaying the player's face to facilitate a player's alignment of an image of his or her face in a proper position in the display. When the player's face is properly aligned with the outline, a message such as “Let's go!” may be displayed, and calibration and/or game play may be immediately initiated. Alternatively, the game may provide the user with the opportunity to initiate play, e.g., by awaiting detection of a user input such as a chewing motion or opening of the user's mouth, and may prompt the user to provide the requisite input, e.g., by displaying a message such as “Chew once to initiate play.”

If the player's face moves relative to the camera thereafter such that the player's face is out of proper position, the game may pause and display one or more messages to instruct the player on how to correct the problem, such as “Can't see you” or “Off Center” or “Too Far” or “Too Close” and/or “Move your face to the outline” or “Keep steady.” These messages or similar messages may also be displayed prior to initial detection of the player's face in proper position. Again, once proper position of the player's face is achieved, the game may start immediately, or may provide the user with the opportunity to control resumption of play, e.g., by awaiting detection of a user input such as a chewing motion or opening of the user's mouth, and may prompt the user to provide the requisite input, e.g., by displaying a message such as “Chew once to initiate play.”

When certain thresholds are achieved, e.g., certain point levels are reached, or when a certain position in the field of play is reached, the game may recognize the achievement by providing, e.g., the opportunity for the player to unlock a feature of the game, and may notify the player by displaying a written phrase such as “New Unlock Available.”

As alternatives to visible display of words and phrases as mentioned above, the game may provide other visual cues such as icons, emoticons, pictures, etc., alone or in combination with words or phrases, and/or audible cues. The game may have alternative methods of play, e.g., the game may provide the player with the option to control play by tapping the screen instead of providing inputs detected by the camera, and may provide for premiums such as increased scoring opportunities, or a points multiplier, for play in one mode or the other, e.g. providing for double points for achievements in the mode in which game play is controlled by camera inputs.

Certain functions of the game, such as detection of chewing or other mouth movements may be implemented by software such as that referred to above and filed as an appendix to this patent application.

The methods described herein advantageously provide users of mobile devices with the ability to control play of interactive electronic games installed on the mobile devices simply by detecting chewing motion of the user and translating such chewing motion into movement of a player character in the interactive game. Further, interactive games as described above advantageously provide users with the ability to photograph and/or scan packaging of consumer products associated with the interactive game to facilitate advertising and sales of such consumer products.

Claims

  1. A method of controlling an interactive game on a display screen of a mobile electronic device, the method comprising: displaying on the display screen of the mobile electronic device the interactive game including a character controllable by a user of the mobile electronic device;detecting, using a camera, chewing motion of the user;and causing the character to perform a first action on the display screen in response to the chewing motion of the user;wherein the displaying the interactive game includes displaying a window overlaying a portion of the interactive game on the display screen and displaying in the window a real time video of at least a mouth of the user captured by the camera.
  1. The method of claim 1 , further comprising displaying in the window a virtual grid overlaying at least the mouth of the user displayed in the window;and modifying the virtual grid in real time in response to chewing motion of the user.
  2. A method of controlling an interactive game on a display screen of a mobile electronic device, the method comprising: displaying on the display screen of the mobile electronic device the interactive game including a character controllable by a user of the mobile electronic device;detecting, using a camera, chewing motion of the user;and causing the character to perform a first action on the display screen in response to the chewing motion of the user;wherein the detecting chewing motion of the user includes detecting the chewing motion of the user by determining, via the camera, position of at least one of chin, jaw, lips, and mouth of the user relative to a position of a virtual grid overlaying at least the mouth of the user.
  3. The method of claim 3 , further comprising calibrating the chewing motion of the user relative to the virtual grid prior to initiating start of the interactive game.
  4. A method of controlling an interactive game on a display screen of a mobile electronic device, the method comprising: displaying on the display screen of the mobile electronic device the interactive game including a character controllable by a user of the mobile electronic device;detecting, using a camera, chewing motion of the user;and causing the character to perform a first action on the display screen in response to the chewing motion of the user;wherein the detecting the chewing motion of the user includes detecting a speed of the chewing motion of the user and causing the character on the display screen to move faster in response to detecting an increase in the speed of the chewing motion and to move slower in response to detecting a decrease in the speed of the chewing motion.
  5. A method of controlling an interactive game on a display screen of a mobile electronic device, the method comprising: displaying on the display screen of the mobile electronic device the interactive game including a character controllable by a user of the mobile electronic device;detecting, using a camera, chewing motion of the user;and causing the character to perform a first action on the display screen in response to the chewing motion of the user;further comprising detecting opening of a mouth of the user, and causing the virtual character to perform a second action in the interactive game different from the first action.
  6. A method of controlling an interactive game on a display screen of a mobile electronic device, the method comprising: displaying on the display screen of the mobile electronic device the interactive game including a character controllable by a user of the mobile electronic device;detecting, using a camera, chewing motion of the user;and causing the character to perform a first action on the display screen in response to the chewing motion of the user;further comprising taking a photograph of at least a face of the user when a session of the video game ends, and incorporating at least a portion of the photograph into a graphic of the interactive game including a score accumulated by the user while playing the interactive game, and further comprising permitting the user to share the graphic with one or more people via at least one of an electronic mail, a text message, and a social media website.
  7. A method of controlling an interactive game on a display screen of a mobile electronic device, the method comprising: displaying on the display screen of the mobile electronic device the interactive game including a character controllable by a user of the mobile electronic device;detecting, using a camera, chewing motion of the user;and causing the character to perform a first action on the display screen in response to the chewing motion of the user;further comprising scanning an image on a packaging of a consumer product using the mobile electronic device and incorporating data retrieved in response to the scanning of the image into the interactive game to affect game play of the interactive game.
  8. The method of claim 7 , wherein the mobile electronic device is at least one of a mobile phone, a tablet computer, and a hand-held video game console.
  9. An apparatus configured to allow play of an interactive game, the apparatus comprising:
a display screen;
a camera;
a memory storing instructions relating to the interactive game; and
a processing device in operative communication with the display screen, the camera, and the memory, the processing device configured to access the instructions to effect:
displaying on the display screen the interactive game including a character controllable by a user of the electronic device;
displaying a window overlaying a portion of the interactive game on the display screen to show an image of at least a mouth of the user captured by the camera;
displaying in the window a virtual grid overlaying at least the mouth of the user and movable in real time in response to chewing motion of the mouth of the user;
detecting the chewing motion of the user by determining, via the camera, position of at least one of a chin, a jaw, a lip, and a mouth of the user relative to the virtual grid; and
causing the character to perform a first action on the display screen in response to the chewing motion by the user.
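Claim 9 detects chewing by tracking the position of the chin (or jaw, lip, or mouth) relative to a virtual grid overlaid on the camera image. One plausible reading is that a chew registers when the chin landmark drops past its resting grid row and returns. The sketch below illustrates that interpretation; the grid parameters, function names, and one-cycle threshold are assumptions for illustration, not details from the patent.

```python
def grid_row(y: float, grid_top: float, cell_height: float) -> int:
    """Map a landmark's vertical position to a row of the virtual grid."""
    return int((y - grid_top) // cell_height)

def detect_chew(chin_ys, grid_top=0.0, cell_height=10.0, min_cycles=1):
    """Detect chewing from a sequence of chin y-positions (pixels, top-down).

    A chew cycle is counted when the chin moves down past its resting
    grid row (mouth open) and back up to it (mouth closed).
    """
    rows = [grid_row(y, grid_top, cell_height) for y in chin_ys]
    baseline = rows[0]          # resting row from the first frame
    cycles, opened = 0, False
    for r in rows[1:]:
        if r > baseline:        # jaw dropped below resting row
            opened = True
        elif opened and r <= baseline:  # returned: one complete chew
            cycles += 1
            opened = False
    return cycles >= min_cycles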
  10. The apparatus of claim 9 , wherein the processing device is configured to effect detecting opening of the mouth of the user and causing the character to perform a second action different from the first action on the display screen.
  11. The apparatus of claim 10 , wherein the processing device is configured to effect using the camera to take a photograph of at least a face of the user when a session of the interactive game ends and incorporate at least a portion of the photograph into a graphic of the interactive game including a score accumulated by the user while playing the interactive game, and wherein the processing device is configured to effect using the camera to scan an image on a packaging of a consumer product and incorporating data retrieved in response to the scanning of the image into the interactive game to affect game play of the interactive game.
  12. A non-transitory computer readable storage medium for storing instructions that, in response to execution by a processor, cause the processor to perform operations for displaying and controlling an interactive game on an electronic device, the operations comprising:
displaying on a display screen of the electronic device the interactive game including a character controllable by a user of the electronic device;
detecting, using a camera, chewing motion of the user; and
causing the character to perform a first action on the display screen in response to the chewing motion of the user;
further comprising instructions that, in response to execution by the processor, cause the processor to perform further operations comprising detecting opening of a mouth of the user and causing the character to perform a second action different from the first action on the display screen.
  13. A non-transitory computer readable storage medium for storing instructions that, in response to execution by a processor, cause the processor to perform operations for displaying and controlling an interactive game on an electronic device, the operations comprising:
displaying on a display screen of the electronic device the interactive game including a character controllable by a user of the electronic device;
detecting, using a camera, chewing motion of the user; and
causing the character to perform a first action on the display screen in response to the chewing motion of the user;
further comprising instructions that, in response to execution by the processor, cause the processor to perform further operations comprising taking a photograph of at least a face of the user when a session of the interactive game ends and incorporating at least a portion of the photograph into a graphic of the interactive game including a score accumulated by the user while playing the interactive game, and further comprising instructions that, in response to execution by the processor, cause the processor to perform further operations comprising scanning an image on a packaging of a consumer product and incorporating data retrieved in response to the scanning of the image into the interactive game to affect game play of the interactive game.
  14. A method of interacting with customers of a chewable product through a game, comprising:
marketing a plurality of different chewable products to customers; and
providing a game that is playable on a hand-held device that includes a camera and a display screen;
displaying on the display screen an image of a game character that is capable of travel relative to other images on the display screen;
processing input from the camera to detect a chewing motion of a mouth of a player;
in response to detection of the chewing motion, affecting travel of the game character displayed on the screen such that game play is controlled at least partially by detection of the chewing motion.
  15. The method of claim 14 further comprising detecting packaging of the plurality of different chewable products with the camera, and affecting the game play in response to the detection of such packaging in different ways depending on which of the plurality of different chewable products is detected.
  16. The method of claim 15 wherein the affecting travel of the game character comprises accelerating movement of the game character relative to images in a 2D vertically scrolling world in response to the player's chewing.
  17. The method of claim 15 , wherein the processing input from the camera comprises:
detecting a nose and chin in a first frame using software running on a CPU;
detecting a region darker than adjacent regions between the mouth and chin in subsequent frames using software running on a graphics processing unit; and
comparing changes in and adjacent to the region from frame to frame by analysis of individual pixels using a graphics processing unit.
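Claim 17 pins the detection pipeline down further: locate the nose and chin once on the CPU, then on the GPU watch the region between mouth and chin, where an open mouth appears as a patch darker than its surroundings, and compare it pixel by pixel across frames. The sketch below shows the per-pixel dark-region comparison on plain Python lists standing in for grayscale frames; the threshold values, region format, and function names are illustrative assumptions, and the GPU offload of the claim is not modeled.

```python
def dark_region_ratio(gray_frame, region, threshold=60):
    """Fraction of pixels in `region` darker than `threshold`.

    gray_frame: 2D list of 0-255 intensities (stand-in for a camera frame).
    region: (top, left, bottom, right) bounding the area between mouth and chin,
            as located from the nose/chin landmarks found in the first frame.
    """
    top, left, bottom, right = region
    pixels = [gray_frame[r][c] for r in range(top, bottom)
              for c in range(left, right)]
    dark = sum(1 for p in pixels if p < threshold)
    return dark / len(pixels)

def mouth_opened(prev_frame, curr_frame, region, delta=0.2):
    """Frame-to-frame comparison: a jump in the dark-pixel ratio inside the
    mouth-to-chin region suggests the mouth has opened."""
    return dark_region_ratio(curr_frame, region) - dark_region_ratio(prev_frame, region) > delta
```

Thresholding on the ratio difference between frames, rather than on absolute darkness, makes the check somewhat robust to overall lighting, which is presumably why the claim compares changes from frame to frame rather than a single frame in isolation.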
