U.S. Patent No. 11,068,042: Detecting and Responding to an Event within an Interactive Videogame
Assignee: Roku, Inc.
Issued: July 7, 2021
Filed: August 26, 2020 (claiming priority to March 12, 2013)
Oh boy, I wish I could get spam in a video game, too….
Overview:
U.S. Patent No. 11,068,042 (the ’042 patent) relates to detecting an event within an interactive videogame and displaying a notification associated with that event. The ’042 patent details a system that monitors play of an interactive videogame, searching for matches between reference identifiers and the gameplay to detect the occurrence of events. When an event is detected, a notification referencing its occurrence is presented. Notifications could be sent to another user socially connected with the player, and could indicate level completion, player defeat, or other events. A notification could also offer the player game help, in-game purchases, or real-world purchases, and contain a hyperlink to that information. The ’042 patent could thus offer players of interactive games notifications tied to events as they occur. Commonplace today, but the patent claims priority back to 2013, when it may not have been so common.
Abstract:
As a user is being presented with interactive media by a presenting device, a separate monitoring device may be used to monitor the presentation of the interactive media and detect an event that occurs therein. Such a monitoring device may be configured and positioned to access media content from the presentation of the interactive media. For example, the monitoring device may be configured and positioned to record video content with a camera and record audio content with a microphone. Having accessed this media content, the monitoring device may generate an identifier, such as a fingerprint or watermark, of the media content and compare the generated identifier with a reference identifier that is generated from the source of the media content. Based on the generated identifier matching the reference identifier, the monitoring device may detect that an event has occurred within the interactive media presentation and present a corresponding notification.
Illustrative Claim:
- A method comprising: detecting, by a machine, an occurrence of an event within an interactive videogame, wherein detecting the occurrence of the event within the interactive videogame comprises detecting multiple matches each between a respective reference identifier of the interactive videogame (“reference identifier”) and a respective identifier established from presentation by a device of the interactive videogame (“established identifier”), wherein detecting the multiple matches includes detecting that a first reference identifier matches a second established identifier and detecting that a third reference identifier matches a fourth established identifier, and wherein each identifier is selected from the group consisting of a fingerprint generated from the interactive videogame and a watermark extracted from the interactive videogame; and responsive to at least detecting the occurrence of the event within the interactive videogame, causing presentation of a notification that references the occurrence of the event.
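The claim's core logic, detecting an event only when multiple (reference identifier, established identifier) pairs match, can be sketched in a few lines. This is an illustrative reading of the claim language, not an implementation from the patent; the identifier strings and match threshold are invented for the example.

```python
# Hypothetical sketch of the claim's "multiple matches" requirement: each
# identifier is either a fingerprint ("fp:...") or a watermark ("wm:..."),
# and an event is detected only when at least `required_matches` reference
# identifiers match identifiers established from the game's presentation.

def detect_event(reference_ids, established_ids, required_matches=2):
    """Return True when enough reference identifiers match established ones."""
    matches = sum(1 for ref in reference_ids if ref in established_ids)
    return matches >= required_matches

# Two identifiers established from gameplay match two references -> detected.
refs = {"fp:level-complete", "wm:defeat-jingle", "fp:boss-intro"}
observed = {"fp:level-complete", "wm:defeat-jingle"}
```

A single match (e.g., only `"fp:boss-intro"`) would not satisfy the claim's "multiple matches" language, so `detect_event` would return `False`.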
Illustrative Figure:
Description
DETAILED DESCRIPTION
Example methods and systems are directed to detection of one or more events within interactive media (e.g., within a presentation of interactive media). Examples merely typify possible variations. Unless explicitly stated otherwise, components and functions are optional and may be combined or subdivided, and operations may vary in sequence or be combined or subdivided. In the following description, for purposes of explanation, numerous specific details are set forth to provide a thorough understanding of example embodiments. It will be evident to one skilled in the art, however, that the present subject matter may be practiced without these specific details.
A device (e.g., a computer, game console, or mobile device) may be used to present interactive media (e.g., a game, such as a videogame) to a user of the device (e.g., a player of the game). The interactive media may be stored on the device (e.g., in local memory or other storage) and presented by the device (e.g., by executing a software application, applet, or app). In other situations, the interactive media may be stored by a server (e.g., a game server) and provided to the device by the server (e.g., streamed live, downloaded portion by portion, or downloaded in full) for presentation by the device. The interactive media may include media files that each store media content (e.g., video content, image content, or audio content), and a presentation of the interactive media may be generated, presented, or both, by the device based on user input that influences or controls which media files are included in the presentation. That is, the user input may fully or partially determine whether and when a particular media file is included in the presentation.
As the user is being presented with the interactive media by a presenting device (e.g., first device), a monitoring device (e.g., second device) may be used to monitor the presentation of the interactive media and detect an event that occurs therein. Such a monitoring device may be configured and positioned to access media content from the presentation of the interactive media. For example, the monitoring device may be configured and positioned to record video content (e.g., one or more video frames, which may be still images) with a camera and record audio content with a microphone. Having accessed this media content, the monitoring device may generate an identifier, such as a fingerprint or watermark, of the media content and compare the generated identifier with a reference identifier that is generated from the source of the media content. Based on the generated identifier matching the reference identifier, the monitoring device may detect that an event has occurred within the interactive media presentation and present a corresponding notification.
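The compare-generated-identifier-to-reference step described above can be illustrated with a toy fingerprint. This is not the patent's algorithm (the patent does not disclose a specific one here); it is a minimal sketch, assuming a bit-string fingerprint compared by Hamming distance, with all names and thresholds invented.

```python
# Toy fingerprint: one bit per adjacent-sample comparison of captured
# audio/video samples, compared to a reference by Hamming distance.
# A small distance tolerance lets noisy captures still match.

def fingerprint(samples, bits=16):
    """Derive a compact bit-string fingerprint from a list of samples."""
    fp = 0
    for i in range(min(bits, len(samples) - 1)):
        fp = (fp << 1) | (1 if samples[i + 1] > samples[i] else 0)
    return fp

def matches(fp_a, fp_b, max_distance=2):
    """Treat two fingerprints as a match if few bits differ."""
    return bin(fp_a ^ fp_b).count("1") <= max_distance

reference = fingerprint([3, 5, 2, 8, 7, 9])   # from the media source
captured = fingerprint([3, 5, 2, 8, 9, 7])    # from a slightly noisy capture
```

Here `matches(reference, captured)` is true because the noisy capture flips only two comparison bits, while a fingerprint of unrelated content would fall outside the distance tolerance.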
Accordingly, the monitoring device may present a notification that references the occurrence of the detected event. Such a notification may be presented to the user (e.g., via the monitoring device, the presenting device, or both). The notification may be presented to another user (e.g., a socially connected friend, follower, or connection of the user, as identified by or according to a social networking system). In some example embodiments, the notification is presented by the monitoring device. However, in alternative example embodiments, the monitoring device may cause the presenting device (e.g., the device that presents the interactive media) to present the notification. In some example embodiments, the presenting device and the monitoring device are combined into a single device.
Moreover, the monitoring device may function entirely independent of any server or other source that may be providing the interactive media presentation to the presentation device. That is, the monitoring device may detect an event within the interactive media presentation and present a corresponding notification without communication from such a server or other source of the interactive media presentation. Further details are described below.
FIG. 1 is a network diagram illustrating a network environment 100 suitable for detecting an event within interactive media, according to some example embodiments. The network environment 100 includes a reference server 110, a database 115, a social network server 118, an interactive media presentation server 120, and devices 130, 131, and 150, all of which may be communicatively coupled to each other via a network 190. In some example embodiments, the interactive media presentation server 120 is communicatively coupled to the device 130 by a separate network or other communication path. The reference server 110, the database 115, the social network server 118, the interactive media presentation server 120, and the devices 130, 131, and 150 may each be implemented in a computer system, in whole or in part, as described below with respect to FIG. 10. As shown in FIG. 1, the reference server 110, with or without the database 115, may form all or part of a network-based system 105. For example, the network-based system 105 may be or include a cloud-based system that provides one or more network-based services (e.g., provision of reference identifiers for media content included as part of various interactive media).
Also shown in FIG. 1 are users 132 and 152. One or both of the users 132 and 152 may be a human user (e.g., a human being), a machine user (e.g., a computer configured by a software program to interact with the device 130), or any suitable combination thereof (e.g., a human assisted by a machine or a machine supervised by a human). The user 132 is not part of the network environment 100, but is associated with the device 130 and may be a user of the device 130. For example, the device 130 may be a desktop computer, a vehicle computer, a tablet computer, a navigational device, a portable media device, or a smart phone belonging to the user 132.
The user 132 may also be associated with the device 131 and may be a user of the device 131. For example, the device 131 may be a desktop computer, a vehicle computer, a tablet computer, a navigational device, a portable media device, or a smart phone belonging to the user 132. As shown in FIG. 1, the device 131 is able to monitor (e.g., by accessing or receiving) media content presented as part of an interactive media presentation by the device 130. In certain example embodiments, the device 131 and the device 130 are combined into a single device. In such example embodiments, the monitoring of the media content may be performed internally by the single device (e.g., within memory).
Likewise, the user 152 is not part of the network environment 100, but is associated with the device 150. According to various example embodiments, the user 152 is a socially connected friend, follower, or connection of the user 132 (e.g., as identified or indicated by a social networking service, such as Facebook® or Twitter®). As an example, the device 150 may be a desktop computer, a vehicle computer, a tablet computer, a navigational device, a portable media device, or a smart phone belonging to the user 152.
Any of the machines, databases, or devices shown in FIG. 1 may be implemented in a general-purpose computer modified (e.g., configured or programmed) by software to be a special-purpose computer to perform the functions described herein for that machine, database, or device. For example, a computer system able to implement any one or more of the methodologies described herein is discussed below with respect to FIG. 10. As used herein, a “database” is a data storage resource and may store data structured as a text file, a table, a spreadsheet, a relational database (e.g., an object-relational database), a triple store, a hierarchical data store, or any suitable combination thereof. Moreover, any two or more of the machines, databases, or devices illustrated in FIG. 1 may be combined into a single machine, and the functions described herein for any single machine, database, or device may be subdivided among multiple machines, databases, or devices.
The network 190 may be any network that enables communication between or among machines, databases, and devices (e.g., the reference server 110 and the device 131). Accordingly, the network 190 may be a wired network, a wireless network (e.g., a mobile or cellular network), or any suitable combination thereof. The network 190 may include one or more portions that constitute a private network, a public network (e.g., the Internet), or any suitable combination thereof.
FIG. 2 is a block diagram illustrating components of the reference server 110, which may be configured to facilitate detection of an event within interactive media, according to some example embodiments. The reference server 110 may be a machine that, as shown, includes a media access module 210, a fingerprint generation module 220, a watermark extraction module 230, and a notification correlation module 240, all configured to communicate with each other (e.g., via a bus, shared memory, or a switch).
Any one or more of the modules described herein may be implemented using hardware (e.g., a processor of a machine) or a combination of hardware and software. For example, any module described herein may configure a processor to perform the operations described herein for that module. Moreover, any two or more of these modules may be combined into a single module, and the functions described herein for a single module may be subdivided among multiple modules. Furthermore, according to various example embodiments, modules described herein as being implemented within a single machine, database, or device may be distributed across multiple machines, databases, or devices.
The media access module 210 of the reference server 110 is configured to access media (e.g., from the database 115, from the interactive media presentation server 120, or from both). The accessed media may include one or more media files containing media content that may be presentable as part of the interactive media presentation. For example, the interactive media presentation server 120 may store such media files, and the media access module 210 of the reference server 110 may access or retrieve those media files. The database 115 may be used by the media access module 210 to temporarily or permanently store the media files (e.g., for fingerprint generation, watermark extraction, or both).
The fingerprint generation module 220 of the reference server 110 is configured to generate reference fingerprints from the media files that are accessed by the media access module 210. For example, the fingerprint generation module 220 may apply one or more algorithms to a video file and generate a reference fingerprint that is usable to identify a presentation (e.g., playback) of that video file within an interactive media presentation. As another example, the fingerprint generation module 220 may apply one or more algorithms to an audio file and thus generate a reference fingerprint usable to identify a playing of that audio file within the interactive media presentation. As a further example, the fingerprint generation module 220 may apply one or more algorithms to an image file and accordingly generate a reference fingerprint usable to identify a displaying of that image file within the interactive media presentation. As a yet further example, the fingerprint generation module 220 may apply one or more algorithms to a text file and thereby generate a reference fingerprint that is usable to identify, within the interactive media presentation, an appearance of the text contained in the text file.
The watermark extraction module 230 of the reference server 110 is configured to extract reference watermarks from the media files that are accessed by the media access module 210. For example, the watermark extraction module 230 may apply one or more algorithms to a video file and extract a reference watermark that is usable to identify a presentation (e.g., playback) of that video file within an interactive media presentation. As another example, the watermark extraction module 230 may apply one or more algorithms to an audio file and thus extract a reference watermark usable to identify a playing of that audio file within the interactive media presentation. As a further example, the watermark extraction module 230 may apply one or more algorithms to an image file and accordingly extract a reference watermark usable to identify a displaying of that image file within the interactive media presentation. As a yet further example, the watermark extraction module 230 may apply one or more algorithms to a text file and thereby extract a reference watermark that is usable to identify, within the interactive media presentation, an appearance of the text contained in the text file.
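To make the watermark idea concrete, here is a deliberately simple embed/extract pair. The patent does not specify a watermarking scheme; this sketch assumes a least-significant-bit (LSB) watermark in audio samples, which is a classic textbook technique, far less robust than production schemes, and all names here are invented.

```python
# Hypothetical LSB watermark: the watermark bits overwrite the least
# significant bit of each sample, and extraction simply reads them back.
# Real watermarking survives compression and re-capture; this does not.

def embed_watermark(samples, bits):
    """Overwrite each sample's least significant bit with a watermark bit."""
    return [(s & ~1) | b for s, b in zip(samples, bits)]

def extract_watermark(samples):
    """Recover the watermark by reading each sample's least significant bit."""
    return [s & 1 for s in samples]

audio = [100, 101, 102, 103]
marked = embed_watermark(audio, [1, 0, 1, 1])   # -> [101, 100, 103, 103]
```

Extracting from `marked` recovers `[1, 0, 1, 1]`, which a monitoring device could then compare against a reference watermark.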
According to various example embodiments, one or both of the fingerprint generation module 220 and the watermark extraction module 230 may be included in the reference server 110. Hence, the reference server 110 may form all or part of a cloud-based server system (e.g., of one or more machines) that is configured to generate fingerprints from various media content, store watermarks for various media content, or any suitable combination thereof.
The notification correlation module 240 of the reference server 110 is configured to correlate a media file (e.g., accessed by the media access module 210 and processed by the fingerprint generation module 220, the watermark extraction module 230, or both) with a notification that references an event which may occur within the interactive media presentation. For example, supposing the interactive media presentation is a videogame, the media file may contain video content that shows an in-game character congratulating the user (e.g., game player) on completing a difficult level of the game. In such a case, the completion of the difficult level of the game is the event that may occur within the interactive presentation, and this event, this media file, or both may be correlated with a notification that references the completion of this level of the game. The notification correlation module 240 may access event data that correlates the event with the media file (e.g., from the interactive media presentation server, from the database 115, or from both). Based on such event data, the notification correlation module 240 may map the event, the media file, or both, to the corresponding notification, which may be stored in the database 115 (e.g., after being automatically or manually generated based on the media file). This correspondence relationship (e.g., map) may be stored in the database 115.
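At its simplest, the correspondence relationship the notification correlation module maintains is a lookup from a media file (or its identifier) to a notification. A minimal sketch, with file names and notification text invented for illustration:

```python
# Hypothetical media-file-to-notification map, standing in for the
# correspondence relationship stored in the database. In practice the key
# would more likely be a reference fingerprint or watermark.

EVENT_NOTIFICATIONS = {
    "level_complete.mp4": "Congratulations on finishing the level!",
    "player_defeat.mp4": "Tough luck! Tap here for a strategy guide.",
}

def notification_for(media_file):
    """Look up the notification correlated with a media file, if any."""
    return EVENT_NOTIFICATIONS.get(media_file)
```

An unmapped file simply yields no notification, mirroring the patent's point that only media content correlated with an event triggers a presentation.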
Thus, the reference server 110, the database 115, or both, may be configured to provide (e.g., to any one or more of devices 130, 131, and 150) a reference identifier (e.g., fingerprint or watermark) of the media file and a notification that corresponds to an event signified by the media file being presented within the interactive media presentation. The reference identifier and the notification may be provided as part of a network-based service that supplements the interactive media presentation with additional information (e.g., the notification) upon detection of the event occurring. In some example embodiments, the network-based system 105 provides such a service.
Moreover, such a service may be provided without any cooperation, assistance, or other communication from the interactive media presentation server 120 or other source of the interactive media presentation. Indeed, some or all of the network-based system 105 may obtain and provide reference identifiers of various media files and the media content thereof, by accessing such media content (e.g., as a user) during a presentation of the interactive media presentation. For example, reference identifiers may be obtained from playing a computer game (e.g., to completion, automatically, by executing software scripts to simulate user input), and corresponding notifications may be generated (e.g., automatically or manually) and mapped to the obtained reference identifiers. As a result, the network-based system 105 may provide a supplemental information service that complements the interactive media presentation, but is separate from the interactive media presentation and produced independently (e.g., without collaboration with the author or source of the interactive media presentation).
FIG. 3 is a block diagram illustrating components of the device 131, which may be configured to detect an event within interactive media, according to some example embodiments. The device 131 may be a machine that, as shown, includes a reference module 310, a fingerprint module 320, a watermark module 330, a detection module 340, and a presentation module 350, all configured to communicate with each other (e.g., via a bus, shared memory, or a switch).
As noted above, any one or more of the modules described herein may be implemented using hardware (e.g., a processor of a machine) or a combination of hardware and software. For example, any module described herein may configure a processor to perform the operations described herein for that module. Moreover, any two or more of these modules may be combined into a single module, and the functions described herein for a single module may be subdivided among multiple modules. Furthermore, according to various example embodiments, modules described herein as being implemented within a single machine, database, or device may be distributed across multiple machines, databases, or devices.
As noted above, the device 131 may monitor an interactive media presentation being presented by the device 130 (e.g., by recording or otherwise accessing presented media content included in the interactive media presentation). Media content being presented as part of the interactive media presentation may thus be accessed by the device 131 (e.g., for detection of the event that corresponds to the media content). For example, the device 131 may include a camera that is configured to capture video content, image content, text content, or any suitable combination thereof, that appears in the interactive media presentation. As another example, the device 131 may include a microphone that is configured to capture audio content that is played within the interactive media presentation.
The reference module 310 of the device 131 is configured to access a reference identifier (e.g., reference fingerprint or reference watermark) that is generated or extracted (e.g., by the reference server 110) from media content which is presentable as part of the interactive media presentation. The reference identifier may be accessed from the network-based system 105 (e.g., from the reference server 110 or from the database 115). The reference identifier may be denoted herein as a “first identifier.” As noted above, within the interactive media presentation, an event that corresponds to the media content may be configured to occur in response to a user input (e.g., generated by the user 132 and submitted via the device 130, as the user 132 is interacting with the interactive media presentation).
The fingerprint module 320 of the device 131 is configured to generate a fingerprint from a playback of the media content as part of the interactive media presentation. For example, the fingerprint module 320 may apply one or more algorithms to a video file and generate a fingerprint that may be compared to a reference fingerprint for that video file and thereby identify a presentation (e.g., playback) of that video file within an interactive media presentation. As another example, the fingerprint module 320 may apply one or more algorithms to an audio file and thus generate a fingerprint that may be compared to a reference fingerprint for that audio file and thereby identify a playing of that audio file within the interactive media presentation. As a further example, the fingerprint module 320 may apply one or more algorithms to an image file and accordingly generate a fingerprint that may be compared to a reference fingerprint for that image file and thereby identify a displaying of that image file within the interactive media presentation. As a yet further example, the fingerprint module 320 may apply one or more algorithms to a text file and thereby generate a fingerprint that may be compared to a reference fingerprint for that text file and thereby identify, within the interactive media presentation, an appearance of the text contained in the text file.
The watermark module 330 of the device 131 is configured to extract a watermark from the playback of the media content as part of the interactive media presentation. For example, the watermark module 330 may apply one or more algorithms to a video file and extract a watermark that may be compared to a reference watermark for that video file and thereby identify a presentation (e.g., playback) of that video file within an interactive media presentation. As another example, the watermark module 330 may apply one or more algorithms to an audio file and thus extract a watermark that may be compared to a reference watermark for that audio file and thereby identify a playing of that audio file within the interactive media presentation. As a further example, the watermark module 330 may apply one or more algorithms to an image file and accordingly extract a watermark that may be compared to a reference watermark for that image file and thereby identify a displaying of that image file within the interactive media presentation. As a yet further example, the watermark module 330 may apply one or more algorithms to a text file and thereby extract a watermark that may be compared to a reference watermark for that text file and thereby identify, within the interactive media presentation, an appearance of the text contained in the text file.
According to various example embodiments, one or both of the fingerprint module 320 and the watermark module 330 may be included in the device 131. Hence, the device 131 may be configured to generate fingerprints from various media content monitored by the device 131, extract watermarks for such media content, or any suitable combination thereof.
The detection module 340 of the device 131 is configured to detect an occurrence of the event that corresponds to the media content. As noted above, the event may be configured to occur in response to a user input (e.g., from the user 132). The detection of this occurrence of the event may be based on the identifier generated by the device 131 matching the reference identifier accessed by the device 131. For example, the occurrence of the event may be detected based on a match between a fingerprint generated by the fingerprint module 320 and a reference fingerprint accessed by the reference module 310. As another example, the occurrence of the event may be detected based on a match between a watermark extracted by the watermark module 330 and a reference watermark accessed by the reference module 310.
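The detection-then-notification flow described for the detection module and presentation module can be tied together in a short sketch. The event names, identifier strings, and notification text below are invented; this only illustrates the compare-and-notify pattern, not the patent's actual matching logic.

```python
# Hypothetical end-to-end flow: for each known event, compare the identifier
# generated from the monitored presentation against its reference identifier,
# and collect the notification for every event whose identifiers match.

def run_detection(reference_ids, generated_ids, notifications):
    """Return the notifications for every event whose identifier matched."""
    shown = []
    for event, ref in reference_ids.items():
        if generated_ids.get(event) == ref:
            shown.append(notifications[event])
    return shown

refs = {"defeat": "fp-1234", "level_up": "wm-abcd"}
gen = {"defeat": "fp-1234", "level_up": "wm-9999"}   # only "defeat" matches
notes = {"defeat": "Get Help for This Level", "level_up": "New strategy guide"}
```

With these inputs, only the `"defeat"` identifiers match, so only the "Get Help for This Level" notification would be presented, much like the help button described below for FIG. 5.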
The presentation module 350 of the device 131 is configured to present a notification (e.g., accessed from the network-based system 105) that references the occurrence of the event. The notification may be presented based on the detecting of the event's occurrence, based on the identifier generated by the device 131 matching the reference identifier accessed by the device 131, or based on both.
FIG. 4 is a conceptual diagram illustrating detection of an event 432 within an interactive media presentation 430, according to some example embodiments. As shown, the interactive media presentation 430 is generated, presented, or both (e.g., by the device 130) based on user input 420 and based on media 405. The user input 420 may be received from the device 130 that is presenting the interactive media presentation 430. For example, the interactive media presentation 430 may be a game (e.g., multimedia game) presented by game software that is executing on the device 130, and the user input 420 may be or include control signals, commands, or choices generated by the user 132 as part of playing the game.
As shown in FIG. 4, the media 405 may take the form of media files that contain media content 410, 412, 414, and 416. Examples of media content (e.g., media content 410) include video, an image, audio, text, or any suitable combination thereof. The arrows pointing to the interactive media presentation 430 from the user input 420 and from the media 405 indicate that the interactive media presentation 430 is influenced, at least in part, by the user input 420 and the media 405.
In the example illustrated in FIG. 4, the event 432 occurs within the interactive media presentation 430 (e.g., as a result of the user input 420). Because the event 432 is occurring, the interactive media presentation 430 includes (e.g., incorporates) the media content 410, which signifies the occurrence of the event 432. As a result, a playback of the media content 410 is initiated by the interactive media presentation 430 (e.g., via the device 130).
Since the device 131 is monitoring the interactive media presentation 430 and the playback of various media content included therein, a detection 440 of the event 432 that corresponds to the media content 410 may be performed (e.g., by the device 131). For example, the device 131 may be configured by event detection software that executes on the device 131 and performs the detection 440. According to various example embodiments, a method discussed below with respect to FIGS. 7-9 may be implemented by such event detection software.
FIG. 5is a layout diagram illustrating a notification520that may be displayed with (e.g., within) an interactive media presentation, according to some example embodiments. InFIG. 5, a user interface500is depicted in the example form of a graphical window. Such a graphical window may be displayed on a display screen of the device130, while the device130is presenting the interactive media presentation430(e.g., within the user interface500). In some example embodiments, the interactive media presentation430is a game (e.g., titled “Majestic Fantasy 4: The Unkempt Realms”), and the user interface500is used to present various media content of the game (e.g., media content410).
In response to the detection 440 of the event 432 that corresponds to the media content 410, the notification 520 may be presented. FIG. 5 depicts the notification 520 being presented within the user interface 500. In alternative example embodiments, the notification 520 may be presented outside the user interface 500 (e.g., elsewhere on a display screen of the device 130, on a display screen of the device 131, or on a display screen of the device 150). The notification 520 references the event 432 (e.g., the occurrence of the event 432 within the interactive media presentation 430).
As shown in FIG. 5, some example embodiments of the user interface 500 include a help button 510 (e.g., labeled “Get Help for This Level”). In some example embodiments, the help button 510 appears in the user interface 500 in response to the detection 440 of the event 432. For example, the event 432 may be an in-game defeat of the user 132 (e.g., player), and the detection 440 of such a defeat may cause the help button 510 to appear within the user interface 500. If the user 132 clicks on the help button 510, the notification 520 appears within the user interface 500 (e.g., to provide information that may be helpful to avoid another such defeat). As another example, the event 432 may be an in-game promotion of the user 132 to a more difficult level of the game, and the detection 440 of such a promotion may trigger the appearance of the help button 510. In response to the user 132 clicking on the help button 510, the notification 520 may be shown (e.g., to provide strategy for playing the more difficult level of the game).
FIG. 6 is a layout diagram illustrating the notification 520, according to some example embodiments. As noted above, the notification 520 references the event 432 within the interactive media presentation 430 (e.g., references the occurrence of the event 432), and according to various example embodiments, the notification 520 may be presented to one or more of the users 132 and 152 (e.g., via one or more of the devices 130, 131, and 150).
In general, the notification 520 may include any information that is pertinent to the occurrence of the event 432 within the interactive media presentation 430. As shown in FIG. 6, the notification 520 may include a map 610 (e.g., an in-game map of a player's current level within a game). The notification 520 may include one or more pieces of information 620, 630, 640, and 650, each of which may reference the occurrence of the event 432. Examples of such information include a help document, a guide, an encouragement (e.g., to a player of a game, that the player persevere in attempting to win a difficult section or level of the game), a suggestion (e.g., that the player of the game attempt a different strategy), or an advertisement (e.g., that the player purchase a virtual in-game item that may enhance the player's enjoyment of the current section or level of the game). Additional examples of such information include an offer for the purchase of one or more virtual goods (e.g., as an in-game purchase, an in-app purchase, downloadable level content (DLC), or any suitable combination thereof), as well as some or all of a user interface (e.g., an electronic storefront) operable to initiate such a purchase. Further examples of such information include an offer for the purchase of one or more physical goods (e.g., related or recommended merchandise, games, memorabilia, soundtracks, or other physical items), as well as some or all of a user interface operable to initiate such a purchase. In some example embodiments, the notification 520 includes a hyperlink to such information.
As noted in FIG. 6, some or all of the information 620, 630, 640, or 650 may refer to an achievement that is signified by the event 432 occurring within the interactive media presentation 430 (e.g., completion of a level in a game, as signified by a video “cut scene” that appears at the end of the level). Any of the information 620, 630, 640, or 650 may describe a virtual item in a virtual world (e.g., an upgraded sword within a fantasy adventure game, as signified by special music that plays upon acquisition of the upgraded sword).
Some or all of the information 620, 630, 640, or 650 may be included in the notification 520 based on a level of progress within a storyline of the interactive media presentation (e.g., a plot of a game). Similarly, any of the information 620, 630, 640, or 650 may be included based on a level of advancement attained by the user 132 (e.g., within a character arc of an in-game character or avatar of the user 132). Accordingly, any of the information 620, 630, 640, or 650 may refer to such a level of progress or level of advancement.
In some example embodiments, some or all of the information 620, 630, 640, or 650 corresponds to a virtual location, virtual orientation, or both, within a virtual world. For example, suppose the interactive media presentation 430 is an immersive game within a three-dimensional virtual world. The event 432 may be an arrival of the player at a particular location within the virtual world (e.g., a waterfall that hides a treasure chest), and the event 432 may be detected by the corresponding playback of the media content 410 (e.g., a particular audio pattern of water splashing sounds, a particular visual pattern of rock formations, or both). Based on this, any of the information 620, 630, 640, or 650 may reference that particular location (e.g., the waterfall) and provide strategy, hints, or advertisements pertinent thereto (e.g., “Look behind the waterfall to find treasure!” or “If you want to search behind the waterfall, would you like to buy some goggles or an umbrella?”).
In certain situations, the media content 410 may be spatialized (e.g., by inclusion of multi-channel audio content) with respect to the three-dimensional virtual world. Accordingly, the event 432 may be detected by a corresponding playback of the media content 410 with a particular virtual orientation (e.g., a particular audio pattern of water splashing sounds whose frequency distribution indicates that the player is facing towards the waterfall that hides the treasure chest, a particular visual pattern of rock formations indicating that the player is facing the waterfall, or both). Based on this, any of the information 620, 630, 640, or 650 may reference that particular orientation (e.g., facing towards the waterfall) and provide strategy, hints, or advertisements pertinent thereto (e.g., “Don't get distracted by the waterfall! Enemies may be lurking behind you!” or “If you want to search behind the waterfall, would you like to hire a helper to watch your back?”).
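The orientation cue above can be illustrated with a small sketch. This is a hypothetical simplification, not the patent's actual method: it assumes two-channel (stereo) audio and treats roughly balanced left/right channel energies as evidence that the player is facing the sound source, whereas the patent speaks more generally of frequency distributions in spatialized audio. All names and thresholds here are illustrative assumptions.

```python
# Hypothetical sketch: infer a coarse "facing the source" cue from
# spatialized two-channel audio. Balanced channel energies suggest the
# player is oriented toward the sound source (e.g., the waterfall).
def channel_energy(samples):
    """Sum-of-squares energy of one audio channel."""
    return sum(s * s for s in samples)

def facing_source(left, right, tolerance=0.2):
    """True when left/right energies differ by at most `tolerance`
    as a fraction of the total energy (an assumed threshold)."""
    el, er = channel_energy(left), channel_energy(right)
    total = el + er
    if total == 0:
        return False  # silence carries no orientation cue
    return abs(el - er) / total <= tolerance

# Facing the waterfall: both channels carry it about equally.
print(facing_source([1.0, 1.0], [1.0, 1.0]))  # True
# Waterfall entirely in one channel: the player is facing away.
print(facing_source([2.0, 2.0], [0.0, 0.0]))  # False
```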
In example embodiments where the notification 520 is presented to the user 152 via the device 150, the notification 520 may include a reference 660 to the user 132. As noted above, the users 132 and 152 may be socially connected to each other (e.g., as friends, followers, or connections, such as may be indicated by a social networking service). Accordingly, the notification 520 with the reference 660 may notify the user 152 that the user 132 has experienced the event 432 within the interactive media presentation 430. For example, the notification 520 may thus tell the user 152 that the user 132 has been defeated in playing a game, has been promoted to a more difficult level of the game, has acquired a virtual object within the game, has advanced to a particular point in the game's storyline, or has arrived at a particular virtual location within a virtual world in which the game is played.
FIGS. 7-9 are flowcharts illustrating operations of the device 131 in performing a method 700 of detecting the event 432 within the interactive media presentation 430, according to some example embodiments. Operations in the method 700 may be performed by the device 131 (e.g., separately from the device 130, or sharing one or more operations with the device 130), using modules described above with respect to FIG. 3. As shown in FIG. 7, the method 700 includes operations 710, 720, 730, and 740.
In operation 710, the reference module 310 of the device 131 accesses a first identifier (e.g., a first fingerprint or a first watermark). The first identifier may be a reference identifier and may be accessed from the database 115, and the first identifier may be obtained from the media content 410, which is presentable as part of the interactive media presentation 430. In some example embodiments, the reference module 310 accesses a first fingerprint generated from the media content 410 (e.g., by the fingerprint generation module 220 of the reference server 110). In certain example embodiments, the reference module 310 may access a first watermark extracted from the media content 410 (e.g., by the watermark extraction module 230 of the reference server 110).
In operation 720, according to some example embodiments, the fingerprint module 320 of the device 131 generates a second identifier (e.g., a second fingerprint) from a playback of the media content 410 as part of the interactive media presentation 430. The second identifier may be called a generated identifier. In certain example embodiments, the watermark module 330 of the device 131 extracts the second identifier (e.g., a second watermark) from the playback of the media content 410 as part of the interactive media presentation 430.
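As a concrete illustration of generating an identifier from a playback, here is a minimal, hypothetical audio fingerprint: one bit per frame boundary recording whether frame energy rose or fell. Real fingerprinting schemes are far more robust; the function name, frame scheme, and sample values are assumptions for illustration only.

```python
# Hypothetical sketch of operation 720: split captured samples into
# fixed-size frames, then emit one bit per frame transition indicating
# whether the frame energy went up (1) or down/flat (0).
def energy_fingerprint(samples, frame_size=4):
    frames = [samples[i:i + frame_size]
              for i in range(0, len(samples), frame_size)]
    energies = [sum(s * s for s in frame) for frame in frames]
    bits = 0
    for prev, cur in zip(energies, energies[1:]):
        bits = (bits << 1) | (1 if cur > prev else 0)
    return bits

# Quiet, loud, quiet -> energy rises then falls -> bit pattern 0b10.
print(bin(energy_fingerprint([0, 0, 0, 0, 1, 1, 1, 1, 0, 0, 0, 0])))  # 0b10
```

The same integer fingerprint could then be compared against a reference fingerprint computed from the source media content.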
In operation 730, the detection module 340 of the device 131 detects an occurrence of the event 432 within the interactive media presentation 430. This detection may be based on the second identifier (e.g., generated identifier) matching the first identifier (e.g., reference identifier). For example, the detection module 340 may detect the occurrence of the event 432 by comparing an accessed first fingerprint (e.g., reference fingerprint) of the media content 410 to a generated second fingerprint (e.g., generated fingerprint) of the media content 410. Based on the accessed first fingerprint matching the generated second fingerprint, the detection module 340 may detect the occurrence of the event 432. As another example, the detection module 340 may detect the occurrence of the event 432 by comparing an accessed first watermark (e.g., reference watermark) of the media content 410 to an extracted second watermark (e.g., extracted watermark) of the media content 410. Based on the accessed first watermark matching the extracted second watermark, the detection module 340 may detect the occurrence of the event 432.
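A sketch of the comparison in operation 730 follows. Because a fingerprint generated from a camera or microphone capture is rarely bit-identical to the reference, this hypothetical version tolerates a small Hamming distance between integer bit-string fingerprints; the threshold value and fingerprint values are assumptions, not taken from the patent.

```python
# Hypothetical sketch of operation 730: declare the event detected when
# the generated fingerprint is within a small Hamming distance of the
# reference fingerprint obtained from the source media content.
def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two integer fingerprints."""
    return bin(a ^ b).count("1")

def identifiers_match(reference_fp: int, generated_fp: int,
                      max_distance: int = 2) -> bool:
    return hamming_distance(reference_fp, generated_fp) <= max_distance

reference = 0b10110110  # reference identifier (assumed value)
captured = 0b10110111   # generated identifier with one bit of capture noise
print(identifiers_match(reference, captured))     # True  -> event detected
print(identifiers_match(reference, 0b01001001))   # False -> no event
```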
In operation 740, the presentation module 350 of the device 131 presents the notification 520, which may reference the occurrence of the event 432 within the interactive media presentation 430. As noted above, the event 432, the occurrence thereof, or both, may be detected based on the second identifier being determined to match the first identifier (e.g., in operation 730). For example, the presentation module 350 may present the notification 520 on a display screen of the device 131 (e.g., within an alert or other message). Accordingly, the notification 520 may be presented to the user 132 on the device 131, which may be functioning as a supplementary device that monitors the interactive media presentation 430 and provides notifications (e.g., notification 520) in response to detected events (e.g., event 432) therein.
In some example embodiments, the presentation module 350 of the device 131 causes the device 130 to present the notification 520 (e.g., within the user interface 500, in which the interactive media presentation 430 may be presented). Accordingly, the notification 520 may be presented to the user 132 on the device 130, which may be both presenting the interactive media presentation 430 and providing notifications (e.g., notification 520) in response to detected events (e.g., event 432) therein.
In certain example embodiments, the presentation module 350 causes the device 150 to present the notification 520 (e.g., within an alert or other message). For example, the notification 520 may be presented with the reference 660 (e.g., to the user 132). Accordingly, the notification 520 may be presented to the user 152 on the device 150, which may notify the user 152 that the user 132 has experienced the event 432 within the interactive media presentation 430, as indicated by detection of the playback of the media content 410 as part of the interactive media presentation 430.
As shown in FIG. 8, the method 700 may include one or more of operations 802, 810, 820, 826, and 828. Operation 802 may be performed prior to operation 710, and operation 710 may be performed in response to operation 802. In operation 802, the detection module 340 of the device 131 receives a request that a current portion of the interactive media presentation 430 be identified. For example, the detection module 340 may detect that the help button 510 (e.g., labeled “Get Help For This Level”) has been activated (e.g., clicked or touched), where activation of the help button 510 initiates such a request to identify a current portion (e.g., a current level of a game) of the interactive media presentation 430. In example embodiments that include operation 802, one or more of operations 710, 720, and 740 may be performed based on the received request. Moreover, the presenting of the notification 520 in operation 740 may identify the current portion of the interactive media presentation 430 (e.g., the current level of the game). For example, the current portion of the interactive media presentation 430 may be named in the notification 520 (e.g., within information 620) or shown in the notification 520 (e.g., within the map 610).
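The help-button flow above (operations 802 and 740) can be sketched as a lookup from playback fingerprints to level names. This is a hypothetical simplification: the level names, fingerprint values, and message wording are assumptions for illustration, and a real system would tolerate fingerprint noise rather than require exact equality.

```python
# Hypothetical sketch: when the help button is activated, match the
# playback's fingerprint against per-level reference identifiers and
# name the current level in the resulting notification.
LEVEL_REFERENCES = {
    "Level 3": 0b1010,  # assumed reference fingerprints
    "Level 4": 0b0110,
}

def identify_current_portion(playback_fp):
    for level, ref_fp in LEVEL_REFERENCES.items():
        if ref_fp == playback_fp:
            return f"Get Help for This Level: {level}"
    return "Current level could not be identified"

print(identify_current_portion(0b0110))  # Get Help for This Level: Level 4
```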
In some example embodiments, the media content 410 is unique within the interactive media presentation 430, and the event 432 corresponds exclusively to the media content 410. In such example embodiments, occurrence of the event 432 is always accompanied by a playback of the media content 410, and a playback of the media content 410 always signifies the occurrence of the event 432. For example, the media content 410 may be a video (e.g., a “cut scene”) that is played only between Level 3 and Level 4 (e.g., upon completion of Level 3) in a multi-level computer game. As another example, the media content 410 may be a sound (e.g., a trumpet fanfare) that is played only whenever a player is promoted to a higher rank in a military simulation game. In such example embodiments, the detecting of the occurrence of the event 432 in operation 730 may be performed based simply on the second identifier (e.g., generated fingerprint) matching the first identifier (e.g., reference fingerprint).
In alternative example embodiments, however, the media content 410 is not unique within the interactive media presentation 430. The event 432 may correspond nonexclusively to the media content 410. In such example embodiments, a matching of the second identifier to the first identifier in operation 730 may be insufficient to detect the event 432. Accordingly, one or more additional comparisons of generated identifiers to reference identifiers may be used by the device 131 to detect the event 432.
In operation 810, the reference module 310 of the device 131 accesses a third identifier in a manner similar to that described above with respect to operation 710. The third identifier may be a further reference identifier and may be accessed from the database 115, and the third identifier may be obtained from the media content 412, which is presentable as part of the interactive media presentation 430. In some example embodiments, the reference module 310 accesses a third fingerprint generated from the media content 412 (e.g., by the fingerprint generation module 220 of the reference server 110). In certain example embodiments, the reference module 310 may access a third watermark extracted from the media content 412 (e.g., by the watermark extraction module 230 of the reference server 110).
In operation 820, according to some example embodiments, the fingerprint module 320 of the device 131 generates a fourth identifier (e.g., a fourth fingerprint) from a playback of the media content 412 as part of the interactive media presentation 430. The fourth identifier may be called a further generated identifier. In certain example embodiments, the watermark module 330 of the device 131 extracts the fourth identifier (e.g., a fourth watermark) from the playback of the media content 412 as part of the interactive media presentation 430.
In example embodiments that include operations 810 and 820, the detecting of the occurrence of the event 432 in operation 730 may be based on the third identifier (e.g., further reference identifier) matching the fourth identifier (e.g., further generated identifier). For example, the detection module 340 may detect the occurrence of the event 432 by comparing accessed first and third fingerprints (e.g., reference fingerprints) to generated second and fourth fingerprints (e.g., generated fingerprints). Based on the accessed first fingerprint matching the generated second fingerprint and the accessed third fingerprint matching the generated fourth fingerprint, the detection module 340 may detect the occurrence of the event 432. As another example, the detection module 340 may detect the occurrence of the event 432 by comparing accessed first and third watermarks (e.g., reference watermarks) to extracted second and fourth watermarks (e.g., extracted watermarks). Based on the accessed first watermark matching the extracted second watermark and the third watermark matching the fourth watermark, the detection module 340 may detect the occurrence of the event 432.
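When the media content is not unique, the detection above requires every identifier pair to match before declaring the event. A minimal sketch of that conjunction, with assumed fingerprint values:

```python
# Hypothetical sketch: require every (reference, generated) identifier
# pair to match before detecting the event, since a single match is
# insufficient when media content 410 is not unique.
def all_identifiers_match(identifier_pairs):
    """identifier_pairs: iterable of (reference_id, generated_id)."""
    return all(ref == gen for ref, gen in identifier_pairs)

# First/second identifiers match AND third/fourth identifiers match.
print(all_identifiers_match([("fp-410", "fp-410"),
                             ("fp-412", "fp-412")]))  # True
# Only one pair matches -> not enough to detect the event.
print(all_identifiers_match([("fp-410", "fp-410"),
                             ("fp-412", "fp-999")]))  # False
```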
In some example embodiments, the media content 410 is not unique within the interactive media presentation 430, but the event 432 corresponds exclusively to the playback of the media content 410 being contemporaneous (e.g., within a five-second window) with the playback of the media content 412 (e.g., further media content). In such example embodiments, the detecting of the occurrence of the event 432 in operation 730 may be based on the event 432 corresponding exclusively to the contemporaneous playback of the media content 410 with the media content 412. In other words, the event 432 may be detected based on the event 432 corresponding exclusively to the fact that the playback of the media content 410 is contemporaneous with the playback of the media content 412.
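The contemporaneity test can be sketched as a simple timestamp comparison. The five-second window comes from the example in the text; the timestamp values are assumptions.

```python
# Hypothetical sketch: treat two playbacks as contemporaneous when
# their detection timestamps (in seconds) fall within a window.
def contemporaneous(t_410: float, t_412: float, window: float = 5.0) -> bool:
    return abs(t_410 - t_412) <= window

print(contemporaneous(12.0, 15.5))  # True  -> event 432 may be detected
print(contemporaneous(12.0, 40.0))  # False -> playbacks too far apart
```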
In certain example embodiments, the media content 410 is not unique within the interactive media presentation 430, but the event 432 corresponds nonexclusively to the playback of the media content 410 being contemporaneous (e.g., within a two-second window) with the playback of the media content 412 (e.g., further media content). In such example embodiments, the detecting of the occurrence of the event 432 in operation 730 may be based on a probability that the playback of the media content 410 is contemporaneous with the playback of the media content 412. In operation 826, the detection module 340 of the device 131 accesses such a probability (e.g., from the network-based system 105 or any portion thereof). For example, the detection module 340 may access a 90% probability that the event 432 is accompanied by a contemporaneous playback of the media content 410 with the media content 412. Accordingly, the detection module 340 may perform operation 730 based on the accessed probability.
Alternatively, in such example embodiments, the detecting of the occurrence of the event 432 in operation 730 may be based on a history of events that occurred within the interactive media presentation 430 prior to the playback of the media content 412 within the interactive media presentation 430. In operation 828, the detection module 340 of the device 131 accesses such a history of detected events (e.g., from the network-based system 105 or any portion thereof). For example, the detection module 340 may access a log of events that have been previously detected by the device 131 while monitoring the interactive media presentation 430, and the log of events may indicate that other events (e.g., aside from the event 432) signified by a playback of the media content 410, the media content 412, or both, have already been detected as having occurred. Thus, the other events that have already occurred may be eliminated as potential candidates for detection in operation 730. Accordingly, the detection module 340 may perform operation 730 based on the accessed history of detected events.
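The candidate-elimination step in operation 828 can be sketched as a filter over an event log. The event names are illustrative assumptions; the point is that events already recorded as detected drop out of the candidate set.

```python
# Hypothetical sketch of operation 828: consult a log of previously
# detected events and eliminate them as candidates, since an event
# already detected cannot be the one the current playback signifies.
def remaining_candidates(candidates, detected_log):
    seen = set(detected_log)
    return [event for event in candidates if event not in seen]

log = ["level-1-complete", "level-2-complete"]  # assumed prior detections
possible = ["level-1-complete", "level-2-complete", "level-3-complete"]
print(remaining_candidates(possible, log))  # ['level-3-complete']
```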
As shown in FIG. 9, the method 700 may include one or more of operations 840, 841, 842, 843, 844, 845, 846, and 850. One or more of operations 840-846 may be performed as part (e.g., a precursor task, a subroutine, or a portion) of operation 740, in which the presentation module 350 of the device 131 presents the notification 520.
In some example embodiments, the event 432 includes (e.g., indicates or signifies) an achievement within a game by a player of the game (e.g., an in-game achievement by the user 132). Examples of such an achievement include completing a level of the game, performing a particular set of tasks within the game, gaining access to a feature of the game (e.g., discovering or unlocking hidden content or “Easter eggs”), or any suitable combination thereof. In operation 840, the presentation module 350 presents information 630, which may reference the achievement by the player. For example, the information 630 may include a congratulatory message that mentions the achievement. As another example, the information 630 may include a suggestion that the user 132 (e.g., player) share the achievement with a socially connected friend (e.g., by sending a message that references the achievement).
In certain example embodiments, the event 432 includes (e.g., indicates or signifies) an acquisition of a virtual item within a virtual world. Examples of such an acquisition include obtaining a significant talisman or weapon (e.g., within a fantasy adventure game that is set within a three-dimensional virtual world), gaining access to an upgrade to an existing virtual item (e.g., a faster car in a racing game), receiving a large sum of virtual money (e.g., treasure or prizes), or any suitable combination thereof. In operation 841, the presentation module 350 presents information 630, which may reference the acquisition of the virtual item. For example, the information 630 may include a congratulatory message that mentions the acquisition. As another example, the information 630 may include a suggestion that the user 132 (e.g., player) share news of the acquisition with a socially connected friend (e.g., by sending a message that references the acquisition).
In some example embodiments, the event 432 includes (e.g., indicates or signifies) a level of progress within a storyline of a game. For example, the media content 410 may include video content, audio content, or both, that indicates the level of progress (e.g., special music that is specific to a particular section of the storyline). In operation 842, the presentation module 350 presents information 640, which may be based on, and may reference, the level of progress within the storyline. For example, the information 640 may include a summary of the storyline up to the current level of progress, a preview of the next level of progress in the storyline, advice, tips, suggestions, encouragements, or any suitable combination thereof.
In certain example embodiments, the event 432 includes (e.g., indicates or signifies) a level of advancement within the game by a player of the game (e.g., a new in-game rank attained by the user 132). For example, the media content 410 may include video content, audio content, or both, that indicates the level of advancement (e.g., special insignia presented on the screen or special sound effects). In operation 842, the presentation module 350 presents information 640, which may be based on, and may reference, the level of advancement within the game. For example, the information 640 may include a description of a rank to which the player has been promoted, a description of new abilities or powers accorded to the level of advancement, an indication of progress toward the next level of advancement, an indication of effort (e.g., measured in time, actions, or both) expended in reaching the level of advancement, or any suitable combination thereof.
In some example embodiments, the media content 410 includes content (e.g., audio content or video content) that indicates a virtual location within a virtual world. As noted above, the notification 520 may include information (e.g., information 650) that corresponds to a virtual location, virtual orientation, or both, within a virtual world. Accordingly, in operation 843, the presentation module 350 of the device 131 presents information (e.g., information 650) that corresponds to (e.g., describes or references) the virtual location within the virtual world.
Moreover, as noted above, the media content 410 may be spatialized (e.g., by inclusion of multi-channel audio content) with respect to the three-dimensional virtual world. Hence, in operation 844, the presentation module 350 presents information (e.g., information 650) that corresponds to the virtual orientation (e.g., at the virtual location) within the virtual world.
According to various example embodiments, as noted above, the notification 520 may include information (e.g., information 620) that contains a help document, a guide, an encouragement, a suggestion, an advertisement, or any suitable combination thereof. Accordingly, in operation 845, the presentation module 350 of the device 131 presents a help document, a guide, an encouragement, a suggestion, an advertisement, or any suitable combination thereof, in performance of operation 740. In some example embodiments, the media content 410 indicates a failure to achieve a goal within the interactive media presentation 430 (e.g., a failure by a player of a videogame to achieve a goal within the game). In such example embodiments, operation 845 may involve presenting a suggestion on achieving the goal (e.g., in the next attempt or some future attempt), an encouragement to the player (e.g., to the user 132, that the user 132 try again to achieve the goal), an advertisement for a virtual item within the interactive media presentation 430 (e.g., a purchasable virtual item that may facilitate achieving the goal), or any suitable combination thereof.
In operation 846, the presentation module 350 of the device 131 communicates the notification 520 with the reference 660, which may describe the user 132. As noted above, the notification 520 may be communicated to the user 152 (e.g., via the device 150), who may be socially connected (e.g., as a friend, follower, or connection) to the user 132 via one or more social networking services. This may have the effect of notifying the user 152 that the user 132 has experienced the event 432 within the interactive media presentation 430.
Operation 850 may be performed after operation 730, in which the detection module 340 of the device 131 detects the occurrence of the event 432. As shown in FIG. 9, operation 850 may follow operation 740, in which the presentation module 350 of the device 131 presents the notification 520. In operation 850, the detection module 340 stores a reference to the occurrence of the event 432. The reference may be stored in a data record (e.g., within the database 115) that corresponds to the user 132 (e.g., a player of a videogame) from whom the user input 420 may be received. As noted above, the user input 420 may be a basis (e.g., an influence, a factor, or a control signal) for the interactive media presentation 430. Accordingly, the stored reference may be usable to identify a portion (e.g., section, chapter, level, or part) of the interactive media presentation 430 that contains the event 432. That is, the stored reference may indicate, designate, or define the portion within which the event 432 is configured to occur (e.g., in response to the user input 420) within the interactive media presentation 430. This may have the effect of annotating that the user 132 has been presented with this portion of the interactive media presentation 430. In example embodiments where the interactive media presentation 430 is a game, operation 850 results in the detection of the event 432 within the interactive media presentation 430 being recorded (e.g., in the database 115) in a game history of the user 132.
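The record-keeping in operation 850 amounts to appending an event reference to a per-user game history. A minimal sketch, with an in-memory dict standing in for the database and all identifiers assumed for illustration:

```python
# Hypothetical sketch of operation 850: store a reference to the
# detected event in a data record keyed by the user, so the record
# serves as a game history identifying which portion of the
# presentation the user has already been shown.
def record_event(history, user_id, event_ref, portion):
    history.setdefault(user_id, []).append(
        {"event": event_ref, "portion": portion})
    return history

game_history = {}  # stands in for a database record
record_event(game_history, "user-132", "event-432", "Level 3")
print(game_history["user-132"][0]["portion"])  # Level 3
```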
According to various example embodiments, one or more of the methodologies described herein may facilitate detection of an event within a presentation of interactive media. Moreover, one or more of the methodologies described herein may facilitate presentation of a notification that references an event within such interactive media. Hence, one or more of the methodologies described herein may facilitate provision of a supplementary information service that complements interactive media independently, with or without communication from a source of the interactive media.
When these effects are considered in aggregate, one or more of the methodologies described herein may obviate a need for certain efforts or resources that otherwise would be involved in detecting events within interactive media presentations. Efforts expended by a user in identifying a current portion of an interactive media presentation may be reduced by one or more of the methodologies described herein. Computing resources used by one or more machines, databases, or devices (e.g., within the network environment 100) may similarly be reduced. Examples of such computing resources include processor cycles, network traffic, memory usage, data storage capacity, power consumption, and cooling capacity.
FIG. 10 is a block diagram illustrating components of a machine 1000, according to some example embodiments, able to (e.g., configured to) read instructions from a machine-readable medium (e.g., a machine-readable storage medium, a computer-readable storage medium, or any suitable combination thereof) and perform any one or more of the methodologies discussed herein, in whole or in part. Specifically, FIG. 10 shows a diagrammatic representation of the machine 1000 in the example form of a computer system and within which instructions 1024 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 1000 to perform any one or more of the methodologies discussed herein may be executed, in whole or in part. In alternative embodiments, the machine 1000 may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 1000 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a distributed (e.g., peer-to-peer) network environment. The machine 1000 may be a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a smartphone, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 1024, sequentially or otherwise, that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include a collection of machines that individually or jointly execute the instructions 1024 to perform all or part of any one or more of the methodologies discussed herein.
The machine 1000 includes a processor 1002 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a radio-frequency integrated circuit (RFIC), or any suitable combination thereof), a main memory 1004, and a static memory 1006, which are configured to communicate with each other via a bus 1008. The machine 1000 may further include a graphics display 1010 (e.g., a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)). The machine 1000 may also include an alphanumeric input device 1012 (e.g., a keyboard), a cursor control device 1014 (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument, such as a brain-control interface (BCI) or any other sensor capable of recording human biometrics as input for controlling a cursor), a storage unit 1016, a signal generation device 1018 (e.g., a speaker), and a network interface device 1020.
The storage unit 1016 includes a machine-readable medium 1022 on which is stored the instructions 1024 embodying any one or more of the methodologies or functions described herein. The instructions 1024 may also reside, completely or at least partially, within the main memory 1004, within the processor 1002 (e.g., within the processor's cache memory), or both, during execution thereof by the machine 1000. Accordingly, the main memory 1004 and the processor 1002 may be considered as machine-readable media. The instructions 1024 may be transmitted or received over a network 1026 (e.g., network 190) via the network interface device 1020.
As used herein, the term “memory” refers to a machine-readable medium able to store data temporarily or permanently and may be taken to include, but not be limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, and cache memory. While the machine-readable medium 1022 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions. The term “machine-readable medium” shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions for execution by a machine (e.g., machine 1000), such that the instructions, when executed by one or more processors of the machine (e.g., processor 1002), cause the machine to perform any one or more of the methodologies described herein. Accordingly, a “machine-readable medium” refers to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, one or more data repositories in the form of a solid-state memory, an optical medium, a magnetic medium, or any suitable combination thereof.
Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A “hardware module” is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner. In various example embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
In some embodiments, a hardware module may be implemented mechanically, electronically, or any suitable combination thereof. For example, a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations. For example, a hardware module may be a special-purpose processor, such as a field programmable gate array (FPGA) or an ASIC. A hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware module may include software encompassed within a general-purpose processor or other programmable processor. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
Accordingly, the phrase “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, “hardware-implemented module” refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware modules) at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein. As used herein, “processor-implemented module” refers to a hardware module implemented using one or more processors.
Similarly, the methods described herein may be at least partially processor-implemented, a processor being an example of hardware. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. Moreover, the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an application program interface (API)).
The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
Some portions of the subject matter discussed herein may be presented in terms of algorithms or symbolic representations of operations on data stored as bits or binary digital signals within a machine memory (e.g., a computer memory). Such algorithms or symbolic representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. As used herein, an “algorithm” is a self-consistent sequence of operations or similar processing leading to a desired result. In this context, algorithms and operations involve physical manipulation of physical quantities. Typically, but not necessarily, such quantities may take the form of electrical, magnetic, or optical signals capable of being stored, accessed, transferred, combined, compared, or otherwise manipulated by a machine. It is convenient at times, principally for reasons of common usage, to refer to such signals using words such as “data,” “content,” “bits,” “values,” “elements,” “symbols,” “characters,” “terms,” “numbers,” “numerals,” or the like. These words, however, are merely convenient labels and are to be associated with appropriate physical quantities.
Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or any suitable combination thereof), registers, or other machine components that receive, store, transmit, or display information. Furthermore, unless specifically stated otherwise, the terms “a” or “an” are herein used, as is common in patent documents, to include one or more than one instance. Finally, as used herein, the conjunction “or” refers to a non-exclusive “or,” unless specifically stated otherwise.
The following enumerated descriptions define various example embodiments of methods, machine-readable media, and systems (e.g., apparatus) discussed herein:
1. A method comprising:
accessing a first fingerprint generated from media content that is presentable as part of an interactive media presentation within which an event that corresponds to the media content is configured to occur in response to a user input;
generating a second fingerprint from a playback of the media content as part of the interactive media presentation;
detecting an occurrence of the event within the interactive media presentation based on the second fingerprint matching the first fingerprint,
the detecting of the occurrence of the event being performed by a processor of a machine; and presenting a notification that references the occurrence of the event within the interactive media presentation detected based on the second fingerprint matching the first fingerprint.
2. The method of description 1, wherein:
the media content is unique within the interactive media presentation; and
the event corresponds exclusively to the media content.
3. The method of description 1, wherein:
the media content is not unique within the interactive media presentation;
the event corresponds nonexclusively to the media content; and
the detecting of the occurrence of the event is based on a third fingerprint generated from further media content of the interactive media presentation matching a fourth fingerprint generated from a playback of the further media content as part of the interactive media presentation.
4. The method of description 3, wherein:
the event corresponds exclusively to the playback of the media content being contemporaneous with the playback of the further media content; and
the detecting of the occurrence of the event is based on the event corresponding exclusively to the playback of the media content being contemporaneous with the playback of the further media content.
5. The method of description 3, wherein:
the event corresponds nonexclusively to the playback of the media content being contemporaneous with the playback of the further media content; and
the detecting of the occurrence of the event is based on a probability that the playback of the media content is contemporaneous with the playback of the further media content.
6. The method of description 3 or description 5, wherein:
the event corresponds nonexclusively to the playback of the media content being contemporaneous with the playback of the further media content; and
the detecting of the occurrence of the event is based on a history of events that occurred within the interactive media presentation prior to the playback of the further media content.
7. The method of any of descriptions 1-6, wherein:
the generating of the second fingerprint is in response to a request that a current portion of the interactive media presentation be identified;
the presenting of the notification identifies the current portion of the interactive media presentation; and the method further comprises
receiving the request that the current portion of the interactive media presentation be identified.
8. The method of any of descriptions 1-7, wherein:
the event includes an achievement within a game by a player of the game; and
the presenting of the notification presents information that references the achievement by the player.
9. The method of any of descriptions 1-8, wherein:
the event includes acquisition of a virtual item within a virtual world; and
the presenting of the notification presents information that describes the virtual item.
10. The method of any of descriptions 1-9, wherein:
the media content includes video content that indicates a level of progress within a storyline of a game; and
the presenting of the notification presents information based on the level of progress within the storyline of the game.
11. The method of any of descriptions 1-10, wherein:
the media content includes audio content that indicates a level of advancement within a game by a player of the game; and
the presenting of the notification presents information based on the level of advancement within the game by the player.
12. The method of any of descriptions 1-11, wherein:
the media content includes audio content that indicates a virtual location within a virtual world; and
the presenting of the notification presents information that corresponds to the virtual location within the virtual world.
13. The method of any of descriptions 1-12, wherein:
the media content includes multi-channel audio content that indicates a virtual orientation within a virtual world; and
the presenting of the notification presents information that corresponds to the virtual orientation within the virtual world.
14. The method of any of descriptions 1-13, wherein:
the media content indicates a failure to achieve a goal within a game by a player of the game; and
the presenting of the notification presents at least one of a suggestion on achieving the goal, an encouragement to the player, or an advertisement for a virtual item within the game.
15. The method of any of descriptions 1-14, wherein:
a first user that submitted the user input is socially connected to a second user according to a social network; and
the presenting of the notification includes communicating the notification with a reference to the first user to a device of the second user.
16. The method of any of descriptions 1-15 further comprising:
storing a reference to the occurrence of the event in a data record that corresponds to a user that submitted the user input,
the stored reference being usable to identify the part of the interactive media presentation within which the event is configured to occur.
17. A non-transitory machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:
accessing a first fingerprint generated from media content that is presentable as part of an interactive media presentation within which an event that corresponds to the media content is configured to occur in response to a user input;
generating a second fingerprint from a playback of the media content as part of the interactive media presentation;
detecting an occurrence of the event within the interactive media presentation based on the second fingerprint matching the first fingerprint,
the detecting of the occurrence of the event being performed by the one or more processors of the machine; and
presenting a notification that references the occurrence of the event within the interactive media presentation detected based on the second fingerprint matching the first fingerprint.
18. The non-transitory machine-readable storage medium of description 17, wherein:
the media content is not unique within the interactive media presentation;
the event corresponds nonexclusively to the media content; and
the detecting of the occurrence of the event is based on a third fingerprint generated from further media content of the interactive media presentation matching a fourth fingerprint generated from a playback of the further media content as part of the interactive media presentation.
19. A system comprising:
a reference module configured to access a first fingerprint generated from media content that is presentable as part of an interactive media presentation within which an event that corresponds to the media content is configured to occur in response to a user input;
a fingerprint module configured to generate a second fingerprint from a playback of the media content as part of the interactive media presentation;
a processor configured by a detection module to detect an occurrence of the event within the interactive media presentation based on the second fingerprint matching the first fingerprint; and
a presentation module configured to present a notification that references the occurrence of the event within the interactive media presentation detected based on the second fingerprint matching the first fingerprint.
20. The system of description 19, wherein:
the media content is not unique within the interactive media presentation;
the event corresponds nonexclusively to the media content; and
the detection module configures the processor to detect the occurrence of the event based on a third fingerprint generated from further media content of the interactive media presentation matching a fourth fingerprint generated from a playback of the further media content as part of the interactive media presentation.
21. A non-transitory machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:
accessing a first watermark embedded within media content that is presentable as part of an interactive media presentation within which an event that corresponds to the media content is configured to occur in response to a user input;
detecting a second watermark within a playback of the media content as part of the interactive media presentation;
detecting an occurrence of the event within the interactive media presentation based on the second watermark matching the first watermark,
the detecting of the occurrence of the event being performed by the one or more processors of the machine; and
presenting a notification that references the occurrence of the event within the interactive media presentation detected based on the second watermark matching the first watermark.
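The fingerprint-based detection flow of enumerated descriptions 1 and 5 (access a reference fingerprint, generate a fingerprint from playback, detect the event when they match, then present a notification) can be sketched as follows. This is a minimal illustration only: the chunk-hash fingerprint and the names `generate_fingerprint` and `detect_event` are hypothetical stand-ins, not the patent's actual fingerprinting method.

```python
import hashlib

def generate_fingerprint(media_bytes: bytes, chunk_size: int = 4096) -> list[str]:
    """Hash fixed-size chunks of media content into a compact fingerprint."""
    return [
        hashlib.sha256(media_bytes[i:i + chunk_size]).hexdigest()[:16]
        for i in range(0, len(media_bytes), chunk_size)
    ]

def detect_event(reference_fp: list[str], playback_fp: list[str],
                 threshold: float = 0.9) -> bool:
    """Declare a match when enough playback chunks agree with the reference.

    The threshold loosely models the nonexclusive case of description 5,
    where detection rests on a probability rather than an exact match.
    """
    if not reference_fp:
        return False
    matches = sum(1 for a, b in zip(reference_fp, playback_fp) if a == b)
    return matches / len(reference_fp) >= threshold

# Step 1: reference fingerprint generated from the source media content.
reference = generate_fingerprint(b"level-complete-jingle" * 300)
# Step 2: second fingerprint generated from the monitored playback.
playback = generate_fingerprint(b"level-complete-jingle" * 300)

# Steps 3-4: detect the occurrence and present a notification referencing it.
if detect_event(reference, playback):
    print("Notification: level completed!")
```

In a real system the fingerprint would be a perceptual audio or video fingerprint robust to camera/microphone capture, not a byte hash; the structure of the comparison, however, follows the four steps of description 1.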
Claims
1. A method comprising: detecting, by a machine, an occurrence of an event within an interactive videogame, wherein detecting the occurrence of the event within the interactive videogame comprises detecting multiple matches each between a respective reference identifier of the interactive videogame (“reference identifier”) and a respective identifier established from presentation by a device of the interactive videogame (“established identifier”), wherein detecting the multiple matches includes detecting that a first reference identifier matches a second established identifier and detecting that a third reference identifier matches a fourth established identifier, and wherein each identifier is selected from the group consisting of a fingerprint generated from the interactive videogame and a watermark extracted from the interactive videogame; and responsive to at least detecting the occurrence of the event within the interactive videogame, causing presentation of a notification that references the occurrence of the event.
2. The method of claim 1, further comprising: establishing each established identifier from the presentation of the interactive videogame.
3. The method of claim 1, wherein causing presentation of the notification that references the occurrence of the event comprises causing presentation of the notification on a user interface of the device.
4. The method of claim 1, wherein a first user is playing the interactive videogame, and wherein causing presentation of the notification comprises causing presentation of the notification to a second user that is socially connected with the first user.
5. The method of claim 1, wherein the event within the interactive videogame is selected from the group consisting of player-completion of a level of game play and in-game defeat of a player of the interactive videogame.
6. The method of claim 1, wherein the interactive videogame comprises a videogame within a three-dimensional virtual world, and wherein the event is selected from the group consisting of in-game arrival of a player at a particular location within the virtual world and in-game orientation of a player with a particular orientation within the virtual world.
7. The method of claim 1, wherein the notification includes information selected from the group consisting of videogame help, an offer for purchase of one or more virtual goods, and an offer for purchase of one or more physical goods.
8. The method of claim 7, wherein the notification comprises a hyperlink to the information.
9. A system comprising: a processing unit; and non-transitory data storage holding instructions that, when executed by at least the processing unit, cause the system to perform operations including: detecting, by a machine, an occurrence of an event within an interactive videogame, wherein detecting the occurrence of the event within the interactive videogame comprises detecting multiple matches each between a respective reference identifier of the interactive videogame (“reference identifier”) and a respective identifier established from presentation by a device of the interactive videogame (“established identifier”), wherein detecting the multiple matches includes detecting that a first reference identifier matches a second established identifier and detecting that a third reference identifier matches a fourth established identifier, and wherein each identifier is selected from the group consisting of a fingerprint generated from the interactive videogame and a watermark extracted from the interactive videogame, and responsive to at least detecting the occurrence of the event within the interactive videogame, causing presentation of a notification that references the occurrence of the event.
10. The system of claim 9, wherein the operations further include: establishing each established identifier from the presentation of the interactive videogame.
11. The system of claim 9, wherein causing presentation of the notification that references the occurrence of the event comprises causing presentation of the notification on a user interface of the device.
12. The system of claim 9, wherein the presentation by the device of the interactive videogame is when a first user is playing the interactive videogame, and wherein causing presentation of the notification comprises causing presentation of the notification to a second user that is socially connected with the first user.
13. The system of claim 9, wherein the event within the interactive videogame is selected from the group consisting of player-completion of a level of game play and in-game defeat of a player of the interactive videogame.
14. The system of claim 9, wherein the interactive videogame comprises a videogame within a three-dimensional virtual world, and wherein the event is selected from the group consisting of in-game arrival of a player at a particular location within the virtual world and in-game orientation of a player with a particular orientation within the virtual world.
15. The system of claim 9, wherein the notification includes information selected from the group consisting of videogame help, an offer for purchase of one or more virtual goods, and an offer for purchase of one or more physical goods.
16. The system of claim 15, wherein the notification comprises a hyperlink to the information.
17. A non-transitory machine-readable medium storing instructions executable by a processor to cause a machine to carry out operations including: detecting, by a machine, an occurrence of an event within an interactive videogame, wherein detecting the occurrence of the event within the interactive videogame comprises detecting multiple matches each between a respective reference identifier of the interactive videogame (“reference identifier”) and a respective identifier established from presentation by a device of the interactive videogame (“established identifier”), wherein detecting the multiple matches includes detecting that a first reference identifier matches a second established identifier and detecting that a third reference identifier matches a fourth established identifier, and wherein each identifier is selected from the group consisting of a fingerprint generated from the interactive videogame and a watermark extracted from the interactive videogame; and responsive to at least detecting the occurrence of the event within the interactive videogame, causing presentation of a notification that references the occurrence of the event.
18. The non-transitory machine-readable medium of claim 17, wherein the operations further include: establishing each established identifier from the presentation of the interactive videogame.
19. The non-transitory machine-readable medium of claim 17, wherein the device is a first device, and wherein the non-transitory machine-readable medium is disposed in a second device positioned to receive media of the interactive videogame output by the first device during presentation of the interactive videogame by the first device.
20. The non-transitory machine-readable medium of claim 17, wherein the event within the interactive videogame is selected from the group consisting of player-completion of a level of game play, in-game defeat of a player of the interactive videogame, in-game arrival at a particular location within a three-dimensional virtual world of the videogame, and in-game orientation with a particular orientation within the three-dimensional virtual world, and wherein the notification includes information selected from the group consisting of videogame help, an offer for purchase of one or more virtual goods, and an offer for purchase of one or more physical goods.
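Claim 1's distinctive requirement is that detection rests on multiple matches, each pairing a reference identifier with an identifier established from the device's presentation, where each identifier may be a fingerprint or a watermark. The shape of that requirement can be sketched as follows; the `Identifier` model and function names are hypothetical illustrations, not the claimed implementation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Identifier:
    """An identifier is either a fingerprint generated from the videogame
    or a watermark extracted from it (per the claimed Markush group)."""
    kind: str   # "fingerprint" or "watermark"
    value: str

def detect_event_occurrence(pairs: list[tuple[Identifier, Identifier]]) -> bool:
    """Require MULTIPLE matches, each between a reference identifier and an
    identifier established from the presentation (e.g., a first/second pair
    and a third/fourth pair, as recited in claim 1)."""
    matched = [ref for ref, est in pairs if ref == est]
    return len(matched) >= 2

def cause_notification(event: str) -> str:
    # Responsive to detection, cause presentation of a notification that
    # references the occurrence of the event.
    return f"Notification: {event} occurred"

pairs = [
    (Identifier("fingerprint", "a1b2"), Identifier("fingerprint", "a1b2")),  # first match
    (Identifier("watermark", "wm-07"), Identifier("watermark", "wm-07")),    # second match
]
if detect_event_occurrence(pairs):
    print(cause_notification("level completion"))
```

Requiring two independent matches rather than one is plausibly what distinguishes this claim from single-fingerprint content recognition: a lone match on non-unique media content could be ambiguous, while concurrent matches (cf. descriptions 3-4 above) pin down the event.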
