US20070265089A1 - Simulated phenomena interaction game - Google Patents


Info

Publication number
US20070265089A1
US20070265089A1 (application US11/147,408)
Authority
US
United States
Prior art keywords
game
location
interaction
mobile device
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/147,408
Inventor
James Robarts
Cesar Alvarez
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
GLOVENTURES LLC
Consolidated Global Fun Unlimited LLC
Original Assignee
Consolidated Global Fun Unlimited LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US10/438,172 (published as US20040002843A1)
Priority claimed from US10/845,584 (published as US20050009608A1)
Application filed by Consolidated Global Fun Unlimited LLC filed Critical Consolidated Global Fun Unlimited LLC
Priority to US11/147,408 (published as US20070265089A1)
Assigned to GLOVENTURES LLC. Assignment of assignors' interest (see document for details). Assignors: ALVAREZ, CESAR A.; ROBARTS, JAMES O.
Publication of US20070265089A1
Current legal status: Abandoned

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
        • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
            • A63F 13/12
            • A63F 13/20: Input arrangements for video game devices
                • A63F 13/21: Input arrangements characterised by their sensors, purposes or types
                    • A63F 13/216: Input arrangements using geographical information, e.g. location of the game device or player using GPS
            • A63F 13/25: Output arrangements for video game devices
                • A63F 13/28: Output arrangements responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
                    • A63F 13/285: Generating tactile feedback signals via the game input device, e.g. force feedback
            • A63F 13/30: Interconnection arrangements between game servers and game devices, between game devices, or between game servers
                • A63F 13/33: Interconnection arrangements using wide area network [WAN] connections
                    • A63F 13/332: Interconnection arrangements using wireless networks, e.g. cellular phone networks
            • A63F 13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
                • A63F 13/65: Generating or modifying game content automatically by game devices or servers from real world data, e.g. measurement in live racing competition
        • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
            • A63F 2300/40: Features characterised by details of platform network
                • A63F 2300/406: Transmission via wireless network, e.g. pager or GSM
            • A63F 2300/50: Features characterized by details of game servers
                • A63F 2300/55: Details of game data or player data management
                    • A63F 2300/5546: Player data management using player registration data, e.g. identification, account, preferences, game history
                        • A63F 2300/5573: Player registration data including player location
            • A63F 2300/60: Methods for processing data by generating or executing the game program
                • A63F 2300/69: Involving elements of the real world in the game world, e.g. measurement in live races, real video

Definitions

  • The SPIS mobile environment may include PDAs, GPSes, portable computing devices, infrared devices, 3-D wireless (e.g., head-mounted) glasses, virtual reality devices, and other handheld and wearable devices, for example connected over a mobile personal area network.
  • Network communication may be provided over cell phone modems, the IEEE 802.11b protocol, the Bluetooth protocol, or any other wireless communication protocol or equivalent.
  • A PDA (such as a Palm Computing device) with an infrared (IRDA) port presents more complicated modeling considerations and additionally allows detection of device orientation. Such a PDA supports multiple wireless networking functions (e.g., Bluetooth and Wi-Fi expansion cards), and the IRDA version utilizes its infrared port for physical location and spatial orientation determination.
  • The infrared transceiver may be an installed transceiver, such as in a wall in a room, or another infrared device, such as another player using a PDA/IRDA device.
  • The direction the user is facing can be supplied to the simulation engine for modeling as well; this measurement may produce more "realistic" behavior in the simulation.
  • In FIG. 18, the mobile device 1800 displays in the "spectral detection field" 1802 of display screen area 1801 an indication of the location of a particular SP 1804 relative to the user's location 1803. The location of the SP 1804 is returned from the narrative engine in response to a detection interaction request. The relative SP location shown is not necessarily an absolute physical distance and need not divulge any information to the user about the location modeling being employed in the narrative engine.
  • In step 2201, the routine determines whether it is possible to manipulate the designated SP given the state of the narrative, the particular device and user, etc.; if so, the routine continues in step 2204, else it continues in step 2202. This determination is made from the point of view of the narrative, not the mobile device: although the mobile device appears to be working correctly, the narrative may dictate a state in which the device appears to be malfunctioning. In step 2202, because manipulation of the SP is not currently available, the routine determines whether the device has designated or previously indicated in some manner that the reporting of status information is desirable; if so, the routine continues in step 2203 to report the status information to the mobile device (via the narrative engine) and then returns.
  • Device display area 2601 represents the part of the video output hardware of the mobile device that is used to provide the described user interface. Interaction A Range 2602 and Interaction B Range 203 represent range limits for two different interactions, and Simulated Phenomenon indicator 2603 represents a location of the SP.
  • The capturing interaction may be automatic once the SP is within the range, or within a particular portion of the range, or capture may additionally require the user's manipulation of a device input control, such as a separate button. Capturing an SP may advance the game narrative in some way, such as by accumulating points or by eliminating an SP from further interactions or other types of interactions.
  • One mechanism is to have the system display the total accumulated points for each game playing session. These totals may include all of the scores (perhaps organized by sequence or user identification) or a selection of scores (such as the top 10). The scores can be associated with particular players.
  • Enhancements to the Simple Game can include those that maintain the user's full participation in the shared narrative for extended periods even when the user is connected only intermittently. The SP path-determining logic can be simple enough to run on either the mobile device or a remote computing system running remote simulation engine software. This allows the user to interact with SPs in their vicinity using the narrative logic executing on their own machine; however, other users who maintain a connection with the remote system will likely lose the ability to track the location of the disconnected mobile user.
  • Location-based systems that make use of higher location resolution (finer granularity), such as less than 20 meters, can support types of interactions that are not so easily localized.
  • Maps rarely convey significant situation context, such as attendant physical hazards like automobile or pedestrian traffic; environmental details of potential interest to game creators, such as street signage; or details relevant to social context, such as a gathering place for particular types of activities.
  • SPIS-based systems can also be similarly enhanced using tactile feedback, such as vibration frequencies, pitch, etc.

Abstract

Methods and systems for interacting with simulated phenomena are provided. Example embodiments provide a Simulated Phenomena Interaction System “SPIS,” which enables a user to incorporate simulated phenomena into a real world environment by interacting with the simulated phenomena. In one embodiment, the SPIS comprises a mobile environment (e.g., a mobile device) and a simulation engine. The mobile environment may be configured as a thin client that remotely communicates with the simulation engine, or it may be configured as a fat client that incorporates one or more of the components of the simulation engine into the mobile device. These components cooperate to define the characteristics and behavior of the simulated phenomena and interact with users via mobile devices. The characteristics and behavior of the simulated phenomena are based in part upon values sensed from the real world, thus achieving a more integrated correspondence between the real world and the simulated world.

Description

    TECHNICAL FIELD
  • The present invention relates to methods and systems for incorporating computer-controlled representations into a real world environment and, in particular, to methods and systems for using one or more mobile devices to interact with simulated phenomena.
  • BACKGROUND
  • Computerized devices, such as portable computers, wireless phones, personal digital assistants (PDAs), global positioning system devices (GPSes) etc., are becoming compact enough to be easily carried and used while a user is mobile. They are also becoming increasingly connected to communication networks over wireless connections and other portable communications media, allowing voice and data to be shared with other devices and other users while being transported between locations. Interestingly enough, although such devices are also able to determine a variety of aspects of the user's surroundings, including the absolute location of the user, and the relative position of other devices, these capabilities have not yet been well integrated into applications for these devices.
  • For example, applications such as games have been developed to be executed on such mobile devices. They are typically downloaded to the mobile device and executed solely from within that device. Alternatively, there are multi-player network-based games, which allow a user to "log in" to a remotely controlled game from a portable or mobile device; typically, however, once the user has logged on, the narrative of such games is independent of any environment-sensing capabilities of the mobile device. At most, a user's presence may be indicated to other mobile device operators in an on-line game through the addition of an avatar that represents the user. Puzzle-type gaming applications have also been developed for use with some portable devices. These games detect a current location of a mobile device and deliver "clues" to help the user find a next physical item (like a scavenger hunt).
  • GPS mobile devices have also been used with navigation system applications such as for nautical navigation. Typical of these applications is the idea that a user indicates to the navigation system a target location for which the user wishes to receive an alert. When the navigation system detects (by the GPS coordinates) that the location has been reached, the system alerts the user that the target location has been reached.
  • Computerized simulation applications have also been developed to simulate a nuclear, biological, or chemical weapon using a GPS. These applications mathematically represent, in a quantifiable manner, the behavior of dispersion of the weapon's damaging forces (for example, the detection area is approximated from the way the wind carries the material emanating from the weapon). A mobile device is then used to simulate detection of this damaging force when the device is transported to a location within the dispersion area.
  • None of these applications take advantage of or integrate a device's ability to determine a variety of aspects of the user's surroundings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of a Simulated Phenomena Interaction System used to enhance the real world environment.
  • FIG. 2 is a block diagram of an overview of an example Simulated Phenomena Interaction System in operation.
  • FIG. 3 is an example mobile device display of the results of an interaction request to a simulation engine used in a game, which involves both detection and measurement of simulated phenomena.
  • FIG. 4 is an example mobile device display of the results of an interaction request to a simulation engine used in a game, which involves communication with a simulated phenomenon.
  • FIG. 5 is an example mobile device display of the results of an interaction request to a simulation engine used in a game, which involves manipulation of a simulated phenomenon.
  • FIG. 6 is an example block diagram of components of an example Simulated Phenomena Interaction System.
  • FIG. 7 is an example block diagram of an alternative embodiment of components of an example simulation engine.
  • FIG. 8 is an overview flow diagram of example steps to process interaction requests within a simulation engine of a Simulated Phenomena Interaction System.
  • FIG. 9 is an overview flow diagram of example steps to process interactions within a mobile device used with a Simulated Phenomena Interaction System.
  • FIG. 10 is an example block diagram of a general purpose computer system for practicing embodiments of a simulation engine of a Simulated Phenomena Interaction System.
  • FIG. 11 illustrates an embodiment of a “thin” client mobile device, which interacts with a remote simulation engine running for example on a general purpose computer system, as shown in FIG. 10.
  • FIG. 12 illustrates an embodiment of a “fat” client mobile device in which one or more portions of the simulation engine reside as part of the mobile device environment itself.
  • FIG. 13 is an example block diagram of an event loop for an example simulation engine of a Simulated Phenomena Interaction System.
  • FIG. 14 is an example flow diagram of an example detection interaction routine provided by a simulation engine of a Simulated Phenomena Interaction System.
  • FIG. 15 is an example diagram illustrating simulation engine modeling of a mobile device that is able to sense its location by detecting electromagnetic broadcasts.
  • FIG. 16 is an example illustration of an example field of vision on a display of a wearable device.
  • FIG. 17 is an example diagram illustrating simulation engine modeling of a mobile device enhanced with infrared capabilities whose location is sensed by infrared transceivers.
  • FIG. 18 is an example illustration of a display on a mobile device that indicates the location of a simulated phenomenon relative to a user's location as a function of the physical location of the mobile device.
  • FIG. 19 contains a set of diagrams illustrating different ways to determine and indicate the location of a simulated phenomenon relative to a user when a device has a different physical range from its apparent range as determined by the simulation engine.
  • FIG. 20 is an example flow diagram of an example measurement interaction routine provided by a simulation engine of a Simulated Phenomena Interaction System.
  • FIG. 21 is an example flow diagram of an example communicate interaction routine provided by a simulation engine of a Simulated Phenomena Interaction System.
  • FIG. 22 is an example flow diagram of an example manipulation interaction routine provided by a simulation engine of a Simulated Phenomena Interaction System.
  • FIG. 23 is an example block diagram of an authoring system used with the Simulated Phenomena Interaction System.
  • FIG. 24 is an example block diagram of an example Simulated Phenomena Interaction System integrated into components of a commerce-enabled environment.
  • FIG. 25 is an overview flow diagram of example steps to process spectator requests within a simulation engine of a Simulated Phenomena Interaction System.
  • FIG. 26 is an example diagram of an example mobile device display showing two distinct simulated phenomena interaction ranges.
  • FIG. 27 is an example diagram of an example location-based system's reference grid with indication of an initial location determination.
  • FIG. 28 is an example diagram of an example location-based system's reference grid with indication of an initial SP location.
  • FIG. 29 is an example diagram of an example location-based system's reference grid with indication of an initial interaction area.
  • FIG. 30 is an example diagram of an example location-based system's reference grid with an indication of a subsequent interaction area.
  • FIG. 31 is an overview flow diagram of an example Determine Location routine that incorporates GPS transient error suppression.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Embodiments of the present invention provide enhanced computer- and network-based methods and systems for interacting with simulated phenomena using mobile devices. Example embodiments provide a Simulated Phenomena Interaction System ("SPIS"), which enables users to enhance their real world activity with computer-generated and computer-controlled simulated entities, circumstances, or events, whose behavior is at least partially based upon the real world activity taking place. The Simulated Phenomena Interaction System is a computer-based environment that can be used to offer an enhanced gaming, training, or other simulation experience by allowing a user's actions to influence the behavior of a simulated phenomenon, including its simulated responses to interactions. In addition, the user's actions may influence or modify a simulation's narrative, which is used by the SPIS to assist in controlling interactions with the simulated phenomenon, thus providing an enriched, individualized, and dynamic experience to each user.
  • For the purposes of describing a Simulated Phenomena Interaction System, a simulated phenomenon includes any computer-software-controlled entity, circumstance, occurrence, or event that is associated with the user's current physical world, such as persons, objects, places, and events. For example, a simulated phenomenon may be a ghost, playmate, animal, particular person, house, thief, maze, terrorist, bomb, missile, fire, hurricane, tornado, contaminant, or other similar real or imaginary phenomenon, depending upon the context in which the SPIS is deployed. Also, a narrative is a sequence of events (a story, typically with a plot) which unfolds over time. For the purposes herein, a narrative is represented by data (e.g., the current state and behavior of the characters and the story) and logic which dictates the next "event" to occur based upon specified conditions. A narrative may be rich, such as an unfolding scenario with complex modeling capabilities that take into account physical or imaginary characteristics of a mobile device, simulated phenomena, and the like. Or, a narrative may be more simplified, such as merely the unfolding of changes to the location of a particular simulated phenomenon over time.
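  • The narrative-as-data-plus-event-logic arrangement can be pictured as a small state machine. The following is a minimal sketch, assuming a hypothetical ghost-hunt scenario; the states, events, and transitions are invented for illustration and are not taken from the patent:

```python
# Minimal sketch of a narrative as data (current state) plus event logic
# (a transition table). All names here are illustrative.

class Narrative:
    def __init__(self):
        self.state = "searching"          # current point in the story
        # event logic: (state, event) -> next state
        self.transitions = {
            ("searching", "ghost_detected"): "tracking",
            ("tracking", "ghost_captured"): "searching",
            ("searching", "all_ghosts_captured"): "finale",
        }

    def advance(self, event):
        """Apply an event; unknown events leave the narrative unchanged."""
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state

narrative = Narrative()
print(narrative.advance("ghost_detected"))   # -> "tracking"
```

  • A richer narrative would replace the fixed transition table with the event logic and modeling components described below.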
  • FIG. 1 is a block diagram of a Simulated Phenomena Interaction System used to enhance the real world environment. In FIG. 1, operators 101, 102, and 103 interact with the Simulated Phenomena Interaction System ("SPIS") 100 to interact with simulated phenomena of many forms. For example, FIG. 1 shows operators 101, 102, and 103 interacting with three different types of simulated phenomena: a simulated physical entity, such as a metering device 110 that measures how close a simulated phenomenon is to a particular user; an imaginary simulated phenomenon, such as a ghost 111; and a simulation of a real world event, such as a lightning storm 112. Note that, for the purposes of this description, the word "operator" is used synonymously with user, player, participant, etc. Also, one skilled in the art will recognize that a system such as the SPIS can simulate basically any real or imaginary phenomenon, provided that the phenomenon's state and behavior can be specified and managed by the system.
  • In one example embodiment, the Simulated Phenomena Interaction System comprises one or more functional components/modules that work together to support a single- or multi-player computer gaming environment that uses one or more mobile devices to "play" with one or more simulated phenomena according to a narrative. The narrative is potentially dynamic and influenced by players' actions, external personnel, as well as the phenomena being simulated. One skilled in the art will recognize that these components may be implemented in software or hardware or a combination of both. In another example embodiment, the Simulated Phenomena Interaction System comprises one or more functional components/modules that work together to provide a hands-on training environment that simulates real world situations, for example dangerous or hazardous situations, such as contaminant and airborne pathogen detection and containment, in a manner that safely allows operators trial experiences that more accurately reflect real world behaviors. In another example embodiment, the Simulated Phenomena Interaction System comprises one or more functional components/modules that work together to provide a commerce-enabled application that generates funds for profit and non-profit entities. For example, in one embodiment, spectators are defined that can participate in an underlying simulation experience by influencing or otherwise affecting interactions with the Simulated Phenomena Interaction System based upon financial contributions to a charity or to a for-profit entity.
  • For use in all such simulation environments, a Simulated Phenomena Interaction System comprises a mobile device or other mobile computing environment and a simulation engine. The mobile device is typically used by an operator to indicate interaction requests with a simulated phenomenon. The simulation engine responds to such indicated requests by determining whether the indicated interaction request is permissible and performing the interaction request if deemed permissible. The simulation engine comprises additional components, such as a narrative engine and various data repositories, which are further described below and which provide sufficient data and logic to implement the simulation experience. That is, the components of the simulation engine implement the characteristics and behavior of the simulated phenomena as influenced by a simulation narrative.
  • FIG. 2 is a block diagram of an overview of an example Simulated Phenomena Interaction System in operation. In FIG. 2, the Simulated Phenomena Interaction System (SPIS) includes a mobile device 201 shown interacting with a simulation engine 202. Mobile device 201 forwards (sends or otherwise indicates, depending upon the software and hardware configuration) an interaction request 205 to the simulation engine 202 to interact with one or more simulated phenomena 203. The interaction request 205 specifies one or more of the operations of detection, measurement, communication, and manipulation. These four operations are the basic interactions supported by the Simulated Phenomena Interaction System. One skilled in the art will recognize that other interactions may be defined separately or as subcomponents, supersets, or aggregations of these operations, and the choice of operations is not intended to be exclusive. In one embodiment of the system, at least one of the interaction requests 205 to the simulation engine 202 indicates a value that has been sensed by some device or function 204 in the user's real world. Sensing function/device 204 may be part of the mobile device 201, in proximity of the mobile device 201, or completely remote from the location of both the mobile device 201 and/or the simulation engine 202. Once the interaction request 205 is received by simulation engine 202, the simulation engine determines an interaction response 206 to return to the mobile device 201, based upon the simulated phenomena 203, the previously sensed value, and a narrative 207 associated with the simulation engine 202. The characterizations (attribute values) of the simulated phenomena 203, in cooperation with events and data defined by the narrative 207, determine the appropriate interaction response 206. Additionally, the simulation engine 202 may take other factors into account in generating the interaction response 206, such as the state of the mobile device 201, the particular user initiating the interaction request 205, and other factors in the simulated or real world environment. At some point during the processing of the interaction request 205, the simulation provided by simulation engine 202 is affected by the sensed value, which influences the interaction response 206. For example, the characterizations of the simulated phenomena 203 themselves may be modified as a result of the sensed value; an appropriate interaction response may be selected based upon the sensed value; or the narrative logic itself may be modified as a result. Other effects and combinations of effects are possible.
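  • The request/response exchange of FIG. 2 might be sketched as follows. This is an illustrative sketch only, assuming a hypothetical `engine` object exposing `narrative_permits` and `perform`; the patent does not prescribe a message format:

```python
# Sketch of the four basic interaction operations, with a sensed
# real-world value carried along with each request. Illustrative only.
from dataclasses import dataclass
from typing import Optional

OPERATIONS = {"detect", "measure", "communicate", "manipulate"}

@dataclass
class InteractionRequest:
    operation: str                # one of OPERATIONS
    target_sp: Optional[str]      # simulated phenomenon id, if any
    sensed: dict                  # e.g., {"lat": 47.61, "lon": -122.33}

@dataclass
class InteractionResponse:
    permitted: bool
    payload: dict                 # operation-specific result data

def handle(request: InteractionRequest, engine) -> InteractionResponse:
    """Gate the request on the narrative, then perform it."""
    if request.operation not in OPERATIONS:
        return InteractionResponse(False, {"error": "unknown operation"})
    if not engine.narrative_permits(request):   # narrative may refuse
        return InteractionResponse(False, {"status": "not permitted"})
    return InteractionResponse(True, engine.perform(request))
```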
  • FIGS. 3, 4, and 5 are example mobile device displays associated with interaction requests and responses in a gaming environment. These figures correspond to an example embodiment of a gaming system, called "Spook," that incorporates techniques of the methods and systems of the Simulated Phenomena Interaction System to enhance the gaming experience. In summary, Spook defines a narrative in which ghosts are scattered about a real world environment, for example a park, in which the user is traveling with the mobile device. The game player, holding the mobile device while traveling, interacts with the game by initiating interaction requests and receiving feedback from the simulation engine that runs the game. In one example, the player's goal is to find a particular ghost so that the ghost can be helped. In that process, the player must find all the other ghosts and capture them in order to enhance the detection capabilities of the detection device so that it can detect the particular ghost. As the player travels around the park, the ghosts are detected (and can be captured) depending upon the actual physical location of the player in the park. The player can also team up with other players (using mobile devices) to play the game.
  • FIG. 3 is an example mobile device display of the results of an interaction request to a simulation engine used in a game, which involves both detection and measurement of simulated phenomena. Mobile device 300 includes a detection and measurement display area 304 and a feedback and input area 302. In FIG. 3, mobile device 300 shows the results of interacting with a series of ghosts (the simulated phenomena) as shown in detection and measurement display area 304. The interaction request being processed corresponds to both detection and measurement operations (e.g., "show me where all the ghosts are"). In response to this request, the simulation engine sends back information regarding the detected simulated phenomena ("SPs") and where they are relative to the physical location of the mobile device 300. Accordingly, the display area 304 shows a "spectra-meter" 301 (a spectral detector), which indicates the location of each simulated phenomenon ("SP") that was detectable and detected by the device 300. In this example, the line of the spectra-meter 301 indicates the direction of travel of the user of the mobile device 300, and the SPs' locations are relative to the device location. An observation "key" to the detected SPs is shown in key area 303. The display area 304 also indicates that the current range of the spectra-meter 301 is set to exhibit a 300-foot range of detection power. (One skilled in the art will recognize that this range may be set by the simulation engine to be different from or relative to the actual physical detection range of the device, depending upon the narrative logic and use of the SPIS.) Using the current range, the spectra-meter 301 has detected four different ghosts, displayed in iconic form by the spectra-meter 301. As a result of the detection and measurement request, the simulation engine has also returned feedback (in the form of a hint) to the user, which is displayed in feedback and input area 302. This hint indicates a current preference of one of the ghosts, called "Lucky Ghost." The user can then use this information to learn more about Lucky Ghost in a future interaction request (see FIG. 4). One skilled in the art will recognize that the behaviors and indications shown by mobile device 300 are merely examples, and that any behavior and manner of indicating the location of an SP is possible as long as it can be implemented by the SPIS. For example, the pitch of an audio tone, other visual images, or tactile feedback (e.g., device vibration) may be used to indicate the presence and proximity of a ghost. In addition, other attributes that characterize the type of phenomenon being detected, such as whether the SP is friendly or not, may also be shown.
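  • As a rough illustration of the detection-and-measurement response pictured in FIG. 3, the following sketch filters SPs by a detection range (300 feet in the example) and reports each detected SP's distance and bearing relative to the direction of travel. The flat x/y coordinate model in feet is an assumption made for simplicity; the patent does not prescribe a coordinate model:

```python
import math

def detect_sps(device_xy, heading_deg, sps, range_ft=300.0):
    """Return in-range SPs with distance and bearing vs. travel direction."""
    detected = []
    for sp_id, (x, y) in sps.items():
        dx, dy = x - device_xy[0], y - device_xy[1]
        dist = math.hypot(dx, dy)
        if dist <= range_ft:
            absolute = math.degrees(math.atan2(dx, dy))    # 0 = +y axis
            relative = (absolute - heading_deg) % 360      # vs. heading
            detected.append({"sp": sp_id, "distance_ft": round(dist, 1),
                             "bearing_deg": round(relative, 1)})
    return detected

ghosts = {"lucky_ghost": (50.0, 120.0), "far_ghost": (900.0, 0.0)}
print(detect_sps((0.0, 0.0), heading_deg=0.0, sps=ghosts))
# only lucky_ghost is within the 300-foot range
```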
  • FIG. 4 is an example mobile device display of the results of an interaction request to a simulation engine used in a game, which involves communication with a simulated phenomenon. Mobile device 400 includes a question area 401, an answer area 402, and a special area 403, which is used to indicate a reliability measurement of the information just received from the ghosts. Mobile device 400 also includes an indication of the current SP being communicated with in the header area 404 (here, the "Lucky Ghost"). In the specific example shown, the operator selects between the three questions displayed in question area 401, using whatever navigational input is available on the mobile device 400 (such as arrow keys in combination with the buttons in input area 405). One skilled in the art will recognize that, using other types of mobile devices, alternate means of input, and thus alternative ways of indicating communications, are possible and desirable. For example, using a device with a keyboard, the user might type in (non-preformed) questions that utilize a system of keyword matching. A response, which is not shown, would be displayed by mobile device 400 in the answer area 402 when it is received from the simulation engine. Also, the truth detector shown in special area 403 would register a value (not shown) indicating the reliability of the SP response.
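  • For keyboard-equipped devices, the keyword-matching approach mentioned above might look like the following sketch; the keyword table and canned answers are invented for illustration:

```python
# Map typed (non-preformed) questions to canned SP answers by keywords.
ANSWERS = {
    frozenset({"where", "treasure"}): "Seek the old oak by the pond.",
    frozenset({"who", "you"}): "I am the Lucky Ghost.",
}

def answer(question: str) -> str:
    words = set(question.lower().replace("?", "").split())
    for keywords, reply in ANSWERS.items():
        if keywords <= words:          # all keywords appear in the question
            return reply
    return "The spirits are silent."   # fallback when nothing matches

print(answer("Who are you?"))          # -> "I am the Lucky Ghost."
```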
  • FIG. 5 is an example mobile device display of the results of an interaction request to a simulation engine used in a game, which involves manipulation of a simulated phenomenon. Mobile device 500 includes a feedback and input area 503. In FIG. 5, mobile device 500 illustrates the result of performing a "vacuuming operation" on a previously located ghost. Vacuuming is a manipulation operation provided by the Spook game to give a user a means of capturing a ghost. The spectra-meter 502 shows the presence of a ghost (SP) currently to the left of the direction the user is traveling. Depending upon the rules of the narrative logic of the game, the ghost may be close enough to capture. When the user initiates a vacuuming operation with the simulation engine, the vacuuming status bar area 501 is changed to show the progress of vacuuming up the ghost. If the ghost is not within manipulation range, this feedback (not shown) is displayed in the feedback and input area 503.
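  • A hedged sketch of the vacuuming interaction: capture progresses only while the ghost remains within manipulation range, and the status bar advances with each tick. The range, rate, and function shape below are assumptions, not the game's actual rules:

```python
def vacuum_tick(distance_ft, progress, capture_range_ft=30.0, rate=0.1):
    """Advance capture progress by one tick; returns (progress, feedback)."""
    if distance_ft > capture_range_ft:
        return progress, "Ghost out of range -- move closer."
    progress = min(1.0, progress + rate)
    if progress >= 1.0:
        return progress, "Ghost captured!"
    return progress, f"Vacuuming... {progress:.0%}"

p = 0.0
p, msg = vacuum_tick(12.0, p)   # -> (0.1, "Vacuuming... 10%")
```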
  • In a hands-on training environment that simulates real world situations, such as a contaminant detection simulation system, the interaction requests and interaction responses processed by the mobile device are appropriately modified to reflect the needs of the simulation. For example, techniques of the Simulated Phenomena Interaction System may be used to provide training scenarios that address critical needs related to national security, world health, and the challenges of modern peacekeeping efforts. In one example embodiment, the SPIS is used to create a Biohazard Detection Training Simulator (BDTS) that can be used to train emergency medical and security personnel in the use of portable biohazard detection and identification units in a safe, convenient, affordable, and realistic environment.
  • This embodiment simulates the use of contagion detector devices that have been developed using new technologies to detect pathogens and contagions in a physical area. Example devices include BIOHAZ, FACSCount, LUMINEX 100, ANALYTE 2000, BioDetector (BD), ORIGEN Analyzer, and others, as described by the Bio-Detector Assessment Report prepared by the U.S. Army Edgewood Chemical, Biological Center (ERT Technical Bulletin 2001-4), which is incorporated herein by reference in its entirety. Because it is prohibitively expensive to install such devices in advance everywhere they may be needed in the future, removing them from commission to train emergency personnel is not practical. Thus, BDTSs can be substituted for training purposes. These BDTSs need to simulate the pathogen and contagion detection technology as well as the calibration of a real contagion detector device and any substances needed to calibrate or operate the device. In addition, the narrative needs to be constructed to simulate field conditions and provide guidance to increase awareness of proper personnel protocol when hazardous conditions exist.
  • In addition to gaming and hazardous substance training simulators, one skilled in the art will recognize that the techniques of the Simulated Phenomena Interaction System may be useful to create a variety of other simulation environments, including response training environments for other naturally occurring phenomena, for example, earthquakes, floods, hurricanes, tornados, bombs, and the like. Also, these techniques may be used to enhance real world experiences with more "game-like" features. For example, an SPIS may be used to provide computerized (and narrative-based) routing in an amusement park or other facility with rides, so that a user's experience is optimized to frequent the rides with the shortest waiting times. In this scenario, the SPIS acts as a "guide" by placing SPs in locations (relative to the user's physical location in the park) that are strategically located relative to the desired physical destination. The narrative, as evidenced by the SPs' behavior and responses, encourages the user to go after the strategically placed SPs. The user is thus "led" by the SPIS to the desired physical destination and encouraged to engage in desired behavior (such as paying for the ride) by being "rewarded" by the SPIS according to the narrative (such as becoming eligible for some real world prize once the state of the mobile device is shown to a park operator). Many other gaming, training, and computer-aided learning experiences can be similarly presented and supported using the techniques of a Simulated Phenomena Interaction System.
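  • The "guide" behavior described above might be realized by repeatedly placing the next SP a short step from the user in the direction of the desired destination, as in this illustrative sketch (the step size and flat coordinates are assumptions):

```python
import math

def place_guide_sp(user_xy, destination_xy, step_ft=150.0):
    """Place the next SP one 'step' from the user toward the destination."""
    dx = destination_xy[0] - user_xy[0]
    dy = destination_xy[1] - user_xy[1]
    dist = math.hypot(dx, dy)
    if dist <= step_ft:                    # destination within one step
        return destination_xy
    scale = step_ft / dist
    return (user_xy[0] + dx * scale, user_xy[1] + dy * scale)

print(place_guide_sp((0, 0), (1000, 0)))   # -> (150.0, 0.0)
```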
  • Any such SPIS game (or other SPIS simulation scenario) can be augmented by placing the game in a commerce-enabled environment that integrates with the SPIS game through defined SPIS interfaces and data structures. For example, with the inclusion of additional modules and the use of a financial transaction system (such as those systems known in the art that are available to authorize and authenticate financial transactions over the Internet), spectators of various levels can affect, for a price, the interactions of a game in progress. The price paid may go to a designated charitable organization or may provide direct payment to the game provider or some other profit-seeking entity, depending upon how the commerce-enabled environment is deployed. An additional type of SPIS participant (not the operator of the mobile device), called a "spectator," is defined. A spectator, depending upon the particular simulation scenario, authentication, etc., may have different access rights that designate what data is viewable by the spectator and what parts of the SPIS scenario or underlying environment may be affected, and how. A spectator's ability to affect the simulation scenario or assist a mobile device operator is typically in proportion to the price paid. In addition, a spectator may be able to provide assistance to an individual participant or a team. For example, a narrative "hint" may be provided to the designated operator of a mobile device (the "game participant") in exchange for the receipt of funds from the spectator. Further, the price of such assistance may vary according to the current standing of the game participant relative to the competition or some level to be attained. Thus, the spectator is given access to such information to facilitate a contribution decision.
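  • For instance, the price of a narrative hint might scale with the participant's standing relative to the leader, as in this hypothetical pricing sketch (the formula is an assumption, not taken from the patent):

```python
def hint_price(participant_score, leader_score, base_price=5.00):
    """Cheaper hints for participants who trail the leader by more."""
    if leader_score <= 0:
        return base_price
    deficit = max(0.0, leader_score - participant_score) / leader_score
    return round(base_price * (1.0 - 0.5 * deficit), 2)  # up to 50% off

print(hint_price(participant_score=40, leader_score=100))  # -> 3.5
```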
  • Different “levels” of spectators may be defined, for example, by specifying a plurality of “classes” (as in the object-oriented term, or equivalents thereto) of spectators that own or inherit a set of rights. These rights dictate what types of data are viewable from, for example, the SPIS data repositories. The simulation engine is then responsible to abide by the specified access right definitions once a spectator is recognized as belonging to a particular spectator class. One skilled in the art will recognize that other simulation participants, such as a game administrator, an operator (game participant), or a member of a team can also be categorized as belonging to a participant level that defines the participants access rights.
  • In one example embodiment of a commerce-enabled environment, five classes of spectators (roles) are defined as having the following access rights:
    • (1) Participant (operator(s) of a mobile device): Participants have access to all data relevant to their standing in the game (including their status within the narrative context). They also have access to their competitors' status as if they were anonymous spectators. They may keep data that they explicitly generate, such as notes, private from anyone else.
    • (2) Team Member: A Team Member has a cooperative relationship with the Participant and thus has access to all Participant data except private notes. A Team Member may also have access to all streaming data, such as audio and/or video generated by any simulation scenario participants.
    • (3) Anonymous Spectator: An Anonymous Spectator has limited access to the game data of all Participants, and can view general standings of all Participants, including handicap values, some narrative details (e.g., puzzles), and streaming data.
    • (4) Authenticated Spectator: An Authenticated Spectator has access to all data an Anonymous Spectator can access, plus enhanced views of narrative and Participant status. For example, an Authenticated Spectator may be able to view the precise location of any SP or Participant.
    • (5) Administrator: Administrators have access to all of the data viewable by the other levels, plus additional data sets such as enhanced handicap values of participants and the state of the scenario or various puzzles and solutions. They may have the ability to modify the state of the narrative as the simulation occurs. Typically, the only aspects of the simulation they cannot view or modify are those associated with secure commerce aspects or the private notes of Participants.
    • One skilled in the art will recognize that many other spectator definitions with different or similar access rights may be defined.
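  • One plausible rendering of the class-based rights scheme, using three of the spectator levels above, follows; the specific right names are invented for illustration, and the Participant and Team Member levels would follow the same pattern:

```python
# Each role owns or inherits a set of viewable data categories.
class AnonymousSpectator:
    rights = {"standings", "handicaps", "public_puzzles", "streams"}

class AuthenticatedSpectator(AnonymousSpectator):
    rights = AnonymousSpectator.rights | {"sp_locations", "participant_status"}

class Administrator(AuthenticatedSpectator):
    rights = AuthenticatedSpectator.rights | {"scenario_state", "solutions"}

def can_view(role, data_category: str) -> bool:
    """The simulation engine gates repository reads on the role's rights."""
    return data_category in role.rights

print(can_view(AuthenticatedSpectator, "sp_locations"))   # True
print(can_view(AnonymousSpectator, "sp_locations"))       # False
```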
  • With the use of a commerce-enabled environment, spectators can indirectly participate in the simulation in a manner that enhances the simulation environment, while providing a source of income to the non-profit or profit-based recipient of the funds. In another example, spectators place (and pay for) wagers on simulation participants (e.g., game players) or other aspects of the underlying simulation scenario, and the proceeds are distributed accordingly.
  • For use in all such simulation environments, a Simulated Phenomena Interaction System comprises a mobile device or other mobile computing environment and a simulation engine. FIG. 6 is an example block diagram of components of an example Simulated Phenomena Interaction System. In FIG. 6, a Simulated Phenomena Interaction System comprises one or more mobile devices or computing environments 601-604 and a simulation engine 610. For example, FIG. 6 shows four different types of mobile devices: a global positioning system (GPS) device 601, a portable computing environment 602, a personal digital assistant (PDA) 603, and a mobile telephone (e.g., a cell phone) 604. The mobile device is typically used by an operator, as described above, to indicate interaction requests with a simulated phenomenon. Simulation engine 610 responds to such indicated requests by determining whether the indicated interaction request is permissible and performing the interaction request if deemed permissible.
  • The simulation engine may further comprise a narrative with data and event logic, a simulated phenomena characterizations data repository, and a narrative engine (e.g., to implement a state machine for the simulation). The narrative engine uses the narrative and the simulated phenomena characterizations data repository to determine whether an indicated interaction is permissible, and, if so, to perform that interaction with a simulated phenomenon. In addition, the simulation engine may comprise other data repositories or store other data that characterizes the state of the mobile device, information about the operator, the state of the narrative, etc.
  • Accordingly, the simulation engine 610 may comprise a number of other components for processing interaction requests and for implementing the characterizations and behavior of simulated phenomena. For example, simulation engine 610 may comprise a narrative engine 612, an input/output interface 611 for interacting with the mobile devices 601-604 and for presenting a standardized interface to control the narrative engine and/or data repositories, and one or more data repositories 620-624. In what might be considered a more minimally configured simulation engine 610, the narrative engine 612 interacts with a simulated phenomena attributes data repository 620 and a narrative data and logic data repository 621. The simulated phenomena attributes data repository 620 typically stores information that is used to characterize and implement the "behavior" of simulated phenomena (responses to interaction requests). For example, attributes may include values for location, orientation, velocity, direction, acceleration, path, size, duration schedule, type, elasticity, mood, temperament, image, ancestry, or any other seemingly real world or imaginary characteristic of simulated phenomena. The narrative data and logic data repository 621 stores narrative information and event logic which is used to determine a next logical response to an interaction request. The narrative engine 612 uses the narrative data and logic data repository 621 and the simulated phenomena attributes data repository 620 to determine whether an indicated interaction is permissible, and, if so, to perform that interaction with the simulated phenomena. The narrative engine 612 then communicates a response or the result of the interaction to a mobile device, such as devices 601-604, through the I/O interface 611. I/O interface 611 may contain, for example, support tools and protocols for interacting with a wireless device over a wireless network.
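  • As a concrete illustration, a record in the simulated phenomena attributes data repository 620 and a narrative-engine permissibility gate might look like the following sketch; the field values and the gating rule are assumptions, not the patent's data model:

```python
# One SP record using attribute names echoed from the list above.
lucky_ghost = {
    "id": "lucky_ghost",
    "type": "ghost",
    "location": (50.0, 120.0),
    "velocity": 2.0,               # feet per second along its path
    "mood": "mischievous",
    "elasticity": 0.3,
    "schedule": [("dusk", "appear"), ("dawn", "vanish")],
}

def permissible(narrative_state, sp, operation):
    """E.g., ghosts may only be communicated with while being tracked."""
    if operation == "communicate":
        return narrative_state == "tracking" and sp["type"] == "ghost"
    return True

print(permissible("tracking", lucky_ghost, "communicate"))   # True
```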
  • In a less minimal configuration, the simulation engine 610 may also include one or more other data repositories 622-624 for use with different configurations of the narrative engine 612. These repositories may include, for example, a user characteristics data repository 622, which stores characterizations of each user who is interacting with the system; an environment characteristics data repository 624, which stores values sensed by sensors within the real world environment; and a device attributes data repository 623, which may be used to track the state of each mobile device being used to interact with the SPs.
  • One skilled in the art will recognize that many different ways are available to determine or calculate values for the attributes stored in these repositories, including, for example, a pre-defined constant value; evaluation of a mathematical formula, possibly based upon the values of other attributes; human input; real-world data sampling; etc. In addition, the same or different determination techniques may be used for each of the different types of data repositories (e.g., simulated phenomena, device, user, environment, etc.), varied on a per-attribute basis, per device, per SP, etc. Many other arrangements are possible and contemplated.
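  • These determination techniques can be pictured as per-attribute bindings, as in this minimal sketch (the binding names and sample attributes are invented for illustration):

```python
import random

# Each attribute is bound to a strategy: constant, formula over other
# attributes, or a live sample from a sensor (stand-in shown here).
def constant(value):
    return lambda attrs: value

def formula(fn):
    return lambda attrs: fn(attrs)

def sampled(sensor):
    return lambda attrs: sensor()

bindings = {
    "size": constant(3.0),
    "speed": formula(lambda a: a["size"] * 0.5),        # derived attribute
    "ambient_light": sampled(lambda: random.random()),  # stand-in sensor
}

attrs = {"size": 3.0}
attrs["speed"] = bindings["speed"](attrs)               # -> 1.5
attrs["ambient_light"] = bindings["ambient_light"](attrs)
```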
  • One skilled in the art will recognize that many configurations are possible with respect to the narrative engine 612 and the various data repositories 620-624. These configurations may vary with respect to how much logic and data is contained in the narrative engine 612 itself versus stored in each data repository and whether the event logic (e.g., in the form of a narrative state machine) is stored in data repositories, as for example stored procedures, or is stored in other (not shown) code modules or as mathematical function definitions. In the embodiment exemplified in FIG. 6, it is assumed that the logic for representing and processing the simulated phenomena and the narratives are contained in the respective data repositories 620 and 621 themselves. In an alternate embodiment, there may be additional modules in the simulation engine that model the various subcomponents of the SPIS.
  • FIG. 7 is an example block diagram of an alternative embodiment of components of an example simulation engine. In this embodiment, separate modules implement the logic needed to model each component of a simulation engine, such as the simulated phenomena, the environment, and the narrative. As in the embodiment described in FIG. 6, the simulation engine 701 comprises a narrative engine 702, input/output interfaces 703, and one or more data repositories 708-712. Also, similarly, the narrative engine 702 receives and responds to interaction requests through the input/output interfaces 703. I/O interfaces 703 may contain, for example, support tools and protocols for interacting with a wireless device over a wireless network. In addition, however, simulation engine 701 contains separate models for interacting with the various data repositories 708-712. For example, simulation engine 701 comprises a phenomenon model 704, a narrative logic model 706, and an environment model 705. The data repositories 708-712 are shown connected to a data repository "bus" 707, although this bus may be merely an abstraction. Bus 707 is meant to signify that any of the models 704-706 may be communicating with one or more of the data repositories 708-712 resident on the bus 707 at any time. In this embodiment, as in the embodiment shown in FIG. 6, some of the data repositories 708-712 are shown as optional (dotted lines), such as a user characteristics data repository 711 and a device attributes data repository 712. However, because FIG. 7 shows an example that uses an environment model 705, it also shows a corresponding environment data repository 709, which stores the state (real or otherwise) of various attributes being tracked in the environment.
  • Models 704-706 are used to implement the logic (that affects event flow and attribute values) that governs the various entities being manipulated by the system, instead of placing all of the logic into the narrative engine 702, for example. Distributing the logic into separate models allows for more complex modeling of the various entities manipulated by the simulation engine 701, such as, for example, the simulated phenomena, the narrative, and representations of the environment, users, and devices. For example, a module or subcomponent that models the simulated phenomena, the phenomenon model 704, is shown separately connected to the plurality of data repositories 708-712. This allows separate modeling of the same type of SP, depending, for example, on the mobile device, the user, the experience of the user, sensed real world environment values for a specific device, etc. Having a separate phenomenon model 704 also makes it easy to test the environment and to implement, for example, new scenarios by simply replacing the relevant modeling components. It also allows complex modeling behaviors to be implemented more easily, such as SP attributes whose values require a significant amount of computing resources to calculate; new behaviors to be dynamically added to the system (perhaps, even, on a random basis); multi-user interaction behavior (similar to a transaction processing system that coordinates between multiple users interacting with the same SP); algorithms, such as artificial intelligence based algorithms, which are better executed on a distributed server machine; or other complex requirements.
  • Also, for example, the environment model 705 is shown separately connected to the plurality of data repositories 708-712. Environment model 705 may comprise state and logic that dictates how attribute values that are sensed from the environment influence the simulation engine responses. For example, the type of device requesting the interaction, the user associated with the current interaction request, or some such state may potentially influence how a sensed environment value affects an interaction response or an attribute value of an SP.
  • Similarly, the narrative logic model 706 is shown separately connected to the plurality of data repositories 708-712. The narrative logic model 706 may comprise narrative logic that determines the next event in the narrative but may vary the response from user to user, device to device, etc., as well as based upon the particular simulated phenomenon being interacted with.
  • The content of the data repositories and the logic necessary to model the various aspects of the system essentially define each possible narrative, and hence it is beneficial to have an easy method for tailoring the SPIS to a specific scenario. In one embodiment, the various data repositories and/or the models are populated using an authoring system.
  • FIG. 23 is an example block diagram of an authoring system used with the Simulated Phenomena Interaction System. In FIG. 23, a narrative author 2301 invokes a narrative authoring toolkit ("kit") 2302 to generate data repository content 2303 for each of the data repositories 2304 to be populated. The narrative authoring kit 2302 provides the tools and procedures necessary to generate the content needed for each data repository. The generated content 2303 is then stored in the appropriate SPIS data repositories 2304. (For example, SP content is stored in the appropriate Simulated Phenomena Attributes data repository, such as repository 620 in FIG. 6.) In some circumstances, it is desirable to localize the SPIS data repository content by customizing a generic narrative scenario to a particular location, for example, by adding environment-specific data values to the narrative data. In such circumstances, the data repository content is optionally forwarded to a narrative localization kit 2305 prior to being stored in the appropriate Simulated Phenomena Attributes data repositories 2304. A localization person 2306 uses the localization kit 2305 to facilitate collecting, determining, organizing, and integrating environment-specific data into the SPIS data repositories 2304.
  • When a Simulated Phenomena Interaction System is integrated into a commerce-enabled scenario, additional components are present to handle commerce transactions and interfacing to the various other “participants” of the simulation scenario, for example, spectators, game administrators, contagion experts, etc. FIG. 24 is an example block diagram of an example Simulated Phenomena Interaction System integrated into components of a commerce-enabled environment. The commerce-enabled environment shown in FIG. 24 depicts the use of a SPIS scenario with a charity based commerce system. One skilled in the art will recognize that other commerce-enabled uses are also contemplated and integrated with the SPIS in a similar fashion. For example, a commerce-enabled environment that supports wagers placed on mobile device gaming participants or simulated phenomena of an underlying game is also supported by the modules depicted in FIG. 24.
  • In FIG. 24, commerce system 2400 comprises SPIS support modules 2404-2406, commerce transaction support 2431, a commerce data repository 2430, and simulation engine 2410. Users (commerce participants) 2401-2403, through the SPIS support modules 2404-2406, interact with the SPIS as described relative to FIGS. 6 and 7 through the input/output interface 2411, which also contains a standardized interface (an application programming interface, known as an "API") for interfacing to the SPIS simulation engine 2410. For example, mobile operator (participant) 2401 uses the operator participant support module 2404 to interact with the simulation engine 2410. Similarly, administrator 2402 uses the administrator support module 2405 to manage various aspects of the underlying simulation scenario, such as defining the various charitable donations required for different types of operator assistance. Also similarly, spectator 2403 uses the spectator support module 2406 to view simulation environment and competitors' parameters and to engage in a financial transaction (such as a charity donation) via commerce support module 2431.
  • For example, after viewing the progress of the underlying simulation scenario via spectator support module 2406, the spectator 2403 may choose to support a team that the spectator 2403 hopes will win. (In a commerce-enabled wagering environment, the spectator 2403 may choose to place "bets" on a team, a device operator, or, for example, a simulated phenomenon that the spectator 2403 believes will win.) Accordingly, spectator 2403 "orders" an assist via spectator support module 2406 by paying for it via commerce support module 2431. Once a financial transaction has been authenticated and verified (using well-known transaction processing systems such as credit card servers on the Internet), appropriate identifying data is placed by the commerce support module 2431 into the commerce data repository 2430, where it can be retrieved by the various SPIS support modules 2404-2406. The spectator support module then informs the simulation engine 2410 of the donation and instructs the simulation engine 2410 to provide assistance (for example, through a hint to the designated mobile device operator) or other activity.
  • In some scenarios, a spectator 2403 may be permitted to modify certain simulation data stored in the data repositories 2420-2422. Such capabilities are determined by the capabilities offered through the API 2411, the narrative, and the manner in which the data is stored.
  • In one arrangement, the SPIS support modules 2404-2406 interface with the SPIS data repositories 2420-2422 via the narrative engine 2412. One skilled in the art will recognize that rather than interface through the narrative engine 2412, other embodiments are possible that interface directly through data repositories 2420-2422. Example SPIS data repositories that can be viewed and potentially manipulated by the different participants 2401-2403 include the simulated phenomena attributes data repository 2420, the narrative data & logic data repository 2421, and the user (operator) characteristics data repository 2422. Other SPIS data repositories, although not shown, may be similarly integrated.
  • In some scenarios, a spectator is permitted to place wagers on particular device operators, teams, or simulated phenomena. Further, in response to such wagers, the narrative may influence aspects of the underlying simulation scenario. In such cases the commerce support 2431 includes well-known wager-related support services as well as general commerce transaction support. One skilled in the art will recognize that the possibilities abound and that the modules depicted in FIG. 24 can support a variety of commerce-enabled environments.
  • Regardless of the internal configurations of the simulation engine, the components of the Simulated Phenomena Interaction System process interaction requests in a similar overall functional manner.
  • FIGS. 8 and 9 provide overviews of the interaction processing of a simulation engine and a mobile device in a Simulated Phenomena Interaction System. FIG. 8 is an overview flow diagram of example steps to process interaction requests within a simulation engine of a Simulated Phenomena Interaction System. In step 801, the simulation engine receives an interaction request from a mobile device. In step 802, the simulation engine characterizes the device from which the request was received, and, in step 803, characterizes the simulated phenomenon that is the target/destination of the interaction request. Using such characterizations, the simulation engine is able to determine, for example, whether or not a particular simulated phenomenon may be interacted with by the particular device. In step 804, the simulation engine determines, based upon the device characterization, the simulated phenomenon characterization, and the narrative logic, the next event in the narrative sequence; that is, the next interaction response or update to the "state" or attributes of some entity in the SPIS. In step 805, if the simulation engine determines that the event is allowed (based upon the characterizations determined in steps 802-804), then the engine continues in step 806 to perform that event (interaction response); otherwise, it continues back to the beginning of the loop in step 801 to wait for the next interaction request.
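  • The FIG. 8 loop might be rendered as in the following Python sketch; every helper name here (receive_interaction_request, characterize_device, and so on) is a hypothetical stand-in for the steps described above, not an interface defined by this description.

    def simulation_engine_loop(engine):
        while True:
            request = engine.receive_interaction_request()           # step 801
            device = engine.characterize_device(request.device_id)   # step 802
            sp = engine.characterize_sp(request.sp_id)                # step 803
            # Step 804: choose the next narrative event from the two
            # characterizations plus the narrative logic.
            event = engine.next_event(device, sp, request)
            if engine.event_allowed(event, device, sp):               # step 805
                engine.perform_event(event)                           # step 806
            # Otherwise fall through and wait for the next request.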
  • FIG. 9 is an overview flow diagram of example steps to process interactions within a mobile device used with a Simulated Phenomena Interaction System. In step 901, optionally within some period of time, and perhaps not with each request or not at all, the device senses values based upon the real world environment in which the mobile device is operating. As described earlier, this sensing of the real world may occur through a remote sensor that is completely distinct from the mobile device or attached to the mobile device, or may occur as an integral part of the mobile device. For example, a remote sensor may be present in an object in the real world that has no physical connection to the mobile device at all. One skilled in the art will recognize that many types of values may be sensed by such mobile devices and incorporated within embodiments of the SPIS, including, for example, values associated with ambient light, temperature, heart rate, proximity of objects, barometric pressure, magnetic fields, traffic density, etc. In step 902, the device receives operator input, and in step 903 determines the type of interaction desired by the operator. In step 904, the device sends a corresponding interaction request to the simulation engine and then awaits a response from the simulation engine. One skilled in the art will recognize that, depending upon the architecture used to implement the SPIS, the sending of an interaction request may be within the same device or may be to a remote system. In step 905, a simulation engine response is received, and in step 906, any feedback indicated by the received response is presented to the operator. The mobile device processing then returns to the beginning of the loop in step 901.
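  • The corresponding device-side loop of FIG. 9 might look as follows; again, the helper names are assumptions made only for illustration, and the "engine" may be local (fat client) or remote (thin client).

    def mobile_device_loop(device, engine):
        while True:
            # Step 901 (optional): sample whatever sensors exist, which
            # may be integral, attached, or entirely remote.
            readings = device.sense_environment()
            user_input = device.read_operator_input()                # step 902
            interaction = device.classify_interaction(user_input)    # step 903
            response = engine.send_interaction_request(              # step 904
                interaction, sensed_values=readings)
            feedback = response.feedback                             # step 905
            if feedback:
                device.present_feedback(feedback)                    # step 906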
  • When the simulation engine is used in a commerce-enabled environment, such as that shown in FIG. 24, the simulation engine also needs to process requests from, and respond to, simulation participants other than the operators of mobile devices, such as administrators and spectators. FIG. 25 is an overview flow diagram of example steps to process spectator requests within a simulation engine of a Simulated Phenomena Interaction System. In step 2501, the simulation engine presents options to the designated spectator. In one scenario, the prices may vary according to the kind of assistance, manipulation, or wager requested and the success status of a designated operator participant. For example, if the designated operator participant is on a winning team, the price for spectator participation may be increased. In step 2502, the simulation engine receives a request (from a designated spectator) to assist the designated recipient. In step 2503, the simulation engine invokes a standard financial transaction system to process the financial aspects of the request. In step 2504, if the transaction is properly authorized, then the engine continues in step 2507, otherwise continues in step 2505. In step 2505, the engine indicates a failed request to the user, logs the failed financial transaction in step 2506, and returns. In step 2507, the simulation engine provides the indicated assistance (or other indicated participation) to the designated operator or team, logs the successful transaction in step 2508, and returns.
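  • A compact sketch of the FIG. 25 flow follows; the financial-system interface shown is an assumed placeholder for whatever well-known transaction processing system is used.

    def process_spectator_request(engine, spectator):
        engine.present_options(spectator)                        # step 2501
        request = engine.receive_assist_request(spectator)       # step 2502
        txn = engine.financial_system.process(request.payment)   # step 2503
        if not txn.authorized:                                   # step 2504
            engine.indicate_failure(spectator)                   # step 2505
            engine.log_transaction(txn, success=False)           # step 2506
            return
        engine.provide_assistance(request.recipient)             # step 2507
        engine.log_transaction(txn, success=True)                # step 2508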
  • Although the techniques of the Simulated Phenomena Interaction System are generally applicable to any type of entity, circumstance, or event that can be modeled to incorporate a real world attribute value, the phrase "simulated phenomenon" is used generally to imply any type of imaginary or real-world place, person, entity, circumstance, event, or occurrence. In addition, one skilled in the art will recognize that the phrase "real-world" means in the physical environment or something observable as existing, whether directly or indirectly. Also, although the examples described herein often refer to an operator or user, one skilled in the art will recognize that the techniques of the present invention can also be used by any entity capable of interacting with a mobile environment, including a computer system or other automated or robotic device. In addition, the concepts and techniques described are applicable to other mobile devices and to means of communication other than wireless communications, including other types of phones, personal digital assistants, portable computers, infrared devices, etc., whether they exist today or have yet to be developed. Essentially, the concepts and techniques described are applicable to any mobile environment. Also, although certain terms are used primarily herein, one skilled in the art will recognize that other terms could be used interchangeably to yield equivalent embodiments and examples. In addition, terms may have alternate spellings which may or may not be explicitly mentioned, and one skilled in the art will recognize that all such variations of terms are intended to be included.
  • Example embodiments described herein provide applications, tools, data structures and other support to implement a Simulated Phenomena Interaction System to be used for games, interactive guides, hands-on training environments, and commerce-enabled simulation scenarios. One skilled in the art will recognize that other embodiments of the methods and systems of the present invention may be used for other purposes, including, for example, traveling guides, emergency protocol evaluation, and for more fanciful purposes including, for example, a matchmaker (SP makes introductions between people in a public place), traveling companions (e.g., a bus “buddy” that presents SPs to interact with to make an otherwise boring ride potentially more engaging), a driving pace coach (SP recommends what speed to attempt to maintain to optimize travel in current traffic flows), a wardrobe advisor (personal dog robot has SP “personality,” which accesses current and predicted weather conditions and suggests attire), etc. In the following description, numerous specific details are set forth, such as data formats and code sequences, etc., in order to provide a thorough understanding of the techniques of the methods and systems of the present invention. One skilled in the art will recognize, however, that the present invention also can be practiced without some of the specific details described herein, or with other specific details, such as changes with respect to the ordering of the code flow.
  • A variety of hardware and software configurations may be used to implement a Simulated Phenomena Interaction System. A typical configuration, as illustrated with respect to FIGS. 2 and 6, involves a client-server architecture of some nature. One skilled in the art will recognize that many such configurations exist ranging from a very thin client (mobile) architecture that communicates with all other parts of the SPIS remotely to a fat client (mobile) architecture that incorporates all portions of the SPIS on the client device. Many configurations in between these extremes are also plausible and expected.
  • FIG. 10 is an example block diagram of a general purpose computer system for practicing embodiments of a simulation engine of a Simulated Phenomena Interaction System. The general purpose computer system 1000 may comprise one or more server (and/or client) computing systems and may span distributed locations. In addition, each block shown may represent one or more such blocks as appropriate to a specific embodiment or may be combined with other blocks. Moreover, the various blocks of the simulation engine 1010 may physically reside on one or more machines, which use standard interprocess communication mechanisms across wired or wireless networks to communicate with each other.
  • In the embodiment shown, computer system 1000 comprises a computer memory ("memory") 1001, an optional display 1002, a Central Processing Unit ("CPU") 1003, and Input/Output devices 1004. The simulation engine 1010 of the Simulated Phenomena Interaction System ("SPIS") is shown residing in the memory 1001. The components of the simulation engine 1010 preferably execute on CPU 1003 and manage the generation of and interaction with simulated phenomena, as described in previous figures. Other downloaded code 1030 and potentially other data repositories 1030 also reside in the memory 1001 and preferably execute on one or more CPUs 1003. In a typical embodiment, the simulation engine 1010 includes a narrative engine 1011, an I/O interface 1012, and one or more data repositories, including simulated phenomena attributes data repository 1013, narrative data and logic data repository 1014, and other data repositories 1015. In embodiments that include separate modeling components, these components would additionally reside in the memory 1001 and execute on the CPU 1003.
  • In an example embodiment, components of the simulation engine 1010 are implemented using standard programming techniques. One skilled in the art will recognize that the components lend themselves to object-oriented, distributed programming, since the values of the attributes and behavior of simulated phenomena can be individualized and parameterized to account for each device, each user, real world sensed values, etc. However, any of the simulation engine components 1011-1015 may be implemented using more monolithic programming techniques as well. In addition, programming interfaces to the data stored as part of the simulation engine 1010 can be made available by standard means such as through C, C++, C#, and Java APIs, through scripting languages such as XML, or through web servers supporting such interfaces. The data repositories 1013-1015 are preferably implemented as databases for scalability reasons rather than as text files; however, any storage method for storing such information may be used. In addition, behaviors of simulated phenomena may be implemented as stored procedures, or as methods attached to SP "objects," although other techniques are equally effective.
  • One skilled in the art will recognize that the simulation engine 1010 and the SPIS may be implemented in a distributed environment that is comprised of multiple, even heterogeneous, computer systems and networks. For example, in one embodiment, the narrative engine 1011, the I/O interface 1012, and each data repository 1013-1015 are all located in physically different computer systems, some of which may be on a client mobile device as described with reference to FIGS. 11 and 12. In another embodiment, various components of the simulation engine 1010 are each hosted on a separate server machine and may be remotely located from the tables stored in the data repositories 1013-1015.
  • FIGS. 11 and 12 are example block diagrams of client devices used for practicing embodiments of the Simulated Phenomena Interaction System. FIG. 11 illustrates an embodiment of a "thin" client mobile device, which interacts with a remote simulation engine running, for example, on a general purpose computer system as shown in FIG. 10. FIG. 12 illustrates an embodiment of a "fat" client mobile device in which one or more portions of the simulation engine reside as part of the mobile device environment itself.
  • Specifically, FIG. 11 shows mobile device 1101 communicating over a mobile network 1130, such as a wireless network, to interact with simulation engine 1120. The mobile device 1101 may comprise a display 1102, a CPU 1104, a memory 1107, one or more environment sensors 1103, one or more network devices 1106 for communicating with the simulation engine 1120 over the network 1130, and other input/output devices 1105. Code such as client code 1108 that is needed to interact with the simulation engine 1120 preferably resides in the memory 1107 and executes on the CPU 1104. One skilled in the art will recognize that a variety of mobile devices may be used with the SPIS, including cell phones, PDAs, GPSes, portable computing devices, infrared devices, 3-D wireless (e.g., head-mounted) glasses, virtual reality devices, other handheld and wearable devices, and basically any mobile or portable device capable of location sensing. In addition, network communication may be provided over cell phone modems, the IEEE 802.11b protocol, the Bluetooth protocol, or any other wireless communication protocol or equivalent.
  • Alternatively, the client device may be implemented as a fat client mobile device as shown in FIG. 12. In FIG. 12, mobile device 1201 is shown communicating via a communications network 1230 with other mobile devices or portable computing environments. The communications network may be a wireless network or a wired network used to intermittently send data to other devices and environments. The mobile device 1201 may comprise a display 1202, a CPU 1204, a memory 1207, one or more environment sensors 1203, one or more network devices 1206 for communicating over the network 1230, and other input/output devices 1205. The components 1202-1206 correspond to their counterparts described with reference to the thin client mobile device illustrated in FIG. 11. As currently depicted, all components and data of the simulation engine 1220 are contained within the memory 1207 of the client device 1201 itself. However, one skilled in the art will recognize that one or more portions of the simulation engine 1220 may instead be remotely located, such that the mobile device 1201 communicates over the communications network 1230 using network devices 1206 to interact with those portions of the simulation engine 1220. In addition to the simulation engine 1220, the memory 1207 contains other program code 1208, which may be used by the mobile device to initiate an interaction request as well as for other purposes, some of which may be unrelated to the SPIS.
  • Different configurations and locations of programs and data are contemplated for use with the techniques of the present invention. In example embodiments, these components may execute concurrently and asynchronously; thus, the components may communicate using well-known message passing techniques. One skilled in the art will recognize that equivalent synchronous embodiments are also supported by an SPIS implementation, especially in the case of a fat client architecture. Also, other steps could be implemented for each routine, and in different orders, and in different routines, yet still achieve the functions of the SPIS.
  • As described in FIGS. 1-9, some of the primary functions of a simulation engine of a Simulated Phenomena Interaction System are to implement (generate and manage) simulated phenomena and to handle interaction requests from mobile devices so as to incorporate simulated phenomena into the real world environments of users.
  • FIG. 13 is an example block diagram of an event loop for an example simulation engine of a Simulated Phenomena Interaction System. As described earlier, typically the narrative engine portion of the simulation engine receives interaction requests from a mobile device through the I/O interfaces, determines how to process them, processes the requests if applicable, and returns any indicated feedback to the mobile device for playback or display to an operator. The narrative engine receives as input with each interaction request an indication of the request type and information that identifies the device or specifies attribute values from the device. Specifically, in step 1301, the narrative engine determines or obtains state information with respect to the current state of the narrative and the next expected possible states of the narrative. That is, the narrative engine determines what actions and/or conditions are necessary to advance to the next state and how that state is characterized. This can be determined by any standard well-known means for implementing a state machine, such as a case statement in code, a table-driven method, etc. In step 1302, the narrative engine determines what type of interaction request was designated as input, and in steps 1303-1310 processes the request accordingly. More specifically, in step 1303, if the designated interaction request corresponds to a detection request, then the narrative engine proceeds in step 1307 to determine which detection interface to invoke and then invokes the determined interface. Otherwise, the narrative engine continues in step 1304 to determine whether the designated interaction request corresponds to a communication interaction request. If so, the narrative engine continues in step 1308 to determine which communication interface to invoke and subsequently invokes the determined interface. Otherwise, the narrative engine continues in step 1305 to determine whether the designated interaction request corresponds to a measurement request. If so, then the narrative engine continues in step 1309 to determine which measurement interface to invoke and then invokes the determined interface. Otherwise, the narrative engine continues in step 1306 to determine whether the designated interaction request corresponds to a manipulation request. If so, the narrative engine continues in step 1310 to determine which manipulation interface to invoke and then invokes the determined interface. Otherwise, the designated interaction request is unknown, and the narrative engine continues in step 1311. (The narrative engine may invoke some other default behavior when an unknown interaction request is designated.) In step 1311, the narrative engine determines whether the previously determined conditions required to advance the narrative to the next state have been satisfied. If so, the narrative engine continues in step 1312 to advance the state of the narrative engine to the next state indicated by the matched conditions; otherwise, it continues to wait for the next interaction request. Once the narrative state has been advanced, the narrative engine returns to the beginning of the event loop in step 1301 to wait for the next interaction request.
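  • Using the table-driven method mentioned above, the dispatch of steps 1302-1310 might be sketched in Python as follows; the handler names are hypothetical stand-ins for the interfaces described.

    HANDLERS = {                                        # steps 1303-1306
        "detect": "invoke_detection_interface",         # step 1307
        "communicate": "invoke_communication_interface",  # step 1308
        "measure": "invoke_measurement_interface",      # step 1309
        "manipulate": "invoke_manipulation_interface",  # step 1310
    }

    def narrative_event_loop(engine):
        while True:
            state = engine.current_narrative_state()    # step 1301
            request = engine.next_interaction_request()
            name = HANDLERS.get(request.type)           # step 1302
            if name:
                getattr(engine, name)(request)
            else:
                engine.handle_unknown_request(request)  # default behavior
            if state.advance_conditions_met():          # step 1311
                engine.advance_narrative(state)         # step 1312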
  • As indicated in FIG. 13, the narrative engine needs to determine which interaction routine to invoke (steps 1307-1310). One skilled in the art will recognize that any of the interaction routines, including a detection routine, can be specific to a simulated phenomenon, a device, an environment, or some combination of any such factors or similar factors. Also, depending upon the architecture of the system, the overall detection routine (which calls specific detection functions) may be part of the narrative engine, part of a model, or stored in one of the data repositories.
  • FIG. 14 is an example flow diagram of an example detection interaction routine provided by a simulation engine of a Simulated Phenomena Interaction System. This routine may reside in and be executed by the narrative engine portion of the simulation engine. In the example shown in FIG. 14, the Detect_SP routine (the overall detection routine) includes as input parameters the factors that need to be considered for detection. In this example, the Detect_SP routine receives a designated identifier of the particular simulated phenomenon (SP_id), a designated identifier of the device (Dev_id), any designated number of attributes and values that correspond to the device (Dev_attrib_list), and the current narrative state information associated with the current narrative state (narr_state). The current narrative state information contains, for example, the information determined by the narrative engine in step 1301 of the Receive Interaction Request routine. The detection routine, as is common to all the interaction routines, determines, given the designated parameters, whether the requested interaction is possible, invokes the interaction, and returns the results of the interaction or any other feedback so that it can in turn be reported to the mobile device via the narrative engine.
  • Specifically, in step 1401, the routine determines whether the detector is working, and, if so, continues in step 1404 else continues in step 1402. This determination is conducted from the point of view of the narrative, not the mobile device (the detector). In other words, although the mobile device may be working correctly, the narrative may dictate a state in which the client device (the detector) appears to be malfunctioning. In step 1402, the routine, because the detector is not working, determines whether the mobile device has designated or previously indicated in some manner that the reporting of status information is desirable. If so, the routine continues in step 1403 to report status information to the mobile device (via the narrative engine), and then returns. Otherwise, the routine simply returns without detection and without reporting information. In step 1404, when the detector is working, the routine determines whether a “sensitivity function” exists for the particular interaction routine based upon the designated SP identifier, device identifier, the type of attribute that the detection is detecting (the type of detection), and similar parameters.
  • A “sensitivity function” is the generic name for a routine, associated with the particular interaction requested, that determines whether an interaction can be performed and, in some embodiments, performs the interaction if it can be performed. That is, a sensitivity function determines whether the device is sufficiently “sensitive” (in “range” or some other attribute value) to interact with the SP with regard specifically to the designated attribute in the manner requested. For example, there may exist many detection routines available to detect whether a particular SP should be considered “detected” relative to the current characteristics of the requesting mobile device. The detection routine that is eventually selected as the “sensitivity function” to invoke at that moment may be particular to the type of device, some other characteristic of the device, the simulated phenomena being interacted with, or another consideration, such as an attribute value sensed in the real world environment, here shown as “attrib_type.” For example, the mobile device may indicate the need to “detect” an SP based upon a proximity attribute, or an agitation attribute, or a “mood” attribute (an example of a completely arbitrary, imaginary attribute of an SP). The routine may determine which sensitivity function to use in a variety of ways. The sensitivity functions may be stored, for example, as a stored procedures in the simulated phenomena characterizations data repository, such as data repository 620 in FIG. 6, indexed by attribute type of an SP type. An example routine for finding a sensitivity function and an example sensitivity function are described below with reference to Tables 1 and 2.
  • Once the appropriate sensitivity function is determined, the routine continues in step 1405 to invoke the determined detection sensitivity function. Then, in step 1406, the routine determines, as a result of invoking the sensitivity function, whether the simulated phenomenon was considered detectable, and, if so, continues in step 1407, otherwise continues in step 1402 (to optionally report non-success). In step 1407, the routine indicates (in a manner that is dependent upon the particular SP or other characteristics of the routine) that the simulated phenomenon is present (detected) and modifies or updates any data repositories and state information as necessary to update the state of the SP, the narrative, and potentially the simulation engine's internal representation of the mobile device, to consider the SP "detected." In step 1408, the routine determines whether the mobile device has previously requested to be in a continuous detection mode, and, if so, continues in step 1401 to begin the detection loop again, otherwise returns.
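  • Putting the FIG. 14 steps together, the overall detection routine might be sketched as follows; all helper names are assumptions introduced for illustration.

    def detect_sp(engine, sp_id, dev_id, dev_attrib_list, narr_state):
        while True:
            if not narr_state.detector_working(dev_id):        # step 1401
                if engine.device_wants_status(dev_id):         # step 1402
                    engine.report_status(dev_id)               # step 1403
                return
            sens = engine.find_sensitivity_function(           # step 1404
                "detect", sp_id, dev_id, dev_attrib_list)
            if not sens(sp_id, dev_id):                        # steps 1405-1406
                if engine.device_wants_status(dev_id):
                    engine.report_status(dev_id)
                return
            engine.mark_detected(sp_id, dev_id)                # step 1407
            if not engine.continuous_detection(dev_id):        # step 1408
                return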
  • One skilled in the art will recognize that other functionality can be added and is contemplated to be added to the detection routine and the other interaction routines. For example, functions for adjustment (real or imaginary) of the mobile device from the narrative's perspective and functions for logging information could be easily integrated into these routines.
    TABLE 1
    1 function Sensitivity(interaction_type, dev_ID, SP_ID,
                           att_type1, ..., att_typeN)
    2   For each att_type
    3     sensFunction = GetSensitivityFunctionForType(interaction_type,
                                                       att_type)
    4     If not sensFunction(SP_ID, dev_ID)
    5       Return Not_Detectable
    6   End for
    7   Return Detectable
    8 end function
  • As mentioned, several different techniques can be used to determine which particular sensitivity function to invoke for a particular interaction request. This is because, for example, there may be different sensitivity calculations based upon the type of interaction and the type of attribute to be interacted with. A separate sensitivity function may also exist on a per-attribute basis for the particular interaction on a per-simulated phenomenon basis (or additionally per device, per user, etc.). Table 1 shows the use of a single overall routine to retrieve multiple sensitivity functions for the particular simulated phenomenon and device combination, one for each attribute being interacted with. (Note that multiple attributes may be specified in the interaction request. Interaction may be a complex function of multiple attributes as well.) Thus, for example, if for a particular simulated phenomenon there are four attributes that need to be detected in order for the SP to be detected from the mobile device perspective, then there may be four separate sensitivity functions that are used to determine whether each attribute of the SP is detectable at that point. Note that, as shown in line 4, the overall routine can also include logic to invoke the sensitivity functions on the spot, as opposed to invoking the function as a separate step as shown in FIG. 14.
    TABLE 2
    SensitivityAgitation(SP_ID, dev_ID)
    {
        Position positionDev, positionSP;
        long range, dist;
        int agitationSP;

        /* The SP's agitation level and programmed position come from the
           SP characterizations data repository; the device position comes
           from the device characterization data repository. */
        agitationSP = GetAgitationStateFromSP(SP_ID);
        positionSP  = GetPositionOfSP(SP_ID);
        positionDev = GetPositionFromDevice(dev_ID);

        /* The more agitated the SP, the larger its detectable range. */
        range = agitationSP * 10;

        /* Euclidean distance between the device and the SP. */
        dist = sqrt((positionSP.x - positionDev.x)^2 +
                    (positionSP.y - positionDev.y)^2);

        if (dist <= range)
            return Detectable;
        else
            return Not_Detectable;
    }
  • Table 2 is an example sensitivity function that is returned by the routine GetSensitivityFunctionForType shown in Table 1 for a detection interaction for a particular simulated phenomenon and device pair, as would be used with an agitation characteristic (attribute) of the simulated phenomenon. In essence, the sensitivity agitation function retrieves an agitation state variable value from the SP characterizations data repository, retrieves the current position of the SP from the SP characterizations data repository, and retrieves the current position of the device from the device characterizations data repository. The current position of the SP is typically an attribute of the SP, or calculated from such an attribute. Further, it may be a function of the current actual location of the device. Note that the characteristics of the SP (e.g., the agitation state) are dependent upon which SP is being addressed by the interaction request, and may also be dependent upon the particular device interacting with the particular SP and/or the user that is interacting with the SP. Once the values are retrieved, the example sensitivity function then performs a set of calculations based upon these retrieved values to determine whether, based upon the actual location of the device relative to the programmed location of the SP, the SP agitation value is "within range." If so, the function sends back a status of detectable; otherwise, it sends back a status of not detectable.
  • As mentioned earlier, the response to each interaction request is in some way based upon a real world physical characteristic, such as the physical location of the mobile device submitting the interaction request. The real world physical characteristic may be sent with the interaction request, or sensed by a sensor in some other way or at some other time. Responses to interaction requests can also be based upon other real world physical characteristics, such as the physical orientation of the mobile device (e.g., whether the device is pointing at a particular object or at another mobile device), or, for example, how fast the operator of the device is moving (velocity) or the direction of travel (bearing). One skilled in the art will recognize that many other characteristics can be incorporated in the modeling of the simulated phenomena, provided that the physical characteristics are measurable and taken into account by the narrative or models incorporated by the simulation engine. For ease of description, a device's physical location will be used as exemplary of how a real world physical characteristic is incorporated in the SPIS.
  • A mobile device, depending upon its type, is capable of sensing its location in a variety of ways, some of which are described here. One skilled in the art will recognize that there are many methods for sensing location, all of which are contemplated for use with the SPIS. Once the location of the device is sensed, this location can in turn be used to model the behavior of the SP in response to the different interaction requests. For example, the position of the SP relative to the mobile device may be dictated by the narrative to always remain a set distance from the current physical location of the user's device until the user enters a particular spot, a room, for example. Alternatively, an SP may "jump away" (exhibiting behavior similar to trying to swat a fly) each time the physical location of the mobile device is computed to "coincide" with the apparent location of the SP. To perform these types of behaviors, the simulation engine typically models both the apparent location of the SP and the physical location of the device based upon sensed information.
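  • The "jump away" behavior described above might be sketched as follows; the Point type, the minimum separation value, and the random hop are all assumptions made for illustration.

    import math
    import random
    from dataclasses import dataclass

    @dataclass
    class Point:
        x: float
        y: float

    def jump_away(sp_loc: Point, dev_loc: Point, min_sep: float = 5.0) -> Point:
        """Hop the SP a short distance away whenever the device's
        location "coincides" with it, like a fly avoiding a swat."""
        if math.hypot(sp_loc.x - dev_loc.x, sp_loc.y - dev_loc.y) >= min_sep:
            return sp_loc  # the device is not close enough; stay put
        angle = random.uniform(0.0, 2.0 * math.pi)
        return Point(dev_loc.x + min_sep * math.cos(angle),
                     dev_loc.y + min_sep * math.sin(angle))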
  • The location of the device may be an absolute location as available with some devices, or may be computed (modeled) by the simulation engine based upon methods like triangulation techniques, the device's ability to detect electromagnetic broadcasts, and software modeling techniques such as data structures and logic that model latitude, longitude, altitude, velocity, and bearing of the device, etc. Examples of devices that can be modeled in part based upon the device's ability to detect electromagnetic broadcasts include cell phones such as the Samsung SCH W300 on the Verizon™ network and the Motorola V710, which can operate using terrestrial electromagnetic broadcasts of cell phone networks or using the electromagnetic broadcasts of satellite GPS systems, as well as other "location aware" cell phones, wireless networking receivers, radio receivers, photo-detectors, radiation detectors, heat detectors, and magnetic orientation or flux detectors. Examples of devices that can be modeled in part based upon triangulation techniques include GPS devices, Loran devices, and some E911 cell phones.
  • FIG. 15 is an example diagram illustrating simulation engine modeling of a mobile device that is able to sense its location by detecting electromagnetic broadcasts. For example, in some cases, a mobile device is able to "sense" when it can receive transmissions from a particular cell tower. More specifically, location is determined by the mobile device by performing triangulation calculations that measure the signal strengths of various local cell phone (fixed location) base stations. More commonly, a mobile device such as a cell phone receives location information transmitted to it by the base station based upon calculations carried out on the wireless network server systems. These server systems typically rely at least in part on the detected signal strength as measured by various base stations in the vicinity of the cell phone. The servers use triangulation and other calculations to determine the cell phone's location, which is then broadcast back to the phone, typically in a format that can be translated into longitude/latitude or other standard GIS data formats. This sensed information is then forwarded from the mobile device to the simulation engine so that the simulation engine can model the position of the device (and subsequently the location of SPs). As a result of the modeling, the simulation engine might determine or be able to deduce that the device is currently situated in a particular real world area (region). Note that the regions may be continuous (detection flows from one region to another without an area where location is undetectable) or discontinuous (broadcast detection is interrupted by an area where transmissions cannot be received).
  • In the example shown in FIG. 15, each circle represents a physical area where the device is able to sense an electromagnetic signal from a transmitter, for example, a cell tower if the device is a cell phone. Thus, the circle labeled #1 represents a physical region where the mobile device is currently able to sense a signal from a first transmitter. The circle labeled #2 similarly represents a physical region where the mobile device is able to sense a signal from a second transmitter, etc. The narrative, hence the SP, can make use of this information in modeling the location of the SP relative to the mobile device's physical location. For example, when the mobile device demonstrates or indicates that it is in the intersection of regions #1 and #2 (that is, the device can detect transmissions from transmitters #1 and #2), labeled in the figure with an "A" and cross-hatching, the narrative might specify that an SP is detectable, even though it may have an effective location outside the intersection labeled "A." For example, the narrative may have computed that the effective location of the simulated phenomenon is in the intersection of regions #2 and #3, labeled in the figure with a "B" and hatching. The narrative may indicate that a simulated phenomenon is close by the user, but not yet within the vicinity. Alternatively, if the device demonstrates or indicates that it is located in region "A" and if the range of the device is not deemed to include region "B," then the narrative may not indicate the presence of the SP at all. The user of the mobile device may have no idea that physical regions #1 and #2 (or their intersection) exist; the user only knows that the SP is suddenly present, with perhaps some indication of relative distance based upon the apparent (real or narrative controlled) range of the device.
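  • The region bookkeeping of FIG. 15 might be modeled, under the simplifying assumption that each broadcast region is a circle, with a sketch such as the following; all names are illustrative.

    import math
    from dataclasses import dataclass

    @dataclass
    class Transmitter:
        id: int
        x: float
        y: float
        radius: float  # coverage radius of the broadcast region

    def audible(x, y, transmitters):
        """IDs of every transmitter whose coverage contains (x, y)."""
        return {t.id for t in transmitters
                if math.hypot(x - t.x, y - t.y) <= t.radius}

    def in_intersection(x, y, transmitters, required_ids):
        # The device is in intersection "A" when it can hear both
        # transmitter #1 and transmitter #2 at once.
        return set(required_ids) <= audible(x, y, transmitters)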
  • In addition, by controlling the apparent position of an SP, the narrative may in effect "guide" the user of the mobile device to a particular location. For example, the narrative can indicate the position of an SP at a continuous relative distance to the (indicator of the) user, provided the location of the mobile device travels through and to the region desired by the narrative, for example along a path from region #2, through region #5, to region #1. If the mobile device location instead veers from this path (travels from region #2 directly to region #1, bypassing region #5), the narrative can detect this situation and communicate with the user, for example indicating that the SP has become farther away or undetectable (the user might be considered "lost").
  • A device might also be able to sense its location in the physical world based upon a signal "grid" as provided, for example, by GPS-enabled systems. A GPS-enabled mobile device might be able to sense not only that it is in a physical region, such as receiving transmissions from transmitter #5, but also that it is in a particular rectangular grid cell within that region, as indicated by rectangular regions #6-9. This information may be used to give a GPS-enabled device a finer degree of detection than that available from cell phones, for example. One example of such a device is a Compaq iPaq H3850, with a Sierra Wireless AirCard 300 using AT&T Wireless Internet Service and a Transplant Computing GPS card. In addition, cell phones that use the Qualcomm MSM6100 chipset have the same theoretical resolution as any other GPS. Also, an example of a fat-client mobile device is the Garmin iQue 3600, which is a PDA with GPS capability.
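  • Snapping a GPS fix to such a rectangular grid can be sketched in a few lines; the cell size chosen here is an arbitrary assumption.

    import math

    def grid_cell(lat: float, lon: float, cell_degrees: float = 0.001):
        """Quantize a latitude/longitude fix to a rectangular grid cell,
        giving the finer-grained regions (such as #6-9) described for
        GPS-enabled devices."""
        return (math.floor(lat / cell_degrees),
                math.floor(lon / cell_degrees))

For example, grid_cell(47.6205, -122.3493) identifies one cell, and a fix a few meters away maps to the same or an adjacent cell, so the narrative can react to cell transitions rather than raw coordinates.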
  • Other devices present more complicated location modeling considerations and opportunities for integration of simulated phenomena into the real world. For example, a wearable display device, such as Wireless 3D Glasses from the eDimensional company, allows a user to “see” simulated phenomena in the same field of vision as real world objects, thus providing a kind of “augmented reality.” FIG. 16 is an example illustration of an example field of vision on a display of a wearable device. The user's actual vision is the area demarcated as field of vision 1601. The apparent field of vision supported by the device is demarcated by field of vision 1602. Using SPIS technology, the user can see real world objects 1603 and simulated phenomena 1604 within the field 1602. One skilled in the art will recognize that appropriate software modeling can be incorporated into a phenomenon modeling component or the simulated phenomena attributes data repository to account for the 3D modeling supported by such devices and enhance them to represent simulated phenomena in the user's field of view.
  • PDAs with IRDA (infrared) capabilities, for example, a Tungsten T PDA manufactured by Palm Computing, also present more complicated modeling considerations and additionally allow for the detection of device orientation. Though this PDA supports multiple wireless networking functions (e.g., Bluetooth & Wi-Fi expansion card), the IRDA version utilizes its Infrared Port for physical location and spatial orientation determination. By pointing the infrared transmitter at an infrared transceiver (which may be an installed transceiver, such as in a wall in a room, or another infrared device, such as another player using a PDA/IRDA device), the direction the user is facing can be supplied to the simulation engine for modeling as well. This measurement may result in more "realistic" behavior in the simulation. For example, the simulation engine may be able to better detect when a user has actually pointed the device at an SP to capture it. Similarly, the simulation engine can also better detect two users pointing their respective devices at each other (for example, in a simulated battle). Thus, depending upon the device, it may be possible for the SPIS to produce SPs that respond to orientation characteristics of the mobile device as well as location.
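  • A simple orientation test of this kind might be sketched as follows; the cone tolerance and the convention that both bearings are measured the same way are assumptions made for illustration.

    import math

    def pointing_at(dev_xy, dev_bearing_deg, target_xy, tol_deg=15.0):
        """True when the device's facing direction lies within a small
        cone of the bearing from the device to the target. Both angles
        must use the same convention (here, degrees from the x-axis)."""
        bearing = math.degrees(math.atan2(target_xy[1] - dev_xy[1],
                                          target_xy[0] - dev_xy[0]))
        # Smallest signed difference between the two bearings.
        diff = (bearing - dev_bearing_deg + 180.0) % 360.0 - 180.0
        return abs(diff) <= tol_deg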
  • FIG. 17 is an example diagram illustrating simulation engine modeling of a mobile device enhanced with infrared capabilities whose location is sensed by infrared transceivers. In FIG. 17, two users of infrared capable mobile devices 1703 and 1706 are moving about a room 1700. In room 1700, various infrared transceivers 1702, 1704, and 1705 are planted (in addition to the transceivers in each mobile device 1703 and 1706), which are capable of detecting and reporting to the simulation engine the respective locations (and even orientations) of the mobile devices 1703 and 1706. 1701 represents a non-networked infrared source that blinks with a pattern that is recognized by the mobile device. Though no information is transferred from the infrared source to the simulation system, the system can nonetheless potentially recognize the emitted pattern as the identification of an object in a particular location in the real world. A simulated phenomenon may even be integrated as part of one of these transceivers, for example, on plant 1708 as embodied in transceiver 1705. The transceiver-reported location information can be used by the simulation engine to determine more accurately what the user is attempting to do by where the user is pointing the mobile device. For example, as currently shown in FIG. 17, only the signal from the plant (if the plant is transmitting signals, or, alternatively, the receipt of a signal from the device 1703) is within the actual device detection field 1707 of device 1703. Thus, the simulation engine can indicate that the SP associated with plant 1708 is detectable or otherwise capable of interaction.
  • One skilled in the art will recognize that, in general, other devices with other types of location detection can also be incorporated into SPIS in a similar manner to incorporating detection using PDAs with IRDA. Many types of local location determination (determination local to the mobile device) can be employed. For example, a mobile device enhanced with the ability to detect radio frequency, ultrasonic, or other broadcast identification can also be incorporated. Transmitters that broadcast such signals can be placed in an environment similar to that illustrated in FIG. 17 so as to enhance the user's experience. When the mobile device detects these broadcasted signals, they can be communicated to the simulation engine. Alternatively, remote location determination (determination external to the mobile device) can be used. Accordingly, whatever broadcasting technique is incorporated, the mobile device may be outfitted with the transmitter, and appropriate receivers placed in the environment that communicate with the simulation engine when they detect the mobile device. Additional mathematical modeling, such as triangulation, can be used to hone in on the location of the device when multiple sensors are placed. Both local and remote location determination may be particularly useful to determine the location of an enhanced mobile device having GPS capabilities as it moves from, for example, outside where satellite detection is possible, to inside a locale where other methods of device location detection (or simulation/estimation by the narrative) are employed. An example system that provides detection inside a locale using a model of continuous degradation with partial GPS capability is Snaptrack by Qualcomm.
  • One skilled in the art will also recognize that there are inherent inconsistencies and limitations as to the accuracy of sampling data from all such devices. For example, broadcasting methodologies used in location determination as described above can be blocked, reflected, or distorted by the environment or other objects within the environment. Preferably, the narrative handles such errors, inconsistencies, and ambiguities in a manner that is consistent with the narrative context. For example, in the gaming system called “Spook” described earlier, when the environmental conditions provide insufficient reliability or precision in location determination, the narrative might send an appropriate text message to the user such as “Ghosts have haunted your spectral detector! Try to shake them by walking into an open field.” Also, some devices may necessitate that different techniques be used for location determination and the narrative may need to adjust accordingly and dynamically. For example, a device such as a GPS might have high resolution outdoors, but be virtually undetectable (and thus have low location resolution) indoors. The narrative might need to specify the detectability of an SP at that point in a manner that is independent from the actual physical location of the device, yet still gives the user information. Dependent upon the narrative, the system may choose to indicate that the resolution has changed or not.
  • A variety of techniques can be used to indicate detectability of an SP when location determination becomes degraded, unreliable, or lost. For example, the system can display the device's location in coarser detail (similar to a "zoom out" effect). Using this technique, the view range is modified to cover a larger area, so that the loss of location precision does not create a view that continuously shifts even though the user is stationary. If the system loses location determination capability completely, the device can use the last known position. Moreover, if the shape of the degraded or occluded location data area is known, the estimated or last-known device position can be shown as a part of a boundary of this area. For example, if the user enters a rectangular building that blocks all location determination signals, the presentation to the user can show the location of the user as a point on the edge of a corresponding rectangle. The view presented to the user will remain based on this location until the device's location can be updated. Regardless of the ability to determine the device's precise location, SP locations can be updated relative to whatever device location the simulation uses.
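  • The fallback choices just described might be combined in a sketch like the following; the precision threshold and mode names are invented for illustration.

    from typing import Optional, Tuple

    Position = Tuple[float, float]

    def presented_location(fix: Optional[Position],
                           precision_m: Optional[float],
                           last_known: Optional[Position],
                           good_precision_m: float = 50.0):
        """Choose the position and display mode as location quality
        degrades: normal view, "zoom out," or last-known position."""
        if (fix is not None and precision_m is not None
                and precision_m <= good_precision_m):
            return fix, "normal"
        if fix is not None:
            # Degraded precision: keep the fix but widen the view so it
            # does not shift continuously while the user stands still.
            return fix, "zoomed-out"
        # No fix at all: fall back to the last known position (possibly
        # clamped to the boundary of a known occluded area, not shown).
        return last_known, "last-known"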
  • As mentioned, the physical location of the device may be sent with the interaction request itself or may have been sent earlier as part of some other interaction request, or may have been indicated to the simulation engine by some kind of sensor somewhere else in the environment. Once the simulation engine receives the location information, the narrative can determine or modify the behavior of an SP relative to that location.
  • FIG. 18 is an example illustration of a display on a mobile device that indicates the location of a simulated phenomenon relative to a user's location as a function of the physical location of the mobile device. As shown, the mobile device 1800 is displaying on the display screen area 1801 an indication in the "spectral detection field" 1802 of the location of a particular SP 1804 relative to the user's location 1803. In an example scenario, the location of the SP 1804 would be returned from the narrative engine in response to a detection interaction request. As described with respect to FIG. 15, the relative SP location shown is not likely an absolute physical distance and may not divulge any information to the user about the location modeling being employed in the narrative engine. Rather, the difference between the user's location 1803 and the SP location 1804 is dictated by the narrative and may move as the user moves the mobile device, to indicate that the user is getting closer to or farther from the SP. These aspects are typically controlled by the narrative logic and are SP/device specific. There are many ways that the distances between the SP and a user may be modeled; FIG. 18 shows just one of them.
  • Indications of a simulated phenomenon relative to a mobile device are also functions of both the apparent range of the device (the area in which the device "operates" for the purposes of the simulation engine) and the apparent range of the sensitivity function(s) used for interactions. The latter (sensitivity range) is typically controlled by the narrative engine but may be programmed to be related to the apparent range of the device. Thus, for example, in FIG. 18, the apparent range of the spectra-meter is shown by the dotted line of the detection field 1802. The range of the device may also be controlled by the logic of the narrative engine and have nothing to do with the actual physical characteristics of the device, or may be supplemented by the narrative logic. For example, the range of the spectra-meter may depend on the range of the sensitivity function programmed into the simulation engine. For example, a user may be able to increase the range (sensitivity) of the sensitivity function, and hence the apparent range of the device, by adjusting some attribute of the device, which may be imaginary. For example, the range of the spectra-meter may be increased by decreasing the device's ability to display additional information regarding an SP, such as a visual indication of the identity or type of the SP, presumably yielding more "power" to the device for detection purposes rather than display purposes.
  • Although the granularity of the actual resolution of the physical device may be constrained by the technology used by the physical device, the range of interaction, such as detectability, that is supported by the narrative engine is controlled directly by the narrative engine. Thus, the relative size between what the mobile device can detect and what is detectable may be arbitrary or imaginary. For example, although a device might have an actual physical range of 3 meters for a GPS, 30 meters for a WiFi connected device, or 100-1000 meters for cell phones, the simulation engine may be able to indicate to the user of the mobile device that there is a detectable SP 200 meters away, although the user might not yet be able to use a communication interaction to ask questions of it at this point.
  • FIG. 19 contains a set of diagrams illustrating different ways to determine and indicate the location of a simulated phenomenon relative to a user when a device has a different physical range from its apparent range as determined by the simulation engine. In Diagram A, the apparent range circumscribed by radius R2 represents the strength of a detection field 1902 in which an SP can be detected by a mobile device having an actual physical detection range determined by radius R1. For example, if the mobile device is a GPS, R1 may be 3 meters, whereas R2 may be (and typically would be) a large multiple of R1 such as 300 meters.
  • In Diagram B, the smaller circle indicates where the narrative has located the SP relative to the apparent detection range of the device. The larger circle in the center indicates the location of the user relative to this same range and is presumed to be a convention of the narrative in this example. When the user progresses to a location that is in the vicinity of an SP (as determined by whatever modeling technique is being used by the narrative engine), then, as shown in Diagram C, the narrative indicates to the user that a particular SP is present. (The big “X” in the center circle might indicate that the user is in the same vicinity as the SP.) This indication may need to be modified based upon the capabilities and physical limitations of the device. For example, if a user is using a device, such as a GPS, that doesn't work inside a building and the narrative has located the SP inside the building, then the narrative engine may need to change the type of display used to indicate the SP's location relative to the user. For example, the display might change to a map that shows the inside of the building and indicates an approximate location of the SP on that map even though movement of the device cannot be physically detected from that point on. One skilled in the art will recognize that a multitude of possibilities exist for displaying relative SP and user locations based upon and taking into account the physical location of the mobile device and other physical parameters and that the user will perceive the “influence” of the SP on the user's physical environment as long as it continues to be related back to that physical environment.
  • FIG. 20 is an example flow diagram of an example measurement interaction routine provided by a simulation engine of a Simulated Phenomena Interaction System. This routine may reside in and be executed by the narrative engine portion of the simulation engine. It allows a user via a mobile device to “measure” characteristics of an SP to obtain values of various SP attributes. For example, although “location” is one type of attribute that can be measured (and detected), other attributes such as the “color,” “size,” “orientation,” “mood,” “temperament,” “age,” etc. may also be measured. The definition of an SP in terms of the attributes an SP supports or defines will dictate what attributes are potentially measurable. Note that each attribute may support a further attribute which determines whether a particular attribute is currently measurable or not. This latter degree of measurability may be determined by the narrative based upon or independent of other factors such as the state of the narrative, or the particular device, user, etc.
  • Specifically, in step 2001, the routine determines whether the measurement meter is working, and, if so, continues in step 2004, else continues in step 2002. This determination is conducted from the point of view of the narrative, not the mobile device (the meter). Thus, although the metering device appears to be working correctly, the narrative may dictate a state in which the device appears to be malfunctioning. In step 2002, the routine, because the meter is not working, determines whether the device has designated or previously indicated in some manner that the reporting of status information is desirable. If so, the routine continues in step 2003 to report status information to the mobile device (via the narrative engine) and then returns. Otherwise, the routine simply returns without measuring anything or reporting information. In step 2004, when the meter is working, the routine determines whether a sensitivity function exists for a measurement interaction routine based upon the designated SP identifier, device identifier, and the type of attribute that the measurement is measuring (the type of measurement), and similar parameters. As described with reference to Tables 1 and 2, there may be one sensitivity function that needs to be invoked to complete the measurement of different or multiple attributes of a particular SP for that device. Once the appropriate sensitivity function is determined, then the routine continues in step 2005 to invoke the determined measurement sensitivity function. Then, in step 2006, the routine determines, as a result of invoking the measurement related sensitivity function, whether the simulated phenomenon was measurable, and if so, continues in step 2007, otherwise continues in step 2002 (to optionally report non-success). In step 2007, the routine indicates the various measurement values of the SP (from attributes that were measured) and modifies or updates any data repositories and state information as necessary to update the state of the SP, narrative, and potentially the simulation engine's internal representation of the mobile device, to consider the SP “measured.” In step 2008, the routine determines whether the device has previously requested to be in a continuous measurement mode, and, if so, continues in step 2001 to begin the measurement loop again, otherwise returns.
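  • The three interaction routines of FIGS. 20-22 share a common skeleton: check availability from the narrative's point of view, look up a sensitivity function, invoke it, and report or record the outcome. The following C-style sketch (in the manner of the pseudo-code of Tables 3 and 4) illustrates that skeleton for the measurement case. It is illustrative only, not the disclosed implementation; the helper names (NarrativeSaysMeterWorks, FindSensitivityFn, and so on) are hypothetical stand-ins for narrative engine facilities, stubbed trivially here so the sketch compiles.
    #include <cstdio>

    // Hypothetical hooks into the narrative engine, trivially stubbed so
    // the sketch compiles; a real SPIS would consult narrative state.
    struct MeasurementResult { double value; };
    typedef bool (*SensitivityFn)(int spId, int deviceId,
                                  MeasurementResult* out);

    static bool NarrativeSaysMeterWorks(int /*deviceId*/) { return true; }
    static void ReportStatusToDevice(int deviceId, const char* msg)
        { std::printf("device %d: %s\n", deviceId, msg); }
    static bool LocationSensitivity(int, int, MeasurementResult* out)
        { out->value = 42.0; return true; }    // placeholder measurement
    static SensitivityFn FindSensitivityFn(int, int, const char*)
        { return LocationSensitivity; }        // keyed by SP/device/attribute
    static void MarkSPMeasured(int spId)
        { std::printf("SP %d marked measured\n", spId); }

    // Control flow of FIG. 20 (steps 2001-2008). The communicate (FIG. 21)
    // and manipulate (FIG. 22) routines follow the same skeleton with
    // different sensitivity functions and result handling.
    void MeasureInteraction(int spId, int deviceId, const char* attribute,
                            bool reportStatus, bool continuous)
    {
        do {
            if (!NarrativeSaysMeterWorks(deviceId)) {          // step 2001
                if (reportStatus)                              // steps 2002-2003
                    ReportStatusToDevice(deviceId, "meter malfunction");
                return;
            }
            SensitivityFn fn =
                FindSensitivityFn(spId, deviceId, attribute);  // step 2004
            MeasurementResult result;
            if (!fn || !fn(spId, deviceId, &result)) {         // steps 2005-2006
                if (reportStatus)
                    ReportStatusToDevice(deviceId, "SP not measurable");
                return;
            }
            std::printf("%s = %g\n", attribute, result.value); // step 2007
            MarkSPMeasured(spId);
            // Step 2008: continuous mode loops back to step 2001; a real
            // engine would pace these iterations rather than spin.
        } while (continuous);
    }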
  • FIG. 21 is an example flow diagram of an example communicate interaction routine provided by a simulation engine of a Simulated Phenomena Interaction System. This routine may reside in and be executed by the narrative engine portion of the simulation engine. It allows a user via a mobile device to “communicate” with a designated simulated phenomenon. For example, communication may take the form of questions to be asked of the SP. These may be pre-formulated questions (retrieved from a data repository and indexed by SP, for example) which are given to a user in response to any request that indicates that the user is attempting communication with the SP, such as by typing “Talk” or by pressing a Talk button. Alternatively, the simulation engine may incorporate an advanced pattern matching or natural language engine similar to a search tool. The user could then type in a newly formulated question (not canned) and the simulation engine attempts to answer it or request clarification. In addition, the SP can communicate with the user in a variety of ways, including changing some state of the device to indicate its presence, for example, blinking a light. Or, to simulate an SP speaking to a mobile device that has ringing capability (such as a cell phone), the device might ring seemingly unexpectedly. Also, pre-formulated content may be streamed to the device in text, audio, or graphic form, for example. One skilled in the art will recognize that many means to ask questions or hold “conversations” with an SP exist, or will be developed, and such methods can be incorporated into the logic of the simulation engine as desired. Whichever method is used, the factors that are to be considered by the SP in its communication with the mobile device are typically designated as input parameters. For example, an identifier of the particular SP being communicated with, an identifier of the device, and the current narrative state may be designated as input parameters. In addition, a data structure is typically designated to provide the message content, for example, a text message or question to the SP. The communication routine, given the designated parameters, determines whether communication with the designated SP is currently possible, and if so, invokes a function to “communicate” with the SP, for example, to answer a posed question.
  • Specifically, in step 2101, the routine determines whether the SP is available to be communicated with, and if so, continues in step 2104, else continues in step 2102. This determination is conducted from the point of view of the narrative, not the mobile device. Thus, although the mobile device appears to be working correctly, the narrative may dictate a state in which the device appears to be malfunctioning. In step 2102, the routine, because the SP is not available for communication, determines whether the device has designated or previously indicated in some manner that the reporting of such status information is desirable. If so, the routine continues in step 2103 to report status information about the incommunicability of the SP to the mobile device (via the narrative engine), and then returns. Otherwise, if reporting status information is not desired, the routine simply returns without the communication completing. In step 2104, when the SP is available for communication, the routine determines whether there is a sensitivity function for communicating with the designated SP based upon the other designated parameters. If so, then the routine invokes the communication sensitivity function in step 2105, passing along the content of the desired communication and a designated output parameter to which the SP can indicate its response. By indicating a response, the SP is effectively demonstrating its behavior based upon the current state of its attributes, the designated input parameters, and the current state of the narrative. In step 2106, the routine determines whether a response has been indicated by the SP, and, if so, continues in step 2107, otherwise continues in step 2102 (to optionally report non-success). In step 2107, the routine indicates that the SP returned a response and the contents of the response, which is eventually forwarded to the mobile device by the narrative engine. The routine also modifies or updates any data repositories and state information to reflect the current state of the SP, narrative, and potentially the simulation engine's internal representation of the mobile device to reflect the recent communication interaction. The routine then returns.
  • FIG. 22 is an example flow diagram of an example manipulation interaction routine provided by a simulation engine of a Simulated Phenomena Interaction System. This routine may reside in and be executed by the narrative engine portion of the simulation engine. It may be invoked by a user to affect some characteristic of the SP by setting a value of the characteristic or to alter the SP's behavior in some way. For example, in the Spook game, a user invokes a manipulation interaction to vacuum up a ghost to capture it. As another example, in the training scenario, a manipulation interaction function may be used to put a (virtual) box around a contaminant where the box is constructed of a certain material to simulate containment of the contaminating material (as deemed by the narrative). As with the other interaction routines, different characteristics and attributes may be designated as input parameters to the routine in order to control what manipulation sensitivity function is used. Accordingly, there may be specific manipulation functions not only associated with the particular SP but also selected, for example, by which button a user depresses on the mobile device. So, for example, if, for a specific simulation, the device is programmed to invoke certain manipulation interaction functions, then the proper function will be invoked when the user depresses a particular button.
  • Specifically, in step 2201, the routine determines whether it is possible to manipulate the designated SP given the state of the narrative, particular device and user, etc., and, if so, the routine continues in step 2204, else continues in step 2202. This determination is conducted from the point of view of the narrative, not the mobile device. Thus, although the mobile device appears to be working correctly, the narrative may dictate a state in which the device appears to be malfunctioning. In step 2202, because manipulation of the SP is not currently available, the routine determines whether the device has designated or previously indicated in some manner that the reporting of status information is desirable. If so, the routine continues in step 2203 to report the status information to the mobile device (via the narrative engine) and then returns. Otherwise, if reporting status information is not desired, the routine simply returns without manipulating the SP. In step 2204, when manipulation of the SP is available, the routine determines whether a sensitivity function exists for a manipulation interaction routine based upon a variety of factors such as those discussed with reference to prior interaction functions. In step 2205, the routine invokes the determined manipulation sensitivity function, passing along any necessary parameters such as the value of an attribute of a device or a value of the SP to be manipulated. In step 2206, the routine determines, as a result of invoking the manipulation sensitivity function, whether the simulated phenomenon was successfully manipulated and, if so, continues in step 2207, otherwise continues in step 2202. In step 2207, the routine indicates the results of the particular manipulation requested with the SP, for example reporting a newly set value of an attribute, modifies or updates any data repositories and state information to reflect the current state of the SP, narrative, and potentially the simulation engine's internal representation of the mobile device as necessary, and then returns.
  • EXAMPLE EMBODIMENT The Simple Game
  • A simple SPIS embodiment is described and referred to as a “Simple Game.” The Simple Game comprises a mobile device with user I/O hardware, software processing capability, a sensor allowing the determination of location, and the data and logic such that the following narrative functionality is provided:
      • 1. At least two distinct SP interactions are made available to the user at some point in the game;
      • 2. The user's ability to perform at least one of the interactions is limited in range at some point in the game; and
      • 3. The interaction ranges are not identical during at least one point in the game.
  • FIG. 26 is a diagram of a mobile display showing two distinct simulated phenomena interaction ranges as used in a Simple Game. This design demonstrates example minimum necessary components of the Simple Game user interface: the ability to convey two different interaction ranges and an indication of an SP. One skilled in the art will recognize that additional components (e.g., enhancements) can easily be integrated into the Simple Game as described elsewhere in this specification.
  • In FIG. 26, device display area 2601 represents a part of the video output hardware of the mobile device that is used to provide the described user interface. Interaction A Range 2602 and Interaction B Range 2604 represent range limits for two different interactions. Simulated Phenomenon indicator 2603 represents a location of the SP.
  • More specifically, Interaction A Range 2602 represents a range limit imposed on the user when interacting with an SP with respect to a first type of interaction. In other words, an interaction of type A is only allowed when the SP is determined to be within the area bounded by the limit indicated by Range 2602. The narrative engine calculates an appropriate range limit and displays the range limit as a distance between the user's current real-world location and a real-world location associated with the SP.
  • In the Simple Game example embodiment, interaction A is a viewing function, which presents to the user a graphical view of detection (existence of an SP) and two measurements of the SP (distance between the user and the SP, and the bearing of the SP relative to the user's real-world orientation).
  • Simulated Phenomenon (“SP”) 2603 represents a visual indication of an SP. In the Simple Game, the position of the SP indication 2603 in the display area 2601 is relative to the user's real-world location, which, for this example, is presumed to be at the center of the interaction ranges. That is, the user's position (not shown) is in the center of display area 2601. The location of the SP's displayed indication 2603 on the display area 2601 depends on both the sensed or otherwise calculated user's real-world location and the orientation of the user's device. However, the real-world location of an SP is typically calculated independent from the user's real-world location except during Simple Game initialization when a game defining fix (“GDF”) based on the user's location is used.
  • In the Simple Game embodiment described, the visual representations of SP locations are typically restricted to a single horizontal plane that corresponds to a flat surface. However, other embodiments can support more complex representations such as representations of multi-dimensional shapes, where the SP position may be calculated in more than two dimensions. For example, if an SP is motionless and is directly in front of the user, its indication would be shown as centered in the upper portion of display area 2601. If the user then physically moved forward in a straight line, the user would see the indication move down the middle of the display area. However, if the SP were instead in motion, its indication could move in any direction, depending on its and the user's velocity and bearing.
  • As an example, in FIG. 26, the indication of the SP 2603 is shown up and to the left of the center of the display area. If the system were to detect that the mobile device was physically rotated 180 degrees in either direction, the SP indication 2603 could end up being displayed below and to the right of the center of the display area. The orientation of the display in the example shown in FIG. 26 is intended to show the points directly above the center of the display area as the real-world locations that are directly in front of the user. If the device is unable to detect when it is rotated (in any plane, e.g., left/right or up/down) while remaining stationary over a single real-world location, user orientation can be calculated as the current or last known direction of travel. Also, if the game has just been initialized, the display can be shown in a default orientation, such as north being ahead and at the top of the display.
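  • Where a device cannot sense rotation, the last known direction of travel can be computed from two successive location fixes. The following is a minimal sketch of that fallback calculation, assuming a flat-earth approximation over small displacements; the function and helper names are illustrative, not part of the disclosure.
    #include <math.h>

    static double deg2rad(double d) { return d * M_PI / 180.0; }
    static double rad2deg(double r) { return r * 180.0 / M_PI; }

    // Compass bearing (0 = north, 90 = east) from the previous fix to the
    // current fix; longitude differences are scaled by cos(latitude) so a
    // degree of longitude covers ground comparable to a degree of latitude.
    double BearingFromTravel(double prevLat, double prevLon,
                             double curLat, double curLon)
    {
        double dLat = curLat - prevLat;
        double dLon = (curLon - prevLon) * cos(deg2rad(prevLat));
        double bearing = rad2deg(atan2(dLon, dLat));
        return (bearing < 0) ? bearing + 360.0 : bearing;
    }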
  • In other embodiments (not shown), objects and ranges within the display area 2601 are represented from a first person point of view, instead of the “birds-eye” perspective currently shown in FIG. 26, where the world anywhere within the user's interaction ranges is displayed. When displaying the game from a first person perspective, what is displayed on display area 2601 is governed by the narrative, the interaction ranges, and the user's current orientation within the game space. That is, the display area 2601 shows the game space as seen from the user's perspective. Typically, but not necessarily, the display area 2601's field of view is limited to a portion of the game space, simulating or resembling actual human vision. Then, only the SPs within interaction ranges that are visible within the current field of view may be displayed by the game. Other variations are of course possible, including displaying the SPs and ranges from the point of view of another user, an SP, or an arbitrary position.
  • In addition, it can be advantageous to perform some “smoothing” calculations to minimize erratic displayed changes in orientation and/or bearing. This is especially useful when a location calculation can return different values even when the sensing device is stationary. For example, even a stationary GPS receiver will indicate different locations over time. This can cause the bearing and/or distance of an SP indication, such as indication 2603, to change regardless of any actual motion by the device. This display error is more noticeable when the user is moving at low velocities. To minimize this type of error, the SPIS can use a variety of dampening algorithms, such as comparing the current calculated bearing or position to previous ones calculated on a running average basis (with older data being de-emphasized). One skilled in the art will recognize that other variations for detecting real device movement versus calculation errors can also be incorporated and the display adjusted accordingly.
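  • As one concrete possibility (an illustrative choice, not the only algorithm the text contemplates), an exponential moving average de-emphasizes older fixes, damping the jitter of a stationary receiver while letting sustained movement come through:
    struct SmoothedFix {
        double lat, lon;
        bool   primed;      // false until the first fix arrives
    };

    // Blend each raw reading into a running average; alpha near 1 trusts
    // the newest fix, alpha near 0 smooths aggressively (older data
    // de-emphasized).
    void SmoothLocation(SmoothedFix* s, double rawLat, double rawLon,
                        double alpha)
    {
        if (!s->primed) {
            s->lat = rawLat;
            s->lon = rawLon;
            s->primed = true;
            return;
        }
        s->lat = alpha * rawLat + (1.0 - alpha) * s->lat;
        s->lon = alpha * rawLon + (1.0 - alpha) * s->lon;
    }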
  • Interaction B Range 2604 represents the range limit imposed on the user when interacting with an SP with respect to a second type of interaction. As currently shown in FIG. 26, the position of SP indication 2603 on the Device Display area 2601 is such that the user would not be able to perform interaction B, as the SP is outside of the appropriate interaction range (indicated by Range 2604). In the Simple Game embodiment described, interaction B is the ability to manipulate the SP by “capturing” it. Thus, as shown, the user is not currently able to capture the SP. When an SP is captured, its indication 2603 is removed from the display. The capturing interaction may be automatic once the SP is within the range or a particular portion of the range, or capture may additionally require the user's manipulation of a device input control, such as a separate button. The capturing of an SP may advance the game narrative in some way, such as by accumulating points or eliminating an SP from further interactions or other types of interactions.
  • In addition to interactions that have corresponding visual representations, interactions can also have audio characteristics. For example, a predefined sound can be presented to the user when an interaction has been allowed and/or is being executed. The use of audio feedback can also allow the user to monitor for narrative conditions without watching the display area. For example, a distinct sound can be played when an SP is detected within the range of interaction A 2602.
  • Interactions can also have associated corresponding tactile feedback, such as vibrations of a mobile device. For example, when the user captures an SP, the device vibrates. Even further, the device may vibrate differently, depending upon the type of SP captured. One skilled in the art will recognize that combinations of visual, audio, and tactile feedback can also be incorporated.
  • The range limit lines displayed in FIG. 26 are intended to delineate the limits of interaction, though the lines themselves may or may not be displayed to the user. Ranges 2602 and 2604 are currently displayed as circles, but can be other shapes. Though the calculation of whether the SP is within a non-circular range is more complex (as its boundary is more than a single distance from its center), the described area is still used to define the limits of allowed interactions.
  • In addition to deterministic ranges defined by clearly bounded areas, the ranges can be also determined by probabilistic algorithms. For example, there may be interaction areas defined for a particular SP where the likelihood of interaction is increased or decreased, rather than being a binary allowed/not allowed circumstance. In addition, the likelihood of interaction may correspond to a non-linear increase or decrease.
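  • A probabilistic range of this kind might be sketched as follows, with interaction guaranteed inside an inner radius, impossible beyond an outer radius, and a non-linear (here quadratic, an arbitrary illustrative choice) falloff in between; the names and radii are hypothetical:
    #include <stdlib.h>

    bool InteractionAllowed(double distanceToSP,
                            double innerRadius, double outerRadius)
    {
        if (distanceToSP <= innerRadius) return true;   // always allowed
        if (distanceToSP >= outerRadius) return false;  // never allowed
        // Quadratic falloff: likelihood drops non-linearly with distance.
        double t = (outerRadius - distanceToSP) / (outerRadius - innerRadius);
        double probability = t * t;
        return (rand() / (double)RAND_MAX) < probability;
    }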
  • Interaction ranges can be established using a variety of techniques. In one embodiment of the Simple Game, SPs have identical ranges for each of the interaction types. These can be determined by a narrative author either as fixed in distance or as a function of some other game value. In some embodiments, the ranges can be selected by the user. The selection process can occur from within the game (such as choosing specific values or general game descriptions), or before the game is invoked (such as selecting from a collection of games). For example, two Simple Games, otherwise the same, could be labeled “Backyard Spook” and “Game Field Spook”. When the former is selected, the game area, including the SP paths and interaction ranges, is determined to be smaller than in the latter.
  • Also, the Simple Game can be enhanced to make ranges for the same type of interaction different for different SP types or specific SP instances. For example, SPs may have an attribute whose values are associated with “visibility”. When the attribute has a low associated value, the corresponding SP may only be visible when the user's real-world location is close to that associated with the SP.
  • Establishing and Managing Location
  • Typically, the Simple Game uses a single GDF (game defining fix) to determine location and establish what it means in a game context. For example, when the Simple Game is initially invoked at a new location, the system determines an absolute location of the mobile device. This location is used to determine the game playing area, typically with the area centered on the GDF, although one skilled in the art will recognize that other variations are possible, for example, defining the game playing area with the GDF at a particular distance from the playing area.
  • Multiple GDFs are also supported. For example, two points can be determined, the first serving as the center of the game, the second defining the maximum distance that SPs can occur from the first. Additional GDFs can be used to define non-circular game playing areas.
  • The Simple Game must have at least one SP, though multiple SPs may be created by the SPIS either simultaneously or serially. For example, the creation of a new SP may be predicated on the capture of an existing SP. The initial location of an SP (or locations of multiple SPs) is typically determined relative to the GDF. SP paths can be based solely on the original GDF, or can be based solely or in part on the device's real-world location. SPs can move, following paths defined in a variety of ways, including formulaically defined shapes (e.g., ellipses or polygons), explicit paths (along a series of relative or absolute locations), pseudo-randomly generated paths, or paths determined using other software algorithms that mathematically derive or select location values.
  • Even though path-defining functions may be simple and the same for multiple invocations of the game, the resulting SP paths can significantly differ as the user's movement and interaction with SPs differ. Also, SP paths can be defined in terms of seed conditions (e.g., a pseudo-random number generator seed). They can also be defined in terms of other changing states, such as real-world phenomena (weather, the stock market), dynamic game narratives, and SP conditions, such as those defined by SP attributes stored in a data repository.
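  • As a minimal sketch of one formulaically defined path, the routine below places an SP on an ellipse around its center as a function of elapsed time, with a seed phase (for example, a pseudo-random bearing chosen at initialization) differentiating invocations of the game. The field naming echoes the CSP_Moving members of Tables 3 and 4, but the routine itself is illustrative rather than disclosed:
    #include <math.h>

    void EllipsePathPosition(double centerX, double centerY,
                             double radiusX, double radiusY,
                             double seedPhaseDeg,   // e.g., rand() % 360 at init
                             double elapsedSeconds, double secondsPerOrbit,
                             double* outX, double* outY)
    {
        double angle = seedPhaseDeg * M_PI / 180.0 +
                       2.0 * M_PI * (elapsedSeconds / secondsPerOrbit);
        *outX = centerX + radiusX * sin(angle);   // position on the ellipse
        *outY = centerY + radiusY * cos(angle);
    }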
  • User Commands
  • The example Simple Game provides the user with four commands: 1) initiate game, which can be an explicit command indicated by user manipulation of device input controls, or an implicit command such as by activating the device or other software program; 2) change real-world location, which is accomplished implicitly by the user physically changing locations as determined by the SPIS using data from a location sensor; 3) capture SP, which can be automatic or manually controlled; and 4) end game, which, although typically always an option, can happen without user control, such as when a timer expires or other system-aware condition is satisfied.
  • Of course many other commands can be provided, such as the following example commands:
      • stun SP—stop all changes to the SP's real-world location for a finite duration;
      • accumulate points—points per SP can vary according to SP characteristics such as the difficulty of capture;
      • incremental device capability enhancements—for example, capturing an SP may provide additional stun capability;
      • or other game enhancements, such as saving scores or displaying statistics.
  • One skilled in the art will recognize that, even given the minimum narrative requirements as defined above, a Simple Game can support a variety of user commands, potentially limited only by one's imagination.
  • Game Playing Area
  • The example Simple Game described can be used in any arbitrary but constrained geographic area served by location determination technology. For example, it can be played in a neighborhood park or other safe open space that has sufficient unobstructed view of the sky to allow the reception of GPS satellite broadcasts if a GPS type mobile device is used.
  • The size of the game playing area can be predetermined by the game creators, or it can be selected by the user when the game begins. For example, the user can enter or select from among a set of values defining the maximum distance that SPs can differ from the GDF (absolute game defining fix location determined at game initialization).
  • Simple Game Types
  • The example Simple Game can support various narrative configurations. One example narrative configuration is defined as a ‘sudden death’ configuration, which runs for a fixed time. For example, a quick version of the game may run for no more than a limited period, such as three minutes from game initiation. At the end of that duration, the user's ability to interact with SPs is eliminated, and the user is presented with a score corresponding to points associated with the capture of SPs. Potentially, a running score tabulation can also be maintained by the SPIS.
  • The duration of a sudden death configuration may be predetermined and fixed by the game creator, or may be selected by the user. In either case, it may be preferable to have the duration established by the time the user is first allowed to interact with the SPs.
  • In a typical sudden death narrative, a fixed number of SPs are always present in the vicinity of the game defining fix (GDF). Of course, if the user strays sufficiently far from the vicinity of the GDF, these SPs may not be visible. Five SPs, each of a different type, with each type associated with unique characteristics such as velocity values, work well. When an SP is captured, the user accumulates points. SPs with higher velocities provide greater point values. When an SP is captured, a replacement of the same type is created elsewhere in the vicinity of the game defining fix. In an alternative narrative, SPs can be associated with predetermined locations; however, this may limit the ability to play the game when and where a user may desire.
  • Another example narrative configuration supported by embodiments of the Simple Game is the “exterminator” configuration. Unlike the sudden death version, a fixed total number of SPs is made available for capture in any one game. The SPs may be available immediately and simultaneously when the game is invoked, or they may have narrative dependencies, such as a requirement that particular ones be captured before others are made available. Accordingly, the game ends when all SPs are captured. At that time, the total elapsed time necessary to capture all the SPs may be displayed.
  • Data Storage for a Simple Game
  • In general, the SPIS supports narrative constructs complex enough that it is often desirable to maintain permanent and separately maintained data stores to manage the logic and state of the narrative. However, in Simple Game narratives there is typically little data needed; thus, it can be convenient to integrate the required variables into the simulation engine's memory space. This can be done with variables created at game initiation and kept in volatile memory. For example, aside from accounting for the actions of the user and SPs, a Sudden Death narrative configuration can be implemented using simple logic based on variables that maintain a value associated with elapsed time and accumulated points. Similarly, an Exterminator narrative configuration uses elapsed time and possibly precedence rules for SPs (i.e., which SPs need to be captured before others are made available for interaction), which are easily maintained as part of the simulation engine's volatile memory.
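  • A minimal sketch of such volatile state, with illustrative field names, might be no more than a single structure created at game initiation and discarded at game end:
    struct SimpleGameState {
        double gdfLat, gdfLon;      // game defining fix, set at initiation
        long   startTimeSeconds;    // supports elapsed-time logic (sudden death)
        int    accumulatedPoints;   // running score
        int    spsRemaining;        // exterminator: game ends at zero
        bool   spUnlocked[5];       // exterminator: simple precedence flags
    };
    // SimpleGameState game = {};  // volatile; no permanent data store required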
  • Similar to the storage considerations for narrative related data, the SP characterizations and states for the Simple Game do not require separately maintained data stores per se. For example, SP actions can be controlled with narrative logic that relies exclusively on SP location values. Optionally, a value of an additional attribute of an SP such as its type, can be used by the simulation engine to select logic of an SP path determination formula. These values can be stored in volatile memory as well.
  • Enhancements for Multiple Players
  • When played on a single device, the Simple Game typically requires separate users to take turns (that is, they do not use the device simultaneously). However various game enhancements that facilitate multi-user play on a single device can be implemented, such as performance rankings.
  • Though users can keep track of their game performance themselves, it may be convenient to have the system assist them. One mechanism is to have the system display the total accumulated points for each game playing session. These totals may include all of the scores (perhaps organized by sequence or user identification) or a selection of scores (such as the top 10). The scores can be associated with particular players.
  • Other performance data can be presented, such as a history of the real-world path that the user followed, or other metrics describing the history of the narrative, such as when or where the SP interactions occurred, and characteristics of the SPs themselves. For example, a graphic representation of the play area, with indications of user and SP paths, times of capture, and SP characteristics, can be made available during or at the completion of games. Alternatively, the information can be displayed in a textual fashion, such as using a table that lists user, SP, captures (time and location detail), etc.
  • Performance data, besides often being interesting and entertaining, can be used to augment multi-player scenarios. For example, the path that a participant follows can have its total length calculated, shared and displayed. The length of the path can be part of a performance rating by, among other techniques, having points deducted for length greater than the player's prior and/or established “par” value. Alternatively, the shape of the path can be meaningful. For example, the goal of the games can be, in whole or part, to achieve a path that does not cross itself.
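  • For instance, the total path length can be computed by summing the great-circle distances between successive recorded fixes, here via the haversine formula; the earth-radius constant and the array layout of recorded fixes are illustrative assumptions:
    #include <math.h>

    #define EARTH_RADIUS_M 6371000.0   // mean earth radius (assumption)

    // Great-circle distance between two fixes via the haversine formula.
    static double HaversineMeters(double lat1, double lon1,
                                  double lat2, double lon2)
    {
        double rl1 = lat1 * M_PI / 180.0, rl2 = lat2 * M_PI / 180.0;
        double dLat = rl2 - rl1;
        double dLon = (lon2 - lon1) * M_PI / 180.0;
        double a = sin(dLat / 2) * sin(dLat / 2) +
                   cos(rl1) * cos(rl2) * sin(dLon / 2) * sin(dLon / 2);
        return 2.0 * EARTH_RADIUS_M * atan2(sqrt(a), sqrt(1.0 - a));
    }

    // Total length of the user's recorded path; points might then be
    // deducted when the length exceeds the player's established "par".
    double PathLengthMeters(const double* lats, const double* lons, int nFixes)
    {
        double total = 0.0;
        for (int i = 1; i < nFixes; i++)
            total += HaversineMeters(lats[i-1], lons[i-1], lats[i], lons[i]);
        return total;
    }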
  • In addition, the Simple Game can be enhanced with features that facilitate simultaneous competitive play on multiple devices, even when there is no communication between the devices. These include those for single devices described previously, and others such as creating situations where there is competition for location. For example, multiple similar games running on separate devices can be run at the same time within the same game area and thereby create competition for capturing SPs at particular locations.
  • In general, when multiple SPIS-enabled games are started at the same time, using the same GDF, the same deterministic SP, and the same narrative definitions, the ranges and real-world locations of allowed SP interactions will be identical (give or take errors caused by the devices, as described above). Further, when an interaction range is relatively small (for example, capture can be within a few meters) and occurs at the same place and time, users will often try to occupy the same location at the same time, resulting in enjoyable game conflict. Even stationary SPs can cause significant competition when the ability to interact with them is limited in time.
  • The multi-player enhancement just described is typically implemented with deterministic game functions, such as predetermined or formulaically repeatable SP paths. However, randomization or user-affected SP paths can still be used during game play, even though they may create location competition only by coincidence (the SP paths vary with the users). When randomization is used to determine SP paths, the frequency of location competition can be increased if the SP path or game area is restricted in some manner.
  • An important aspect of providing this type of competition between players in a convenient fashion is the use of the single game defining fix as described above. When each of the users occupies the same location when generating their respective GDFs, they can precisely share the game space.
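  • One way this sharing could work in practice, sketched below under stated assumptions, is to derive the pseudo-random number generator seed deterministically from the shared GDF, so that devices started at the same spot compute identical “random” SP paths without ever communicating. The quantization granularity and hash constants are arbitrary illustrative choices, not disclosed values:
    #include <math.h>
    #include <stdlib.h>

    // Derive a PRNG seed from the GDF. Quantizing to ~1e-4 degree bins
    // (roughly 11 m of latitude) hedges against small per-device GPS
    // differences; the multipliers are an arbitrary illustrative hash.
    unsigned SeedFromGDF(double latitudeGDF, double longitudeGDF)
    {
        unsigned long latQ = (unsigned long)(long)floor(latitudeGDF * 10000.0);
        unsigned long lonQ = (unsigned long)(long)floor(longitudeGDF * 10000.0);
        return (unsigned)((latQ * 73856093UL) ^ (lonQ * 19349663UL));
    }

    // Each device calls srand(SeedFromGDF(latitudeGDF, longitudeGDF)) once
    // its GDF is established; subsequent rand()-driven SP paths coincide.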
  • A further enhancement is to implement for each player a delay between creating the GDF and actual game play. Techniques for delaying and synchronizing game play include having the games begin at a predetermined time, or having the users manually begin the games simultaneously.
  • When the devices are able to communicate, additional enhancements can be made to enhance competitive play between multiple players, each having a mobile device. Typically, implementations of such enhancements rely on continuous or rapid inter-device data exchange. Alternatively, multi-user play can include features that rely on the physical exchange of memory media containing simulation-relevant data. For example, a device with removable memory media can store a digitally secure definition of a captured SP. By allowing another player with a similar system to access the device, that player can transport or morph the SP to their device.
  • Additional Enhancements
  • Many other enhancements can be incorporated into a Simple Game or into other SPIS-enabled games. For example, another technique for determining location or for recognizing objects is to make available environmental or user information tokens that can be detected and identified optically by the mobile device and incorporated as part of the game. For instance, using built-in cameras, a physical pattern on a real-world object can be detected by the mobile device. Patterns may be stored, for example, using known bar coding techniques or other mechanisms for creating identifiable tokens. A pattern, once detected, can be used, for example, to identify a real-world location from a table or other data structure stored as part of the game or device that associates such patterns with locations. In addition, such detected patterns may be used to identify objects, such as SPs, that have significance within the game narrative. For example, the identification of a particular object may be interpreted by the narrative aspect to cause an SP to appear in a particular location. Other ramifications can of course result, dependent upon the narrative and other configurative aspects of the game.
  • Enhancements to the Simple Game can include those that maintain the user's full participation in the shared narrative for extended periods even when the user is connected only intermittently. For example, the SP path determining logic can be simple enough to run on either the mobile device or a remote computing system running simulation engine software. This allows the user to interact with SPs in their vicinity by using the narrative logic executing on their own machine. However, other users who maintain connection with the remote system will likely lose the ability to track the location of the disconnected mobile user.
  • Automatic Determination of Location (Auto-Localization)
  • Games, simulators, training programs, guiding programs, and other computerized systems that associate real-world locations with computer-controlled system aspects need mechanisms to unambiguously determine locations and incorporate them into system logic. This need presents a variety of challenges to system creators, including the difficulty of verifying that a particular locale is appropriate for the system's operation. In particular, verifying the safety of a locale is important and difficult without an in-person inspection of the area. The problem becomes more acute for systems that seek to rely on especially precise locations.
  • Currently some location-based systems rely on generating the necessary location data by making estimations based on Geographic Information System (GIS) reference data. This technique often fails due to its lack of location resolution and/or the limited or dated nature of the environmental context information. Other systems rely on in-person inspections by individuals who assess locale appropriateness and, if necessary, determine location specifications. This technique can be impractical if, for example, the system is intended to be used at many or undetermined locations.
  • For location-based systems that only provide relatively low location-resolution, determining an unambiguous location is less of a problem. For example, when the minimum range for a type of SP interaction of interest to the user is over 100 meters, it is easy to localize an application. An accurate city map referenced with a location grid, e.g., longitude/latitude, will often suffice to determine locations on which to base an SP location in such a system. For example, the intersection of major cross streets, an object such as a fountain in a park, or a geographic feature such as a mountain peak will provide sufficient accuracy for this low level of resolution.
  • In contrast, location-based systems that make use of higher location-resolution (finer granularity), such as less than 20 meters, can support types of interactions that are not so easily localized. Consider an SP that can only be interacted with when a user is in close proximity. This requires the user to occupy a relatively small area where the precision of commonly available maps is typically insufficient. Further, maps rarely convey significant situation context such as attendant physical hazards like automobile or pedestrian traffic; environmental details of potential interest to game creators such as street signage; or details relevant to social context such as a gathering place for particular types of activities.
  • The SPIS provides an improved technique termed “auto-localization.” The Simple Game provides an example system that uses auto-localization. The techniques are also operable with similar location-based systems that employ simulated phenomena, and are suitable for use in other location-based systems that may or may not be associated with simulated phenomena.
  • The SPIS auto-localization techniques allow a user of the location-based system to localize the system at run time, with no explicit action by the operator of the device. Auto-localization techniques can determine location using absolute (real-world) location information or using a process termed “dead reckoning,” which deduces location based upon changes in location or other sensed measurements. Auto-localization using dead reckoning is described further below.
  • Auto-Localization Using Absolute Locations
  • When auto-localizing using absolute locations, the mobile device determines an initial (current) location in the real world, fixes a game “grid” (which defines a game area for the purpose of that game), and then executes the game, including determining and interpreting movement of the mobile device user(s) and SP(s) according to the localized grid. For example, the Simple Game requires a single point on which to establish the association between the virtual playfield and the user's immediate vicinity.
  • FIG. 27 is an example diagram of an example location-based system's reference grid with indication of an initial location determination. The columns A-D and rows 1-3 represent different location references. Although many different references can be used, the columns shown in FIG. 27 may be associated with longitudinal values and the rows with latitudinal values. In one example embodiment, the grid is used to define the locations of a user of a mobile location-determining device and the one or more SPs of the narrative. Before the game is initialized by the mobile device for play in a particular real world location, the grid is virtual and arbitrary and not associated with a particular locale or environment. After initialization, it is unambiguously associated with the real world and is in this state referred to as “fixed” or having an “absolute location.”
  • When the game is initialized, a current location is determined and stored in a data store (for example, a single variable). When used to associate the game area with the real world, this current location is referred to as the Game Defining Fix (GDF). The GDF can be determined using a single latitude/longitude reading, by taking an average of “N” readings for a more precise GDF, or by any other method that generates a more precise latitude/longitude from several readings. A specific technique for establishing and using a GDF is described further below with respect to the pseudo-code of Table 3. In FIG. 27, the shaded area delimited by coordinates B2 represents the GDF. Once a GDF is established, the game grid is a fixed, unmoving absolute reference to the real world within which the game narrative is played out.
  • For example, in the Simple Game described above, a user may desire to interact with a particular SP. Whether that interaction is permitted depends upon the proximity between the user's location and that of the SP. Using the GDF and the fixed game grid, the game's narrative algorithm(s) can begin determining the locations of SPs, and thus make the determinations of what interactions are permissible by the user.
  • FIG. 28 is an example diagram of an example location-based system's reference grid with indication of an initial SP location. In FIG. 28, the position A1 represents an initial location of the SP (relative to the real world) on the game grid as determined by the narrative and other game components.
  • Once the initial position(s) of the SP(s) are determined, the game then determines what interactions are possible. FIG. 29 is an example diagram of an example location-based system's reference grid with indication of an initial interaction area. The circle represents the view seen by the user. The radius of the circle 2901 defines a first interaction range. For purposes of ease of explanation, an interaction is described as a “view” capability, although one skilled in the art will recognize that the techniques described herein are operable with many different types of interactions. Also, for ease of description, assume that the user's initial location is synonymous with the GDF upon game invocation (soon after the GDF and SPs are determined). Thus, the first interaction range 2901 is shown centered on the GDF.
  • Because the SP location (e.g., at position A1) falls at least partially within the interaction range 2901, the user is permitted to receive all or part of a visual representation of the SP's location relative to the user's location. Thus, a display of the SP, such as that illustrated in FIG. 26 (SP 2603) may be provided.
  • In a typical game, as the user moves (the mobile device moving with the user), the game narrative advances to make additional interactions and/or additional SPs available. FIG. 30 is an example diagram of an example location-based system's reference grid with an indication of a subsequent interaction area. In FIG. 30, position C4 represents the user's location after the user has moved. This is the user's current location, as sensed by the user's mobile device and which may be stored in a variable. Once the user moves (to position C4), the additional interaction area 3001 is made available. Note that the circular view range has remained centered on the user. At this point the user would be restricted from at least one type of SP interaction (e.g., the first interaction 2901), because the SP's location (e.g., at position A1) is out of range of the viewing area defined by the interaction range 3001. To resume interaction with the SP, the SP must again come within range of the user. This is accomplished by either the user relocating, the SP moving (as potentially defined by the narrative), or both.
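  • The in-range test behind FIGS. 29-30 can be sketched as follows: each time the user's sensed location changes, the degree offsets between the user and the SP are converted to meters on a local flat-earth approximation and compared against the interaction radius. The constant and function names are illustrative assumptions:
    #include <math.h>

    #define METERS_PER_DEG_LAT 111320.0   // approximate (assumption)

    bool SPWithinInteractionRange(double userLat, double userLon,
                                  double spLat, double spLon,
                                  double rangeMeters)
    {
        double dy = (spLat - userLat) * METERS_PER_DEG_LAT;
        double dx = (spLon - userLon) * METERS_PER_DEG_LAT *
                    cos(userLat * M_PI / 180.0);
        return sqrt(dx * dx + dy * dy) <= rangeMeters;
    }
    // When this returns false for the "view" interaction, the SP indication
    // is removed from the display until the user or the SP closes the gap.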
  • Note that the view range can be associated with the fixed game grid without further reference to the GDF, since the grid has been associated with the real world and subsequent positions are detected or calculated relative to this initial location on the grid. Alternatively, a game can operate without an explicit grid, such as that shown in FIGS. 27-30. In that case, SP locations are determined solely relative to the GDF, and user locations are still based on sensed measurements that are correlated to the GDF.
  • The auto-localization techniques can be used with different shapes and sizes of game areas. A game area can be defined in terms of a single GDF and a pre-established shape and size—for example, a circle with a 15 m radius. Alternatively, the user can select the size, or some other method of determining or inheriting a size can be incorporated. For example, after invoking the system and allowing it sufficient time to collect location data for establishing the GDF, the user can then move to the closest hazard. At the hazard, the user can then make an indication (e.g., press a button) to establish the boundary of the game space. The distance from this location to the GDF can be used as the maximum radius of a game space. Alternatively, the user can define an arbitrarily complex game space by moving to multiple locations and making indications to establish a fix at each location. These locations can then be used as the vertices of a polygon and an arbitrary game shape thus established. Though the shape of the game space can be arbitrarily complex, the game can still rely on a single fix (the GDF) to define the game grid. For example, the first fix can be used as the GDF, and all other points calculated relative to it.
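  • Given vertices recorded at each of the user's indications, whether a location lies inside the resulting polygonal game space can be tested with standard even-odd ray casting, sketched below with an assumed array layout for the recorded fixes:
    // Even-odd (ray casting) point-in-polygon test; vLat/vLon hold the
    // fixes recorded at each button press, used as polygon vertices.
    bool InsideGameSpace(double lat, double lon,
                         const double* vLat, const double* vLon, int nVertices)
    {
        bool inside = false;
        for (int i = 0, j = nVertices - 1; i < nVertices; j = i++) {
            // Does the horizontal ray from (lat,lon) cross edge (j -> i)?
            if (((vLat[i] > lat) != (vLat[j] > lat)) &&
                (lon < (vLon[j] - vLon[i]) * (lat - vLat[i]) /
                       (vLat[j] - vLat[i]) + vLon[i]))
                inside = !inside;
        }
        return inside;
    }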
  • Game spaces may also differ depending upon the number of players supported by the game. Game spaces for a single player typically include the following attributes: a GDF is determined only when needed; the GDF does not require permanent storage; there is no explicit user or game/trainer developer action necessary to establish the game space; and the game space can be determined using one or more absolute location determinations. For a multi-player game, the game logic typically uses one or more GDFs; hence, the game space needs to be determined accordingly. The game space may be defined, for example, as the largest area encompassing all of the game spaces of the individual players. Other definitions are also possible.
  • Note that a GDF can be determined at game initialization and can be stored for many different uses. For example, it can be used for a single play of a game (a game “session”), or stored for subsequent games. Also, it can be used by other location games and thus shared. In addition, a GDF can be supplied externally, such as by the user manually entering coordinates, place names, or other location identifiers associated with a location model used by the game logic. If external data communication is supported (e.g., via a data transmission receiver or removable media reader), the game defining location can be provided by sources other than the user or game provider. Also, a GDF can be shared among users.
  • Table 3 is pseudo-code that illustrates an example technique for establishing and using a GDF as part of auto-localization.
    TABLE 3
    #define NUM_READS 1   // must be one or greater

    double latitudeGDF, longitudeGDF;

    Main()
    {
        ...
        DoGameDefiningFix();
        // after the GDF has been done,
        // localize the SPs to the GDF
        LocalizeSPs(latitudeGDF, longitudeGDF);
        ...
        // play the game
        ...
    }

    void DoGameDefiningFix()
    {
        double latitude, longitude;
        latitudeGDF = longitudeGDF = 0;
        for (int count = 0; count < NUM_READS; count++)
        {
            // this function gets the current
            // lat/long from the GPS
            GetCurrentLatLong(&latitude, &longitude);
            latitudeGDF += latitude;
            longitudeGDF += longitude;
        }
        // average the readings for a more precise fix
        latitudeGDF = latitudeGDF / NUM_READS;
        longitudeGDF = longitudeGDF / NUM_READS;
    }

    void LocalizeSPs(double latitude, double longitude)
    {
        int index;
        for (index = 0; index < glb.spVec.Size(); index++)
        {
            // associate each SP with the fixed game grid
            glb.spVec[index]->SetLocation(longitude, latitude);
            glb.spVec[index]->m_center.m_x = longitude;
            glb.spVec[index]->m_center.m_y = latitude;
            glb.spVec[index]->InitialPosition();
        }
    }

    void CSP_Moving::InitialPosition()
    {
        double xstep, ystep;
        int dir, bearing;

        switch (m_type)
        {
        case Random:
            // start 3/4 of the way out in some random direction
            m_xTemp1 = 0;
            m_xTemp2 = 0;
            bearing = rand() % 360;
            dir = 1;
            xstep = sin(deg2rad(bearing)) * (3*m_radius/4) * ONEMETER;
            ystep = cos(deg2rad(bearing)) * (3*m_radius/4) * ONEMETER;
            SetLocation(m_center.m_x + xstep, m_center.m_y + ystep);
            break;

        case Star:
            // start half way out in some random direction
            bearing = rand() % 360;
            dir = 1;
            xstep = sin(deg2rad(bearing)) * (m_radius/2) * ONEMETER;
            ystep = cos(deg2rad(bearing)) * (m_radius/2) * ONEMETER;
            SetLocation(m_center.m_x + xstep, m_center.m_y + ystep);
            m_iTemp1 = bearing;
            m_iTemp2 = dir;
            break;

        case Box:
            int xdir, ydir;
            // start at a random corner of the box
            xdir = (rand() % 2 ? 1 : -1);
            ydir = (rand() % 2 ? 1 : -1);
            SetLocation(m_center.m_x + m_radius/2 * ONEMETER * xdir,
                        m_center.m_y + m_radius/2 * ONEMETER * ydir);
            // choose an initial direction of travel along a single axis
            xdir = rand() % 3;
            ydir = rand() % 3;
            if (xdir == 2)
                xdir = -1;
            if (xdir == 0)
            {
                if (ydir == 0)
                    ydir = 1;
                else if (ydir == 2)
                    ydir = -1;
            }
            else
                ydir = 0;
            m_iTemp1 = xdir;
            m_iTemp2 = ydir;
            break;
        }
    }
  • Auto-Localization Using Dead Reckoning
  • As mentioned, auto-localization can be performed without relying on a precise location of the user in the real world. Instead, using a “dead reckoning” technique, the game system produced using the SPIS can automatically and dynamically determine location by knowing how fast and in what direction the user has traveled since the last location request. For this technique to work, the game determines a start position, which may be based upon real-world values such as latitude and longitude, or which may be an arbitrary position such as coordinate (0,0). The game area is presumed to surround the user (the start position) when the game begins, and a game grid is established. Using dead reckoning, the game then tracks the user's motion in the real world relative to this starting position in the game. As with absolute location auto-localization techniques, the game determines and interprets the movement of the mobile device user(s) and SP(s) according to the established (explicit or implicit) game grid.
  • Accordingly, FIGS. 27-30 also can represent a game grid that has been established and is modified using a dead-reckoning auto-localization technique. As described, the grid displayed in FIG. 27 represents a reference system associated with physical locations (e.g., longitudinal values, latitudinal values, or any other coordinate system). At the beginning of game play, using dead reckoning techniques, the grid is fixed to the user with the user at the center, even though the user can be anywhere in the real-world playfield used for the game. It is not necessary for the mobile device to take a latitude/longitude reading to establish a Game Center (GC). The GC can be assumed to be any longitude and latitude, or could be (0,0), depending on the coordinate system used to track the user and SPs. The system then tracks the user's relative motion from the GC (real or arbitrary) at the start of the game. Once the GC is established (instead of a GDF), the game logic and narrative continue, as described above with respect to FIGS. 28-30, to track movement of the user and determine SPs and interaction areas 2901 and 3001. Game area shapes and sizes can also be established in manners similar to those described above with respect to absolute location auto-localization.
  • Table 4 is pseudo-code that illustrates an example technique for using latitude and longitude to establish a game center (GC) for use with auto-localization.
    TABLE 4
    // uses the latitude and longitude grid
    // for the user location
    double latitudeGC, longitudeGC;

    Main()
    {
        ...
        SetGameCenter();
        // after the GC has been set,
        // localize the SPs relative to the GC
        LocalizeSPs(latitudeGC, longitudeGC);
        ...
        // play the game
        ...
    }

    void SetGameCenter()
    {
        // since we are using dead reckoning for user tracking,
        // any fixed point will do; here the author's house
        // serves as the game center
        latitudeGC = 47.69088333;
        longitudeGC = -122.18996667;
        return;
    }

    void LocalizeSPs(double latitude, double longitude)
    {
        int index;
        for (index = 0; index < glb.spVec.Size(); index++)
        {
            // associate each SP with the game grid
            glb.spVec[index]->SetLocation(longitude, latitude);
            glb.spVec[index]->m_center.m_x = longitude;
            glb.spVec[index]->m_center.m_y = latitude;
            glb.spVec[index]->InitialPosition();
        }
    }

    void CSP_Moving::InitialPosition()
    {
        // identical to the routine in Table 3
        double xstep, ystep;
        int dir, bearing;

        switch (m_type)
        {
        case Random:
            // start 3/4 of the way out in some random direction
            m_xTemp1 = 0;
            m_xTemp2 = 0;
            bearing = rand() % 360;
            dir = 1;
            xstep = sin(deg2rad(bearing)) * (3*m_radius/4) * ONEMETER;
            ystep = cos(deg2rad(bearing)) * (3*m_radius/4) * ONEMETER;
            SetLocation(m_center.m_x + xstep, m_center.m_y + ystep);
            break;

        case Star:
            // start half way out in some random direction
            bearing = rand() % 360;
            dir = 1;
            xstep = sin(deg2rad(bearing)) * (m_radius/2) * ONEMETER;
            ystep = cos(deg2rad(bearing)) * (m_radius/2) * ONEMETER;
            SetLocation(m_center.m_x + xstep, m_center.m_y + ystep);
            m_iTemp1 = bearing;
            m_iTemp2 = dir;
            break;

        case Box:
            int xdir, ydir;
            // start at a random corner of the box
            xdir = (rand() % 2 ? 1 : -1);
            ydir = (rand() % 2 ? 1 : -1);
            SetLocation(m_center.m_x + m_radius/2 * ONEMETER * xdir,
                        m_center.m_y + m_radius/2 * ONEMETER * ydir);
            // choose an initial direction of travel along a single axis
            xdir = rand() % 3;
            ydir = rand() % 3;
            if (xdir == 2)
                xdir = -1;
            if (xdir == 0)
            {
                if (ydir == 0)
                    ydir = 1;
                else if (ydir == 2)
                    ydir = -1;
            }
            else
                ydir = 0;
            m_iTemp1 = xdir;
            m_iTemp2 = ydir;
            break;
        }
    }

  • GPS Transient Error Suppression
  • Though location-based systems can be conveniently localized using a single location determination, transient errors inherent in GPS systems can cause the localized reference position to shift suddenly and erratically. This can confuse and inconvenience the device operator by, among other effects, causing the visual representation of a location-based simulated phenomenon to move independently of the SP's intended position and/or motion. Further, GPS systems not only experience occasional erroneous location determinations, but are also known to produce erroneous latitude/longitude readings in patterns. In some cases, these patterns constitute a series of erroneous readings that can cause the visual representation of location-based SPs to drift for periods extending over multiple seconds.
  • The SPIS location determination techniques also provide a mechanism for reducing, if not eliminating, many such GPS transient errors. FIG. 31 is an overview flow diagram of an example Determine Location routine that incorporates GPS transient error suppression. A game typically executes the Determine Location routine to determine the user's location when the mobile device is GPS-equipped. In summary, the Determine Location routine uses the GPS to detect location and then compares this detected location to a computed location that takes prior positions into account in order to minimize transient errors.
  • More specifically, in step 3101, the routine reads the GPS data to establish a GDF as described above. In step 3102, this coordinate value is stored in a data store, for example in the variable “Previous_Loc.” Steps 3103-3108 are executed each time the game needs to determine the user's location (once the game has begun). They may, for example, be invoked as an interrupt routine, or called from or as part of an event handler for the game. In step 3103, after the game has begun, the GPS (e.g., longitude/latitude) data is again collected (detected or determined in whatever manner is appropriate to the device) and stored for use as a (potential) current location position. The GPS also typically collects velocity and bearing values along with the longitude/latitude data. In step 3104, the GPS-reported velocity and bearing are used to derive a calculated location based upon the prior stored location. Specifically, a current location position is calculated as the previous location position (stored in the variable “Previous_Loc”) plus the offset implied by the reported bearing and velocity. In some embodiments, an additional test can be incorporated whereby reported velocities with low values (e.g., less than 1 knot) are rounded to zero. Then, in step 3105, the two potential current location positions (from steps 3103 and 3104) are compared. The difference between the two positions is compared against a minimum threshold. When the difference is greater than the threshold, the routine continues in step 3106 to account for GPS error; otherwise it continues in step 3107 to use the GPS-determined location. In step 3106, the routine sets the current location (stored in “Previous_Loc”) to the calculated current location position from step 3104. In step 3107, the routine sets the current location position to the GPS-detected/reported current location position from step 3103. In step 3108, the routine determines whether the end of the game has been indicated, for example, by the user, narrative, system administrator, or some other mechanism, and, if so, ends the game; otherwise it continues to the beginning of the loop in step 3103 to process the next location.
  • Note that the minimum threshold value used in step 3105 can be a single pre-established value (such as 3 meters). Alternatively, the threshold can change over time. For instance, in games played for extended periods over large game spaces, it is possible that extended consecutive use of locations determined solely from the calculated positions of step 3104 could drift sufficiently far from those of step 3103 that step 3107 may never execute. Such behavior is undesirable, as the technique of FIG. 31 is intended to suppress extreme GPS latitude/longitude errors that are transient and eventually cease. Therefore, by temporarily increasing the threshold and allowing step 3107 to execute at least once, such drift can be eliminated. Since this can cause an undesirable sudden shift in the reference grid (and thus be noticeable to the user), the drift could instead be adjusted for gradually, in multiple steps.
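  • The following C++ sketch is one possible rendering of the loop body of FIG. 31; the GpsFix structure and the ReadGps, OffsetBy, and DistanceMeters helpers are hypothetical stand-ins for whatever interface the device's GPS actually provides.
    // Sketch of the transient-error suppression loop of FIG. 31. GpsFix
    // and the ReadGps/OffsetBy/DistanceMeters helpers are hypothetical
    // stand-ins for the device's GPS interface.
    struct GpsFix {
        double lat, lon;     // reported position (degrees)
        double speedKnots;   // reported velocity
        double bearingDeg;   // reported bearing (degrees from north)
    };

    GpsFix ReadGps();                                 // hypothetical
    GpsFix OffsetBy(GpsFix from, double speedKnots,
                    double bearingDeg, double sec);   // hypothetical
    double DistanceMeters(GpsFix a, GpsFix b);        // hypothetical

    const double kThresholdMeters = 3.0;  // minimum-difference threshold

    // Steps 3101/3102: read the GPS once to establish the GDF and store
    // it, e.g. previousLoc = ReadGps(), before game play begins.
    static GpsFix previousLoc;

    GpsFix DetermineLocation(double elapsedSec)
    {
        GpsFix reported = ReadGps();      // step 3103: detected position

        // Step 3104: derive a position from the previous location plus
        // the reported bearing/velocity; round very low speeds to zero.
        double speed = (reported.speedKnots < 1.0)
                           ? 0.0 : reported.speedKnots;
        GpsFix computed = OffsetBy(previousLoc, speed,
                                   reported.bearingDeg, elapsedSec);

        // Steps 3105-3107: if the detected and computed positions differ
        // by more than the threshold, treat the detected fix as a
        // transient error and keep the computed position instead.
        if (DistanceMeters(reported, computed) > kThresholdMeters)
            previousLoc = computed;       // step 3106: suppress the error
        else
            previousLoc = reported;       // step 3107: accept the GPS fix

        return previousLoc;
    }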
  • Testing Location-Based Systems
  • Location-based systems present unique development and testing challenges. One challenge is that they cannot be completely tested until they are actually used in the real-world locations for which they are intended, because their operation necessarily depends on at least one component that reliably determines the real-world location of mobile devices. Another challenge is that it is inconvenient to perform system tests on a device while it is moving.
  • SPIS offers a variety of solutions to these problems. For example, data can be collected by the system while it is being used, and then the data later incorporated for testing purposes, such as by using a simulation generated from the collected data. In general, the testing process includes:
      • 1. Using the system while collecting user and SP path data, then
      • 2. Replaying the simulation using the collected data, or
      • 3. Using the simulation with the collected data.
  • More specifically, a game creation kit can be enabled with a data collection control setting that controls collection of device, SP, narrative, and SP interaction data generated while the game system is in actual use. The data collection control can also be activated by an authorized user or remote administrator while the simulation is running. While system attributes are being collected, there need not be any indication to the user that this is occurring. The collected data is then used to recreate the simulation activities at the convenience of a user or system developer.
  • One way to recreate a simulation is to run it using the recorded location data in place of the data normally produced by the location determination system. By providing the recorded location data, the game follows deterministic SP paths and displays the SPs in the same positions as when the game was originally played. Many advantages are realized, such as allowing the simulation engine to use all of the same data and logic during testing as is used in the field. Also, the developer can see and experience the game from a game user's point of view. The developer can even attempt to interact with the SPs as the recorded user did.
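  • As a rough illustration of this substitution, the sketch below shows a recorded-data source that can be dropped in wherever the live location determination system is normally consulted; the LocationSource interface and class names are invented for the sketch and do not appear elsewhere herein.
    // Hypothetical seam for swapping live GPS data with recorded data
    // during testing; all names here are invented for illustration.
    #include <cstddef>
    #include <vector>

    struct Fix { double lat, lon; };

    class LocationSource {
    public:
        virtual ~LocationSource() {}
        virtual Fix Next() = 0;   // next user position fed to the game
    };

    // Replays fixes captured during an actual play session, then holds
    // the last position; constructed with a single fix, it also serves
    // the stationary-user testing described below.
    class RecordedLocationSource : public LocationSource {
    public:
        explicit RecordedLocationSource(const std::vector<Fix>& fixes)
            : m_fixes(fixes), m_index(0) {}
        virtual Fix Next() {
            if (m_fixes.empty()) { Fix none = { 0.0, 0.0 }; return none; }
            Fix f = m_fixes[m_index];
            if (m_index + 1 < m_fixes.size())
                ++m_index;             // hold the last fix once exhausted
            return f;
        }
    private:
        std::vector<Fix> m_fixes;
        std::size_t m_index;
    };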
  • Another testing enhancement that can be incorporated is to use any valid arbitrary location data, including a single location. For example, a single latitude/longitude value can be used to establish a GDF. The game can then run indefinitely, acting as if a hypothetical mobile user remained stationary. This simplifies testing of SP behavior (e.g., it is easy to observe the SP motion as presented by a mobile device) independently of movement of the user's mobile device. Other testing enhancements that use simulated or hypothetical data can similarly be incorporated.
  • Location-Based Systems for Non-Mobile Users
  • In addition to the mobile scenarios described above, the SPIS also can be used to produce games and other systems that are played or used by one or more users at fixed positions. In this case, Simulated Phenomena are defined in terms of changing real-world attributes (weather, traffic flow, the stock market). As with other SPIS configurations, these types of systems can be used in an individual or multi-player mode. Note that, although many of the examples herein are described with reference to games, the other types of SPIS-based systems mentioned throughout (for example, complex games, simulation scenarios, and charity systems) can be similarly enhanced.
  • To participate in a SPIS-enabled narrative as a game player (in contrast to an audience member), a user typically needs to be associated with a real-world location. This includes users who never change position. These permanently immobile users can have their locations determined in a variety of ways, including self-reporting. That is, a user can indicate to the SPIS-based system an initial location. The system may then associate the self-reported location with a specific user and not allow them to change it.
  • In other situations, the initial physical location of the user is determined based upon a real-world sensor (in contrast to user self-reporting), and the user (device) remains motionless. For the purposes of systems such as the Simple Game, it does not matter if the user never moves, as long as a physical location can be at least initially established with confidence.
  • For example, the user may wish to interact with SPs using a non-mobile computing device. As with other SPIS-based systems, the game system can be configured as a ‘fat client’ on which all its processing occurs, or as a system in which all or part of the simulation engine is remotely accessed via a communications capability. SPs can be shown to be dispersed and moving in large geographic patterns. As any particular SP approaches a user location, the user's ability to interact with the SP increases.
  • In addition to self-reporting techniques and techniques using real-world sensors, a user's location can be established when the user is using a mobile device capable of location determination and coordinated with a stationary device. The device can stay on, continuing to establish location. Sometime thereafter, a data exchange between the mobile device and one or more stationary computing devices can occur, verifying the location of the user to the software being accessed from the stationary device. Alternatively, an exchange of authorizing data can occur from the mobile to the stationary device, and the mobile device can then be turned off. Alternatively, the SPIS-based system can make use of the mobile device tracking to determine a location of the stationary device. It could then allow the user to operate from the fixed location, within the context of the game.
  • Some users may be immobile for significant amounts of time. A SPIS-based system such as a game can be adapted to accommodate such scenarios. For example, suppose that the immobile user is a member of a team. Team members can be associated with a particular geographic location (for example, King County in Washington State), a particular human organization or characteristic (for example, Santa Rosa Junior College Alumni), or some other arbitrary identification (a group calling itself “The Radical Empiricists”). Members of this team can behave cooperatively among themselves to gain points or other game advantages. For example, points can be gained by capturing SPs. Further, the ability to capture an SP may be limited to a range around each user's determined or recorded location. (In this narrative example it is assumed that at least some of the users are immobile at least at some point during the game.)
  • Further suppose that the game narrative in this example generates SPs according to atmospheric pressure, with an increase in SP density (as measured across geographic regions) associated with lower air pressure. This produces an enhanced ability to capture SPs during stormy weather, especially for immobile users who need to wait for an SP's path to bring them within interaction range.
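  • A toy rendering of such a weather-driven population rule follows; the baseline pressure and scaling constant are invented for this sketch and carry no significance beyond illustrating the shape of the computation.
    // Toy mapping from barometric pressure to SP density; the constants
    // are invented for illustration only.
    #include <algorithm>

    // Standard sea-level pressure, in hectopascals.
    const double kBaselineHpa = 1013.25;

    // Returns the number of SPs to maintain in a region: the lower the
    // pressure (the stormier the weather), the denser the SP population.
    int SpCountForRegion(int baseCount, double pressureHpa)
    {
        double drop = kBaselineHpa - pressureHpa;  // positive in storms
        double scale = 1.0 + std::max(0.0, drop) / 20.0;
        return static_cast<int>(baseCount * scale);
    }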
  • While this type of narrative provides valuable entertainment experiences, including an inducement for team members to cooperate in part by observing and anticipating weather systems, it can also create disparities in advantage between geographic regions. Consider, for example, the frequency, intensity, and duration of low-pressure systems in mid-latitude cities like Seattle, versus cities closer to the equator, such as San Diego, that sit within longer-lasting, stable high-pressure systems.
  • One way to address these disparities is to apply explicit handicapping of regions based on historic weather patterns. For instance, the points associated with SPs captured in San Diego can be greater than those associated with SPs captured in Seattle. Another way to address these disparities is through cooperation between teams. For example, the San Diego and Seattle teams could share points or immobile player locations. Another, more implicit, way to address these disparities is to have the rarer good-weather SPs be worth more points.
  • An aspect common to many such scenarios is that the geographic distribution of immobile users can affect a team's ability to easily interact with a large number of SPs. For example, it can be advantageous to have a uniform geographic distribution of users if the SP density tends to be uniform. Alternatively, if SPs tend to cluster, then having more users in areas where the greatest density tends to occur can be advantageous. Thus, in some embodiments, a SPIS-based system is supported that allows immobile users to select a single, discretionary (perhaps user-specified) fixed location to be associated with them.
  • Since SP locations can be anywhere a narrative determines, algorithms based on human population density, travel patterns, or other historic or sensed behavior can be incorporated. For example, the number of SPs moving in New York's Central Park can initially be based on its typical daily visitor count (since there are typically more people on weekends, the SP population would be higher on weekends in this case). Alternatively, the SP count could depend on how many users are currently participating in the narrative. Alternatively, the SP population can be fixed at all times (for example, when an SP is captured, the narrative generates a replacement somewhere in the park or its vicinity). In embodiments that accommodate immobile as well as mobile users, when the system loses track of a mobile device's location, the last known location can be used as the user's current location, regardless of which device was used to connect to the SPIS-based system.
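  • One such population rule might be sketched as follows; the visitor table and divisor are invented for illustration only and show the shape of the computation rather than any recommended values.
    // Illustrative SP population rule based on typical visitor counts;
    // the visitor table and divisor are invented for this sketch.
    int SpPopulation(int dayOfWeek)   // 0 = Sunday .. 6 = Saturday
    {
        // Hypothetical average daily visitor counts for a park.
        static const int kTypicalVisitors[7] =
            { 90000, 40000, 40000, 45000, 45000, 60000, 110000 };
        // One SP per 1,000 typical visitors; weekends are naturally
        // more heavily populated.
        return kTypicalVisitors[dayOfWeek] / 1000;
    }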
  • Other SPIS-Based System Enhancements
  • As briefly mentioned, the SPIS supports non-visual presentations of SPs. For example, a SPIS-based game can be enhanced using audio features. Audio can occur or change when detection occurs, or the audio volume or type of audio can change based on proximity to the user (for example, when something is found within a detectable range). Multiple synchronized audio channels can be employed to indicate bearing. For example, the volume of the right channel (for example, as played by a speaker) can be greater than the volume of the left channel when an SP bears to the right of the user's orientation. Alternatively, a delay of the left channel can be used to simulate propagation delay. Sounds also can differ according to SP type. Distinctive sounds can be associated with SP or device status, or with the attempted or successful occurrence of an SP interaction. For example, in a Simple Game each type of ghost (SP) may have a scream associated with it, presented when the SP is captured.
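  • As one possible rendering of the bearing-dependent channel volumes described above, the sketch below maps an SP's relative bearing to left/right gains; the equal-power pan law is a common audio technique chosen for illustration, not one mandated by the specification.
    // Rough stereo-panning sketch for indicating SP bearing; all names
    // are invented for illustration.
    #include <cmath>

    struct StereoGain { double left, right; };

    // bearingDeg: SP bearing relative to the user's facing direction,
    // in degrees (-180..180, positive to the right of the user).
    StereoGain PanForBearing(double bearingDeg)
    {
        const double kPi = 3.14159265358979323846;
        // Map bearing to a pan position in [0, 1]:
        // 0 = hard left, 0.5 = centered, 1 = hard right.
        double pan = 0.5 + bearingDeg / 360.0;
        if (pan < 0.0) pan = 0.0;
        if (pan > 1.0) pan = 1.0;
        StereoGain g;
        g.left = std::cos(pan * kPi / 2.0);   // equal-power crossfade
        g.right = std::sin(pan * kPi / 2.0);
        return g;
    }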
  • SPIS-based systems can also be similarly enhanced using tactile feedback, such as vibration frequencies, pitch, etc.
  • SPIS-based systems can also be enhanced to be user-modifiable. For example, in some games a user can be allowed to create SPs. An initial location for a user-created SP could be the user's location when the SP is created. SP characteristics also can be user-defined. For example, a hostile SP could be created that seeks out another user. Other variations are of course possible.
  • In some SPIS-based systems, SPs can be shared or generated across users. For example, previously captured SPs could be released (regardless of who captured them), providing the user an ability to populate their current location with one or more SPs that the user did not create.
  • In addition, SPs can, within the context of specific narratives, be unique even among multiple players. That is, there can be but a single instance of an SP that has distinguishing characteristics. For example, a unique SP can be experienced by users as an SP that is visible to any user who is within range, but that disappears when someone captures it. While visible, it can be uniquely identified; once captured, it is never visible again unless released.
  • Also, by using a variety of well-known digital security techniques, a unique SP can be reliably transferred between individual users. Again, since the SP is unique and non-duplicatable, only one user can have control of it at any point in time.
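  • A minimal sketch of enforcing that single-owner property on a trusted server follows; the class and method names are invented, and a real deployment would layer on the digital-security techniques (e.g., signatures and authentication) alluded to above.
    // Minimal sketch of enforcing single ownership of a unique SP on a
    // trusted server; all names are invented for illustration.
    #include <map>
    #include <string>

    class UniqueSpRegistry {
    public:
        // Called by the narrative when the unique SP is first created.
        void Register(const std::string& spId, const std::string& owner)
        {
            m_owner[spId] = owner;
        }
        // Returns true if the transfer succeeded; only the current
        // owner may hand off the SP, so at most one user controls the
        // single instance at any point in time.
        bool Transfer(const std::string& spId,
                      const std::string& fromUser,
                      const std::string& toUser)
        {
            std::map<std::string, std::string>::iterator it =
                m_owner.find(spId);
            if (it == m_owner.end() || it->second != fromUser)
                return false;          // requester is not the owner
            it->second = toUser;       // atomic handoff of ownership
            return true;
        }
    private:
        std::map<std::string, std::string> m_owner;  // spId -> owner
    };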
  • Historically based SPs provide an example of unique SPs having distinguishing characteristics. That is, an SP based on an historic figure can be created by a narrative, and then exchanged between users. Historically based SPs can incorporate a wide variety of distinguishing characteristics, including their actual names and the places they traveled while alive. Other characteristics can similarly be included.
  • All of the above U.S. patents, U.S. patent application publications, U.S. patent applications, foreign patents, foreign patent applications and non-patent publications referred to in this specification and/or listed in the Application Data Sheet, including but not limited to U.S. Provisional Patent Application No. 60/577,438, entitled “SIMULATED PHENOMENA INTERACTION GAME,” filed Jun. 5, 2004; U.S. patent application Ser. No. 10/845,584, entitled “COMMERCE-ENABLED ENVIRONMENT FOR INTERACTING WITH SIMULATED PHENOMENA,” filed May 13, 2004; U.S. Provisional Patent Application No. 60/470,394, entitled “METHOD AND SYSTEM FOR INTERACTING WITH SIMULATED PHENOMENA,” filed May 13, 2003; and U.S. patent application Ser. No. 10/438,172, entitled “METHOD AND SYSTEM FOR INTERACTING WITH SIMULATED PHENOMENA,” filed May 13, 2003, are incorporated herein by reference, in their entirety.
  • From the foregoing it will be appreciated that, although specific embodiments of the invention have been described herein for purposes of illustration, various modifications may be made without deviating from the spirit and scope of the invention. For example, one skilled in the art will recognize that the methods and systems for limiting the range of interacting with simulated phenomena discussed herein are applicable to architectures other than a fat client device. For example, using client-server architectures, the experience of the simulation environment can be distributed across multiple devices. In addition, although described herein with reference to a mobile device, one skilled in the art will recognize that the mobile device need not be transported to work with the system and that a non-mobile device may be used, as long as there is some other means of sensing or associating information about the user's real-world environment and forwarding that information to the SPIS. One skilled in the art will also recognize that the methods and systems discussed herein are applicable to differing protocols, communication media (optical, wireless, cable, etc.) and devices (such as wireless handsets, electronic organizers, personal digital assistants, portable email machines, pagers, etc.), whether or not they are explicitly mentioned herein, provided they support the capabilities of a SPIS-based system.

Claims (20)

1. A method in a location-based computer game environment comprising:
invoking a first interaction with a simulated phenomenon, the first interaction having an associated first range; and
invoking a second interaction with the simulated phenomenon, the second interaction having an associated second range, wherein at least one of the first range and the second range is limited to a dynamically determined value and wherein the first range is not the same as the second range.
2. The method of claim 1, the game environment including a mobile device for playing a game, and wherein at least one of the first interaction range or the second interaction range is based upon a real-world location associated with the mobile device.
3. The method of claim 1, the game environment including a mobile device for playing a game, further comprising:
automatically determining an initial location of the mobile device that is used to associate the game environment to a real-world locale.
4. The method of claim 3 wherein the automatically determined location is based upon a detected real-world location in a physical coordinate system.
5. The method of claim 3 wherein the automatically determined location is based upon an associated latitude or longitude value and subsequent changes to location are determined based upon detected changes in bearing and velocity characteristics associated with the mobile device.
6. The method of claim 3 wherein the automatically determined location is based upon an arbitrary reference point and subsequent changes to location are determined based upon detected changes in bearing and velocity characteristics associated with the mobile device.
7. The method of claim 3, further comprising:
determining an initial position for the simulated phenomenon associated with a location in the real world based upon the automatically determined initial location of the mobile device.
8. A mobile computer game, comprising:
a simulated phenomenon; and
a game engine that executes a first interaction with the simulated phenomenon according to a range that is limited and that executes a second interaction with the simulated phenomenon according to a second range, wherein the first and second ranges are different; and
a presentation component for notification of a result of at least one of the first interaction or the second interaction.
9. The computer game of claim 8 wherein the presentation component presents a notification based upon at least one of visual, audio, or tactile feedback.
10. The computer game of claim 8, wherein the range of the first interaction is limited based upon a real-world location associated with a mobile device used to play the computer game.
11. The computer game of claim 8, wherein the game engine communicates with a mobile device used to play the game in order to automatically fix a game grid associated with the game to a real-world location.
12. The computer game of claim 11 wherein the game grid is explicit or implicit.
13. The computer game of claim 11 wherein the real-world location used to fix the game grid is automatically determined by the mobile device.
14. The computer game of claim 8, wherein the game engine communicates with a mobile device used to play the game in order to automatically fix a game grid associated with the game to an initial position.
15. The computer game of claim 14, wherein actual movement of the mobile device in the real world is tracked to indicate changes to location relative to the game grid.
16. The computer game of claim 14 wherein the initial position is an associated latitude and longitude position.
17. The computer game of claim 14 wherein detected changes in bearing and velocity characteristics associated with the mobile device are used to track changes to location relative to the game grid.
18. The computer game of claim 14 wherein the initial position is based upon an arbitrary reference point.
19. A computer-readable memory medium whose contents enable a computing device to present a mobile computer game, by performing a method comprising:
automatically establishing an initial game location associated with a real-world location, thereby affixing the game to a real-world environment;
invoking a first interaction with a simulated phenomenon, the first interaction having an associated first range that is limited based upon the real world environment; and
invoking a second interaction with the simulated phenomenon, the second interaction having an associated second range, wherein the first and second ranges are different.
20. The computer-readable medium of claim 19 wherein the computer-readable medium is at least a memory of the computing device or a data transmission medium transmitting to the computing device a generated data signal containing the contents.
US11/147,408 2002-05-13 2005-06-06 Simulated phenomena interaction game Abandoned US20070265089A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/147,408 US20070265089A1 (en) 2002-05-13 2005-06-06 Simulated phenomena interaction game

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US38055202P 2002-05-13 2002-05-13
US47039403P 2003-05-13 2003-05-13
US10/438,172 US20040002843A1 (en) 2002-05-13 2003-05-13 Method and system for interacting with simulated phenomena
US10/845,584 US20050009608A1 (en) 2002-05-13 2004-05-13 Commerce-enabled environment for interacting with simulated phenomena
US57743804P 2004-06-05 2004-06-05
US11/147,408 US20070265089A1 (en) 2002-05-13 2005-06-06 Simulated phenomena interaction game

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US10/438,172 Continuation-In-Part US20040002843A1 (en) 2002-05-13 2003-05-13 Method and system for interacting with simulated phenomena
US10/845,584 Continuation-In-Part US20050009608A1 (en) 2002-05-13 2004-05-13 Commerce-enabled environment for interacting with simulated phenomena

Publications (1)

Publication Number Publication Date
US20070265089A1 true US20070265089A1 (en) 2007-11-15

Family

ID=38685816

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/147,408 Abandoned US20070265089A1 (en) 2002-05-13 2005-06-06 Simulated phenomena interaction game

Country Status (1)

Country Link
US (1) US20070265089A1 (en)

Cited By (106)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060281065A1 (en) * 2005-06-14 2006-12-14 Margiotta Vince S Methods and systems for coordinating business processes into a competitive environment for training
US20070047517A1 (en) * 2005-08-29 2007-03-01 Hua Xu Method and apparatus for altering a media activity
US20070078009A1 (en) * 2005-10-03 2007-04-05 Airplay Network, Inc. Cellular phone games based upon television archives
US20070117576A1 (en) * 2005-07-14 2007-05-24 Huston Charles D GPS Based Friend Location and Identification System and Method
US20070277097A1 (en) * 2006-05-25 2007-11-29 Erik Frederick Hennum Apparatus, system, and method for context-aware authoring transform
US20080005172A1 (en) * 2006-06-30 2008-01-03 Robert Gutmann Dead reckoning in a gaming environment
US20080036653A1 (en) * 2005-07-14 2008-02-14 Huston Charles D GPS Based Friend Location and Identification System and Method
US20080198230A1 (en) * 2005-07-14 2008-08-21 Huston Charles D GPS Based Spectator and Participant Sport System and Method
US20080214303A1 (en) * 2004-12-02 2008-09-04 Tampereen Teknillinen Yliopisto Method, System and Computer Program Product For Producing, Offering and Executing Recreational Application Programs
US20080259096A1 (en) * 2005-07-14 2008-10-23 Huston Charles D GPS-Based Location and Messaging System and Method
US20080280676A1 (en) * 2007-05-07 2008-11-13 Samsung Electronics Co. Ltd. Wireless gaming method and wireless gaming-enabled mobile terminal
US20090036188A1 (en) * 2007-08-01 2009-02-05 Gelman Geoffrey M General gaming engine
US20090099983A1 (en) * 2006-05-19 2009-04-16 Drane Associates, L.P. System and method for authoring and learning
WO2009072010A1 (en) * 2007-12-07 2009-06-11 Sony Ericsson Mobile Communications Ab Dynamic gaming environment
US20090247285A1 (en) * 2006-06-13 2009-10-01 Gagner Mark B Location detection for portable wagering game machines
EP2127713A2 (en) * 2008-05-21 2009-12-02 Kabushiki Kaisha Bandai Game device
WO2009155483A1 (en) * 2008-06-20 2009-12-23 Invensys Systems, Inc. Systems and methods for immersive interaction with actual and/or simulated facilities for process, environmental and industrial control
US20100067451A1 (en) * 2008-09-16 2010-03-18 Hall Robert J Quality of service scheme for collision-based wireless networks
US20100199193A1 (en) * 2009-01-31 2010-08-05 International Business Machines Corporation Client-side simulated virtual universe environment
US20100302143A1 (en) * 2009-05-27 2010-12-02 Lucid Ventures, Inc. System and method for control of a simulated object that is associated with a physical location in the real world environment
US20100306825A1 (en) * 2009-05-27 2010-12-02 Lucid Ventures, Inc. System and method for facilitating user interaction with a simulated object associated with a physical location
US20100304804A1 (en) * 2009-05-27 2010-12-02 Lucid Ventures, Inc. System and method of simulated objects and applications thereof
US20110216060A1 (en) * 2010-03-05 2011-09-08 Sony Computer Entertainment America Llc Maintaining Multiple Views on a Shared Stable Virtual Space
US8023500B2 (en) 1996-08-20 2011-09-20 Invensys Systems, Inc. Methods for process control with change updates
US8028275B2 (en) 1999-05-17 2011-09-27 Invensys Systems, Inc. Control systems and methods with smart blocks
WO2011129907A1 (en) 2010-04-13 2011-10-20 Sony Computer Entertainment America Llc Calibration of portable devices in a shared virtual space
US20110319148A1 (en) * 2010-06-24 2011-12-29 Microsoft Corporation Virtual and location-based multiplayer gaming
US20110319164A1 (en) * 2008-10-08 2011-12-29 Hirokazu Matsushita Game control program, game device, and game control method adapted to control game where objects are moved in game field
US8090452B2 (en) 1999-06-11 2012-01-03 Invensys Systems, Inc. Methods and apparatus for control using control devices that provide a virtual machine environment and that communicate via an IP network
US8126987B2 (en) 2009-11-16 2012-02-28 Sony Computer Entertainment Inc. Mediation of content-related services
US8127060B2 (en) 2009-05-29 2012-02-28 Invensys Systems, Inc Methods and apparatus for control configuration with control objects that are fieldbus protocol-aware
WO2012026936A1 (en) * 2010-08-26 2012-03-01 Sony Ericsson Mobile Communications Ab A game engine module and method for playing an electronic game using location information
US20120157210A1 (en) * 2010-12-15 2012-06-21 At&T Intellectual Property I Lp Geogame for mobile device
US20120231886A1 (en) * 2009-11-20 2012-09-13 Wms Gaming Inc. Integrating wagering games and environmental conditions
US20120272158A1 (en) * 2008-11-15 2012-10-25 Adobe Systems Incorporated Method and device for identifying devices which can be targeted for the purpose of establishing a communication session
US8368640B2 (en) 1999-05-17 2013-02-05 Invensys Systems, Inc. Process control configuration system with connection validation and configuration
WO2013043214A1 (en) * 2011-09-22 2013-03-28 Jonathan Peterson Methods and apparatus to associate a detected presence of a conductive object
US8433759B2 (en) 2010-05-24 2013-04-30 Sony Computer Entertainment America Llc Direction-conscious information sharing
US8463964B2 (en) 2009-05-29 2013-06-11 Invensys Systems, Inc. Methods and apparatus for control configuration with enhanced change-tracking
US8500031B2 (en) 2010-07-29 2013-08-06 Bank Of America Corporation Wearable article having point of sale payment functionality
US8577718B2 (en) 2010-11-04 2013-11-05 Dw Associates, Llc Methods and systems for identifying, quantifying, analyzing, and optimizing the level of engagement of components within a defined ecosystem or context
US8589488B2 (en) 2005-07-14 2013-11-19 Charles D. Huston System and method for creating content for an event using a social network
US20140011585A1 (en) * 2012-07-03 2014-01-09 Kabushiki Kaisha Square Enix (Also Trading As Square Enix Co., Ltd.) Game apparatus
US8702506B2 (en) 2005-11-30 2014-04-22 At&T Intellectual Property I, L.P. Geogame for mobile device
US8712056B2 (en) 2010-06-03 2014-04-29 At&T Intellectual Property I, L.P. Secure mobile ad hoc network
US20140155145A1 (en) * 2006-06-02 2014-06-05 Wms Gaming Inc. Handheld wagering game system and methods for conducting wagering games thereupon
US8751159B2 (en) 2009-11-04 2014-06-10 At&T Intellectual Property I, L.P. Augmented reality gaming via geographic messaging
US8773467B2 (en) 2011-06-13 2014-07-08 International Business Machines Corporation Enhanced asset management and planning system
US8777752B2 (en) 2005-11-30 2014-07-15 At&T Intellectual Property I, L.P. Geogame for mobile device
US8821293B2 (en) 2007-08-17 2014-09-02 At&T Intellectual Property I, L.P. Location-based mobile gaming application and method for implementing the same using a scalable tiered geocast protocol
US8952796B1 (en) 2011-06-28 2015-02-10 Dw Associates, Llc Enactive perception device
US8966557B2 (en) 2001-01-22 2015-02-24 Sony Computer Entertainment Inc. Delivery of digital content
US8996359B2 (en) 2011-05-18 2015-03-31 Dw Associates, Llc Taxonomy and application of language analysis and processing
US9020807B2 (en) 2012-01-18 2015-04-28 Dw Associates, Llc Format for displaying text analytics results
US9071451B2 (en) 2012-07-31 2015-06-30 At&T Intellectual Property I, L.P. Geocast-based situation awareness
US9161158B2 (en) 2011-06-27 2015-10-13 At&T Intellectual Property I, L.P. Information acquisition using a scalable wireless geocast protocol
US9177307B2 (en) * 2010-07-29 2015-11-03 Bank Of America Corporation Wearable financial indicator
US20150339952A1 (en) * 2014-05-24 2015-11-26 Nirit Glazer Method and system for using location services to teach concepts
US9210589B2 (en) 2012-10-09 2015-12-08 At&T Intellectual Property I, L.P. Geocast protocol for wireless sensor network
US9250703B2 (en) 2006-03-06 2016-02-02 Sony Computer Entertainment Inc. Interface with gaze detection and voice input
US9269353B1 (en) 2011-12-07 2016-02-23 Manu Rehani Methods and systems for measuring semantics in communications
US9319842B2 (en) 2011-06-27 2016-04-19 At&T Intellectual Property I, L.P. Mobile device configured point and shoot type weapon
US9344842B2 (en) 2005-07-14 2016-05-17 Charles D. Huston System and method for viewing golf using virtual reality
US20160250553A1 (en) * 2013-10-17 2016-09-01 Sony Computer Entertainment Inc. Game System, Game Controlling Method, and Game Controlling Program
US9457272B2 (en) 2006-04-12 2016-10-04 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US9483405B2 (en) 2007-09-20 2016-11-01 Sony Interactive Entertainment Inc. Simplified run-time program translation for emulating complex processor pipelines
US9495870B2 (en) 2011-10-20 2016-11-15 At&T Intellectual Property I, L.P. Vehicular communications using a scalable ad hoc geographic routing protocol
US9498724B2 (en) 2006-01-10 2016-11-22 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US9501904B2 (en) 2006-01-10 2016-11-22 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US9504922B2 (en) 2004-06-28 2016-11-29 Winview, Inc. Methods and apparatus for distributed gaming over a mobile device
US9526991B2 (en) 2004-06-28 2016-12-27 Winview, Inc. Methods and apparatus for distributed gaming over a mobile device
US9660745B2 (en) 2012-12-12 2017-05-23 At&T Intellectual Property I, L.P. Geocast-based file transfer
US9667513B1 (en) 2012-01-24 2017-05-30 Dw Associates, Llc Real-time autonomous organization
US9669293B1 (en) * 2012-07-31 2017-06-06 Niantic, Inc. Game data validation
US9672692B2 (en) 2006-04-12 2017-06-06 Winview, Inc. Synchronized gaming and programming
US9788329B2 (en) 2005-11-01 2017-10-10 At&T Intellectual Property Ii, L.P. Non-interference technique for spatially aware mobile ad hoc networking
US9919210B2 (en) 2005-10-03 2018-03-20 Winview, Inc. Synchronized gaming and programming
US10016684B2 (en) 2010-10-28 2018-07-10 At&T Intellectual Property I, L.P. Secure geographic based gaming
US10075893B2 (en) 2011-12-15 2018-09-11 At&T Intellectual Property I, L.P. Media distribution via a scalable ad hoc geographic protocol
US10120438B2 (en) 2011-05-25 2018-11-06 Sony Interactive Entertainment Inc. Eye gaze to alter device behavior
US10127735B2 (en) 2012-05-01 2018-11-13 Augmented Reality Holdings 2, Llc System, method and apparatus of eye tracking or gaze detection applications including facilitating action on or interaction with a simulated object
US10165339B2 (en) 2005-06-20 2018-12-25 Winview, Inc. Method of and system for managing client resources and assets for activities on computing devices
US20190022530A1 (en) * 2017-07-22 2019-01-24 Niantic, Inc. Validating a player's real-world location using activity within a parallel reality game
US10226698B1 (en) 2004-07-14 2019-03-12 Winview, Inc. Game of skill played by remote participants utilizing wireless devices in connection with a common game event
US10510214B2 (en) * 2005-07-08 2019-12-17 Cfph, Llc System and method for peer-to-peer wireless gaming
US10556183B2 (en) 2006-01-10 2020-02-11 Winview, Inc. Method of and system for conducting multiple contest of skill with a single performance
US10721543B2 (en) 2005-06-20 2020-07-21 Winview, Inc. Method of and system for managing client resources and assets for activities on computing devices
US10719123B2 (en) 2014-07-15 2020-07-21 Nant Holdings Ip, Llc Multiparty object recognition
CN111773658A (en) * 2020-07-03 2020-10-16 珠海金山网络游戏科技有限公司 Game interaction method and device based on computer vision library
US10835809B2 (en) * 2017-08-26 2020-11-17 Kristina Contreras Auditorium efficient tracking in auditory augmented reality
US10933317B2 (en) * 2019-03-15 2021-03-02 Sony Interactive Entertainment LLC. Near real-time augmented reality video gaming system
US10958985B1 (en) 2008-11-10 2021-03-23 Winview, Inc. Interactive advertising system
US11017628B2 (en) 2006-10-26 2021-05-25 Interactive Games Llc System and method for wireless gaming with location determination
US11017630B2 (en) 2012-02-28 2021-05-25 Cfph, Llc Gaming through mobile or other devices
US11069185B2 (en) 2005-07-08 2021-07-20 Interactive Games Llc System and method for wireless gaming system with user profiles
US11082746B2 (en) 2006-04-12 2021-08-03 Winview, Inc. Synchronized gaming and programming
US11182462B2 (en) 2006-11-15 2021-11-23 Cfph, Llc Biometric access sensitivity
US11195233B1 (en) 2014-06-12 2021-12-07 Allstate Insurance Company Virtual simulation for insurance
US11216887B1 (en) * 2014-06-12 2022-01-04 Allstate Insurance Company Virtual simulation for insurance
US11238439B1 (en) 2016-01-07 2022-02-01 Worldpay, Llc Point of interaction device emulation for payment transaction simulation
US11308765B2 (en) 2018-10-08 2022-04-19 Winview, Inc. Method and systems for reducing risk in setting odds for single fixed in-play propositions utilizing real time input
CN114629761A (en) * 2022-03-22 2022-06-14 吉林省广播电视研究所(吉林省广播电视局科技信息中心) Frequency modulation signal anti-triangulation frequency measurement demodulation method
US20220203232A1 (en) * 2020-12-30 2022-06-30 Sony Interactive Entertainment Inc. Helper mode in spectated video games
US11392636B2 (en) 2013-10-17 2022-07-19 Nant Holdings Ip, Llc Augmented reality position-based service, methods, and systems
US11551529B2 (en) 2016-07-20 2023-01-10 Winview, Inc. Method of generating separate contests of skill or chance from two independent events
US11854153B2 (en) 2011-04-08 2023-12-26 Nant Holdings Ip, Llc Interference based augmented reality hosting platforms

Citations (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4640812A (en) * 1984-06-11 1987-02-03 General Electric Company Nuclear system test simulator
US4807202A (en) * 1986-04-17 1989-02-21 Allan Cherri Visual environment simulator for mobile viewer
US5064376A (en) * 1983-04-01 1991-11-12 Unisys Corporation Portable compact simulated target motion generating system
US5120057A (en) * 1990-01-26 1992-06-09 Konami Co., Ltd. Hand held video game with simulated battle against aliens
US5381338A (en) * 1991-06-21 1995-01-10 Wysocki; David A. Real time three dimensional geo-referenced digital orthophotograph-based positioning, navigation, collision avoidance and decision support system
US5596405A (en) * 1995-10-03 1997-01-21 The United States Of America As Represented By The Secretary Of The Navy Method of and apparatus for the continuous emissions monitoring of toxic airborne metals
US5679075A (en) * 1995-11-06 1997-10-21 Beanstalk Entertainment Enterprises Interactive multi-media game system and method
US5688124A (en) * 1994-03-04 1997-11-18 Buck Werke Gmbh & Co. Method for simulating weapons fire, and high-angle trajectory weapons fire simulator
US5794128A (en) * 1995-09-20 1998-08-11 The United States Of America As Represented By The Secretary Of The Army Apparatus and processes for realistic simulation of wireless information transport systems
US5807113A (en) * 1996-04-22 1998-09-15 The United States Of America As Represented By The Secretary Of The Army Method and apparatus for training in the detection of nuclear, biological and chemical (NBC) contamination
US5942969A (en) * 1997-01-23 1999-08-24 Sony Corporation Treasure hunt game using pager and paging system
US6023241A (en) * 1998-11-13 2000-02-08 Intel Corporation Digital multimedia navigation player/recorder
US6080063A (en) * 1997-01-06 2000-06-27 Khosla; Vinod Simulated real time game play with live event
US6149435A (en) * 1997-12-26 2000-11-21 Electronics And Telecommunications Research Institute Simulation method of a radio-controlled model airplane and its system
US6177905B1 (en) * 1998-12-08 2001-01-23 Avaya Technology Corp. Location-triggered reminder for mobile user devices
US6227966B1 (en) * 1997-02-19 2001-05-08 Kabushiki Kaisha Bandai Simulation device for fostering a virtual creature
US6227974B1 (en) * 1997-06-27 2001-05-08 Nds Limited Interactive game system
US6287200B1 (en) * 1999-12-15 2001-09-11 Nokia Corporation Relative positioning and virtual objects for mobile devices
US6320495B1 (en) * 2000-03-24 2001-11-20 Peter Sporgis Treasure hunt game utilizing GPS equipped wireless communications devices
US20020010734A1 (en) * 2000-02-03 2002-01-24 Ebersole John Franklin Internetworked augmented reality system and method
US20020024675A1 (en) * 2000-01-28 2002-02-28 Eric Foxlin Self-referenced tracking
US20020049074A1 (en) * 2000-07-20 2002-04-25 Alcatel Method of making a game available for a mobile telephony terminal of a subscriber and program modules and means therefor
US20020090985A1 (en) * 2000-09-07 2002-07-11 Ilan Tochner Coexistent interaction between a virtual character and the real world
US20020188760A1 (en) * 2001-05-10 2002-12-12 Toru Kuwahara Information processing system that seamlessly connects real world and virtual world
US20020191017A1 (en) * 1999-09-24 2002-12-19 Sinclair Matthew Frazer Wireless system for interacting with a game service
US6500008B1 (en) * 1999-03-15 2002-12-31 Information Decision Technologies, Llc Augmented reality-based firefighter training system and method
US20030036428A1 (en) * 2001-08-20 2003-02-20 Christian Aasland Method and apparatus for implementing multiplayer PDA games
US6527641B1 (en) * 1999-09-24 2003-03-04 Nokia Corporation System for profiling mobile station activity in a predictive command wireless game system
US20030055984A1 (en) * 2001-05-18 2003-03-20 Sony Computer Entertainment Inc. Entertainment system
US20030052454A1 (en) * 2001-07-13 2003-03-20 Leen Fergus A. System and method for establishing a wager for a gaming application
US6545682B1 (en) * 2000-05-24 2003-04-08 There, Inc. Method and apparatus for creating and customizing avatars using genetic paradigm
US20030144047A1 (en) * 2002-01-31 2003-07-31 Peter Sprogis Treasure hunt game utilizing wireless communications devices and location positioning technology
US6607038B2 (en) * 2000-03-15 2003-08-19 Information Decision Technologies, Llc Instrumented firefighter's nozzle and method
US20030177187A1 (en) * 2000-11-27 2003-09-18 Butterfly.Net. Inc. Computing grid for massively multi-player online games and other multi-user immersive persistent-state and session-based applications
US20030190956A1 (en) * 2002-04-09 2003-10-09 Jan Vancraeynest Wireless gaming system using standard cellular telephones
US20030224855A1 (en) * 2002-05-31 2003-12-04 Robert Cunningham Optimizing location-based mobile gaming applications
US20040176082A1 (en) * 2002-02-07 2004-09-09 Cliff David Trevor Wireless communication systems
US6822648B2 (en) * 2001-04-17 2004-11-23 Information Decision Technologies, Llc Method for occlusion of movable objects and people in augmented reality scenes
US7073129B1 (en) * 1998-12-18 2006-07-04 Tangis Corporation Automated selection of appropriate information based on a computer user's context
US7110013B2 (en) * 2000-03-15 2006-09-19 Information Decision Technology Augmented reality display integrated with self-contained breathing apparatus

Cited By (264)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8023500B2 (en) 1996-08-20 2011-09-20 Invensys Systems, Inc. Methods for process control with change updates
US8368640B2 (en) 1999-05-17 2013-02-05 Invensys Systems, Inc. Process control configuration system with connection validation and configuration
US8225271B2 (en) 1999-05-17 2012-07-17 Invensys Systems, Inc. Apparatus for control systems with objects that are associated with live data
US8229579B2 (en) 1999-05-17 2012-07-24 Invensys Systems, Inc. Control systems and methods with versioning
US8028275B2 (en) 1999-05-17 2011-09-27 Invensys Systems, Inc. Control systems and methods with smart blocks
US8028272B2 (en) 1999-05-17 2011-09-27 Invensys Systems, Inc. Control system configurator and methods with edit selection
US8090452B2 (en) 1999-06-11 2012-01-03 Invensys Systems, Inc. Methods and apparatus for control using control devices that provide a virtual machine environment and that communicate via an IP network
US8966557B2 (en) 2001-01-22 2015-02-24 Sony Computer Entertainment Inc. Delivery of digital content
US9908053B2 (en) 2004-06-28 2018-03-06 Winview, Inc. Methods and apparatus for distributed gaming over a mobile device
US10232270B2 (en) 2004-06-28 2019-03-19 Winview, Inc. Methods and apparatus for distributed gaming over a mobile device
US10828571B2 (en) 2004-06-28 2020-11-10 Winview, Inc. Methods and apparatus for distributed gaming over a mobile device
US11654368B2 (en) 2004-06-28 2023-05-23 Winview, Inc. Methods and apparatus for distributed gaming over a mobile device
US9504922B2 (en) 2004-06-28 2016-11-29 Winview, Inc. Methods and apparatus for distributed gaming over a mobile device
US10709987B2 (en) 2004-06-28 2020-07-14 Winview, Inc. Methods and apparatus for distributed gaming over a mobile device
US11400379B2 (en) 2004-06-28 2022-08-02 Winview, Inc. Methods and apparatus for distributed gaming over a mobile device
US10226705B2 (en) 2004-06-28 2019-03-12 Winview, Inc. Methods and apparatus for distributed gaming over a mobile device
US9821233B2 (en) 2004-06-28 2017-11-21 Winview, Inc. Methods and apparatus for distributed gaming over a mobile device
US9526991B2 (en) 2004-06-28 2016-12-27 Winview, Inc. Methods and apparatus for distributed gaming over a mobile device
US10226698B1 (en) 2004-07-14 2019-03-12 Winview, Inc. Game of skill played by remote participants utilizing wireless devices in connection with a common game event
US10933319B2 (en) 2004-07-14 2021-03-02 Winview, Inc. Game of skill played by remote participants utilizing wireless devices in connection with a common game event
US11786813B2 (en) 2004-07-14 2023-10-17 Winview, Inc. Game of skill played by remote participants utilizing wireless devices in connection with a common game event
US20080214303A1 (en) * 2004-12-02 2008-09-04 Tampereen Teknillinen Yliopisto Method, System and Computer Program Product For Producing, Offering and Executing Recreational Application Programs
US20060281065A1 (en) * 2005-06-14 2006-12-14 Margiotta Vince S Methods and systems for coordinating business processes into a competitive environment for training
US8047848B2 (en) * 2005-06-14 2011-11-01 Vince Scott Margiotta Method and system for providing incentives in a business environment
US10165339B2 (en) 2005-06-20 2018-12-25 Winview, Inc. Method of and system for managing client resources and assets for activities on computing devices
US10721543B2 (en) 2005-06-20 2020-07-21 Winview, Inc. Method of and system for managing client resources and assets for activities on computing devices
US11451883B2 (en) 2005-06-20 2022-09-20 Winview, Inc. Method of and system for managing client resources and assets for activities on computing devices
US11069185B2 (en) 2005-07-08 2021-07-20 Interactive Games Llc System and method for wireless gaming system with user profiles
US10510214B2 (en) * 2005-07-08 2019-12-17 Cfph, Llc System and method for peer-to-peer wireless gaming
US11348410B2 (en) * 2005-07-08 2022-05-31 Cfph, Llc System and method for peer-to-peer wireless gaming
US20220284775A1 (en) * 2005-07-08 2022-09-08 Cfph, Llc System and method for peer-to-peer wireless gaming
US10512832B2 (en) 2005-07-14 2019-12-24 Charles D. Huston System and method for a golf event using artificial reality
US8249626B2 (en) 2005-07-14 2012-08-21 Huston Charles D GPS based friend location and identification system and method
US20080036653A1 (en) * 2005-07-14 2008-02-14 Huston Charles D GPS Based Friend Location and Identification System and Method
US10802153B2 (en) 2005-07-14 2020-10-13 Charles D. Huston GPS based participant identification system and method
US8842003B2 (en) 2005-07-14 2014-09-23 Charles D. Huston GPS-based location and messaging system and method
US11087345B2 (en) 2005-07-14 2021-08-10 Charles D. Huston System and method for creating content for an event using a social network
US8589488B2 (en) 2005-07-14 2013-11-19 Charles D. Huston System and method for creating content for an event using a social network
US9566494B2 (en) 2005-07-14 2017-02-14 Charles D. Huston System and method for creating and sharing an event using a social network
US8933967B2 (en) 2005-07-14 2015-01-13 Charles D. Huston System and method for creating and sharing an event using a social network
US8417261B2 (en) 2005-07-14 2013-04-09 Charles D. Huston GPS based friend location and identification system and method
US9344842B2 (en) 2005-07-14 2016-05-17 Charles D. Huston System and method for viewing golf using virtual reality
US8207843B2 (en) 2005-07-14 2012-06-26 Huston Charles D GPS-based location and messaging system and method
US20080198230A1 (en) * 2005-07-14 2008-08-21 Huston Charles D GPS Based Spectator and Participant Sport System and Method
US9445225B2 (en) * 2005-07-14 2016-09-13 Huston Family Trust GPS based spectator and participant sport system and method
US9498694B2 (en) 2005-07-14 2016-11-22 Charles D. Huston System and method for creating content for an event using a social network
US9798012B2 (en) 2005-07-14 2017-10-24 Charles D. Huston GPS based participant identification system and method
US20070117576A1 (en) * 2005-07-14 2007-05-24 Huston Charles D GPS Based Friend Location and Identification System and Method
US8275397B2 (en) 2005-07-14 2012-09-25 Huston Charles D GPS based friend location and identification system and method
US20080259096A1 (en) * 2005-07-14 2008-10-23 Huston Charles D GPS-Based Location and Messaging System and Method
US20070047517A1 (en) * 2005-08-29 2007-03-01 Hua Xu Method and apparatus for altering a media activity
US9511287B2 (en) * 2005-10-03 2016-12-06 Winview, Inc. Cellular phone games based upon television archives
US20070078009A1 (en) * 2005-10-03 2007-04-05 Airplay Network, Inc. Cellular phone games based upon television archives
US11154775B2 (en) 2005-10-03 2021-10-26 Winview, Inc. Synchronized gaming and programming
US11148050B2 (en) * 2005-10-03 2021-10-19 Winview, Inc. Cellular phone games based upon television archives
US10137369B2 (en) 2005-10-03 2018-11-27 Winview, Inc. Cellular phone games based upon television archives
US9919210B2 (en) 2005-10-03 2018-03-20 Winview, Inc. Synchronized gaming and programming
US10653955B2 (en) 2005-10-03 2020-05-19 Winview, Inc. Synchronized gaming and programming
US9788329B2 (en) 2005-11-01 2017-10-10 At&T Intellectual Property Ii, L.P. Non-interference technique for spatially aware mobile ad hoc networking
US8702506B2 (en) 2005-11-30 2014-04-22 At&T Intellectual Property I, L.P. Geogame for mobile device
US8777752B2 (en) 2005-11-30 2014-07-15 At&T Intellectual Property I, L.P. Geogame for mobile device
US11266896B2 (en) 2006-01-10 2022-03-08 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US11298621B2 (en) 2006-01-10 2022-04-12 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US11951402B2 (en) 2006-01-10 2024-04-09 Winview Ip Holdings, Llc Method of and system for conducting multiple contests of skill with a single performance
US9919221B2 (en) 2006-01-10 2018-03-20 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US9652937B2 (en) 2006-01-10 2017-05-16 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US9978217B2 (en) 2006-01-10 2018-05-22 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US10186116B2 (en) 2006-01-10 2019-01-22 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US11358064B2 (en) 2006-01-10 2022-06-14 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US11338189B2 (en) 2006-01-10 2022-05-24 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US11918880B2 (en) 2006-01-10 2024-03-05 Winview Ip Holdings, Llc Method of and system for conducting multiple contests of skill with a single performance
US10343071B2 (en) 2006-01-10 2019-07-09 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US9501904B2 (en) 2006-01-10 2016-11-22 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US9498724B2 (en) 2006-01-10 2016-11-22 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US10758809B2 (en) 2006-01-10 2020-09-01 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US10410474B2 (en) 2006-01-10 2019-09-10 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US10556183B2 (en) 2006-01-10 2020-02-11 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US10744414B2 (en) 2006-01-10 2020-08-18 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US10806988B2 (en) 2006-01-10 2020-10-20 Winview, Inc. Method of and system for conducting multiple contests of skill with a single performance
US9250703B2 (en) 2006-03-06 2016-02-02 Sony Computer Entertainment Inc. Interface with gaze detection and voice input
US9687739B2 (en) 2006-04-12 2017-06-27 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US9744453B2 (en) 2006-04-12 2017-08-29 Winview, Inc. Methodology for equalizing systemic latencies in reception in connection with games of skill played in connection with an online broadcast
US11007434B2 (en) 2006-04-12 2021-05-18 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US9999834B2 (en) 2006-04-12 2018-06-19 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US9993730B2 (en) 2006-04-12 2018-06-12 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US11917254B2 (en) 2006-04-12 2024-02-27 Winview Ip Holdings, Llc Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US11889157B2 (en) 2006-04-12 2024-01-30 Winview Ip Holdings, Llc Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US11825168B2 (en) 2006-04-12 2023-11-21 Winview Ip Holdings, Llc Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US9662577B2 (en) 2006-04-12 2017-05-30 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US10150031B2 (en) 2006-04-12 2018-12-11 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US11736771B2 (en) 2006-04-12 2023-08-22 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US11722743B2 (en) 2006-04-12 2023-08-08 Winview, Inc. Synchronized gaming and programming
US11716515B2 (en) 2006-04-12 2023-08-01 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US11678020B2 (en) 2006-04-12 2023-06-13 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US9919211B2 (en) 2006-04-12 2018-03-20 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US10195526B2 (en) 2006-04-12 2019-02-05 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US10279253B2 (en) 2006-04-12 2019-05-07 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US11082746B2 (en) 2006-04-12 2021-08-03 Winview, Inc. Synchronized gaming and programming
US9901820B2 (en) 2006-04-12 2018-02-27 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US9878243B2 (en) 2006-04-12 2018-01-30 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US10874942B2 (en) 2006-04-12 2020-12-29 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US10363483B2 (en) 2006-04-12 2019-07-30 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US9604140B2 (en) 2006-04-12 2017-03-28 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US9662576B2 (en) 2006-04-12 2017-05-30 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US9724603B2 (en) 2006-04-12 2017-08-08 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US9457272B2 (en) 2006-04-12 2016-10-04 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US9707482B2 (en) 2006-04-12 2017-07-18 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US10556177B2 (en) 2006-04-12 2020-02-11 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US11077366B2 (en) 2006-04-12 2021-08-03 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US9672692B2 (en) 2006-04-12 2017-06-06 Winview, Inc. Synchronized gaming and programming
US10576371B2 (en) 2006-04-12 2020-03-03 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US11235237B2 (en) 2006-04-12 2022-02-01 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US11185770B2 (en) 2006-04-12 2021-11-30 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US11179632B2 (en) 2006-04-12 2021-11-23 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US9687738B2 (en) 2006-04-12 2017-06-27 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US10695672B2 (en) 2006-04-12 2020-06-30 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US10052557B2 (en) 2006-04-12 2018-08-21 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US11083965B2 (en) 2006-04-12 2021-08-10 Winview, Inc. Methodology for equalizing systemic latencies in television reception in connection with games of skill played in connection with live television programming
US20090099983A1 (en) * 2006-05-19 2009-04-16 Drane Associates, L.P. System and method for authoring and learning
US20070277097A1 (en) * 2006-05-25 2007-11-29 Erik Frederick Hennum Apparatus, system, and method for context-aware authoring transform
US20220036694A1 (en) * 2006-06-02 2022-02-03 Sg Gaming, Inc. Handheld wagering game system and methods for conducting wagering games thereupon
US20140155145A1 (en) * 2006-06-02 2014-06-05 Wms Gaming Inc. Handheld wagering game system and methods for conducting wagering games thereupon
US9412228B2 (en) * 2006-06-02 2016-08-09 Bally Gaming, Inc. Handheld wagering game system and methods for conducting wagering games thereupon
US10068418B2 (en) 2006-06-02 2018-09-04 Bally Gaming, Inc. Handheld wagering game system and methods for conducting wagering games thereupon
US20090247285A1 (en) * 2006-06-13 2009-10-01 Gagner Mark B Location detection for portable wagering game machines
US9550112B2 (en) 2006-06-30 2017-01-24 Sony Interactive Entertainment America Llc Dead reckoning in a gaming environment
US8142289B2 (en) * 2006-06-30 2012-03-27 Sony Computer Entertainment America Llc Dead reckoning in a gaming environment
US20080005172A1 (en) * 2006-06-30 2008-01-03 Robert Gutmann Dead reckoning in a gaming environment
US20120192015A1 (en) * 2006-06-30 2012-07-26 Robert Gutmann Dead reckoning in a gaming environment
US8734258B2 (en) * 2006-06-30 2014-05-27 Sony Computer Entertainment America Llc Dead reckoning in a gaming environment
US11017628B2 (en) 2006-10-26 2021-05-25 Interactive Games Llc System and method for wireless gaming with location determination
US11182462B2 (en) 2006-11-15 2021-11-23 Cfph, Llc Biometric access sensitivity
US20080280676A1 (en) * 2007-05-07 2008-11-13 Samsung Electronics Co. Ltd. Wireless gaming method and wireless gaming-enabled mobile terminal
US8506404B2 (en) * 2007-05-07 2013-08-13 Samsung Electronics Co., Ltd. Wireless gaming method and wireless gaming-enabled mobile terminal
US20090036188A1 (en) * 2007-08-01 2009-02-05 Gelman Geoffrey M General gaming engine
US20230290219A1 (en) * 2007-08-01 2023-09-14 Cfph, Llc General gaming engine
US10984631B2 (en) 2007-08-01 2021-04-20 Cfph, Llc General gaming engine
US8632407B2 (en) * 2007-08-01 2014-01-21 Cfph, Llc General gaming engine
US10297112B2 (en) * 2007-08-01 2019-05-21 Cfph, Llc General gaming engine
US9875617B2 (en) 2007-08-01 2018-01-23 Cfph, Llc General gaming engine
US11657678B2 (en) 2007-08-01 2023-05-23 Cfph, Llc General gaming engine
US9895604B2 (en) 2007-08-17 2018-02-20 At&T Intellectual Property I, L.P. Location-based mobile gaming application and method for implementing the same using a scalable tiered geocast protocol
US8821293B2 (en) 2007-08-17 2014-09-02 At&T Intellectual Property I, L.P. Location-based mobile gaming application and method for implementing the same using a scalable tiered geocast protocol
US9483405B2 (en) 2007-09-20 2016-11-01 Sony Interactive Entertainment Inc. Simplified run-time program translation for emulating complex processor pipelines
WO2009072010A1 (en) * 2007-12-07 2009-06-11 Sony Ericsson Mobile Communications Ab Dynamic gaming environment
US20090149250A1 (en) * 2007-12-07 2009-06-11 Sony Ericsson Mobile Communications Ab Dynamic gaming environment
US10143925B2 (en) 2007-12-07 2018-12-04 Sony Mobile Communications Inc. Dynamic gaming environment
CN101896237A (en) * 2007-12-07 2010-11-24 Sony Ericsson Mobile Communications AB Dynamic gaming environment
EP2127713A3 (en) * 2008-05-21 2009-12-30 Kabushiki Kaisha Bandai Game device
EP2127713A2 (en) * 2008-05-21 2009-12-02 Kabushiki Kaisha Bandai Game device
CN103055508A (en) * 2008-05-21 2013-04-24 万代股份有限公司 Game device
US20090319058A1 (en) * 2008-06-20 2009-12-24 Invensys Systems, Inc. Systems and methods for immersive interaction with actual and/or simulated facilities for process, environmental and industrial control
WO2009155483A1 (en) * 2008-06-20 2009-12-23 Invensys Systems, Inc. Systems and methods for immersive interaction with actual and/or simulated facilities for process, environmental and industrial control
CN104407518A (en) * 2008-06-20 2015-03-11 因文西斯系统公司 Systems and methods for immersive interaction with actual and/or simulated facilities for process, environmental and industrial control
US8594814B2 (en) * 2008-06-20 2013-11-26 Invensys Systems, Inc. Systems and methods for immersive interaction with actual and/or simulated facilities for process, environmental and industrial control
US20100067451A1 (en) * 2008-09-16 2010-03-18 Hall Robert J Quality of service scheme for collision-based wireless networks
US9544922B2 (en) 2008-09-16 2017-01-10 At&T Intellectual Property I, L.P. Quality of service scheme for collision-based wireless networks
US9138649B2 (en) * 2008-10-08 2015-09-22 Sony Corporation Game control program, game device, and game control method adapted to control game where objects are moved in game field
US20110319164A1 (en) * 2008-10-08 2011-12-29 Hirokazu Matsushita Game control program, game device, and game control method adapted to control game where objects are moved in game field
US11601727B2 (en) 2008-11-10 2023-03-07 Winview, Inc. Interactive advertising system
US10958985B1 (en) 2008-11-10 2021-03-23 Winview, Inc. Interactive advertising system
US9923974B2 (en) * 2008-11-15 2018-03-20 Adobe Systems Incorporated Method and device for identifying devices which can be targeted for the purpose of establishing a communication session
US20120272158A1 (en) * 2008-11-15 2012-10-25 Adobe Systems Incorporated Method and device for identifying devices which can be targeted for the purpose of establishing a communication session
US9600306B2 (en) * 2009-01-31 2017-03-21 International Business Machines Corporation Client-side simulated virtual universe environment
US20100199193A1 (en) * 2009-01-31 2010-08-05 International Business Machines Corporation Client-side simulated virtual universe environment
US10855683B2 (en) 2009-05-27 2020-12-01 Samsung Electronics Co., Ltd. System and method for facilitating user interaction with a simulated object associated with a physical location
US20100302143A1 (en) * 2009-05-27 2010-12-02 Lucid Ventures, Inc. System and method for control of a simulated object that is associated with a physical location in the real world environment
US20100304804A1 (en) * 2009-05-27 2010-12-02 Lucid Ventures, Inc. System and method of simulated objects and applications thereof
US20100306825A1 (en) * 2009-05-27 2010-12-02 Lucid Ventures, Inc. System and method for facilitating user interaction with a simulated object associated with a physical location
US8303387B2 (en) 2009-05-27 2012-11-06 Zambala Lllp System and method of simulated objects and applications thereof
US11765175B2 (en) 2009-05-27 2023-09-19 Samsung Electronics Co., Ltd. System and method for facilitating user interaction with a simulated object associated with a physical location
US8745494B2 (en) 2009-05-27 2014-06-03 Zambala Lllp System and method for control of a simulated object that is associated with a physical location in the real world environment
US8127060B2 (en) 2009-05-29 2012-02-28 Invensys Systems, Inc Methods and apparatus for control configuration with control objects that are fieldbus protocol-aware
US8463964B2 (en) 2009-05-29 2013-06-11 Invensys Systems, Inc. Methods and apparatus for control configuration with enhanced change-tracking
US9656165B2 (en) 2009-11-04 2017-05-23 At&T Intellectual Property I, L.P. Campus alerting via wireless geocast
US8868027B2 (en) 2009-11-04 2014-10-21 At&T Intellectual Property I, L.P. Campus alerting via wireless geocast
US8751159B2 (en) 2009-11-04 2014-06-10 At&T Intellectual Property I, L.P. Augmented reality gaming via geographic messaging
US9802120B2 (en) 2009-11-04 2017-10-31 At&T Intellectual Property I, L.P. Geographic advertising using a scalable wireless geocast protocol
US9118428B2 (en) 2009-11-04 2015-08-25 At&T Intellectual Property I, L.P. Geographic advertising using a scalable wireless geocast protocol
US9266025B2 (en) 2009-11-04 2016-02-23 At&T Intellectual Property I, L.P. Augmented reality gaming via geographic messaging
US9675882B2 (en) 2009-11-04 2017-06-13 At&T Intellectual Property I, L.P. Augmented reality gaming via geographic messaging
US8126987B2 (en) 2009-11-16 2012-02-28 Sony Computer Entertainment Inc. Mediation of content-related services
US8968092B2 (en) * 2009-11-20 2015-03-03 Wms Gaming, Inc. Integrating wagering games and environmental conditions
US20120231886A1 (en) * 2009-11-20 2012-09-13 Wms Gaming Inc. Integrating wagering games and environmental conditions
US9513700B2 (en) 2009-12-24 2016-12-06 Sony Interactive Entertainment America Llc Calibration of portable devices in a shared virtual space
US10535153B2 (en) 2009-12-24 2020-01-14 Sony Interactive Entertainment America Llc Tracking position of device inside-out for virtual reality interactivity
US8537113B2 (en) 2010-03-05 2013-09-17 Sony Computer Entertainment America Llc Calibration of portable devices in a shared virtual space
US20110216002A1 (en) * 2010-03-05 2011-09-08 Sony Computer Entertainment America Llc Calibration of Portable Devices in a Shared Virtual Space
US8730156B2 (en) 2010-03-05 2014-05-20 Sony Computer Entertainment America Llc Maintaining multiple views on a shared stable virtual space
US20110216060A1 (en) * 2010-03-05 2011-09-08 Sony Computer Entertainment America Llc Maintaining Multiple Views on a Shared Stable Virtual Space
US9310883B2 (en) 2010-03-05 2016-04-12 Sony Computer Entertainment America Llc Maintaining multiple views on a shared stable virtual space
WO2011129907A1 (en) 2010-04-13 2011-10-20 Sony Computer Entertainment America Llc Calibration of portable devices in a shared virtual space
US8433759B2 (en) 2010-05-24 2013-04-30 Sony Computer Entertainment America Llc Direction-conscious information sharing
US8712056B2 (en) 2010-06-03 2014-04-29 At&T Intellectual Property I, L.P. Secure mobile ad hoc network
EP2586000A4 (en) * 2010-06-24 2013-05-15 Microsoft Corp Virtual and location-based multiplayer gaming
US20110319148A1 (en) * 2010-06-24 2011-12-29 Microsoft Corporation Virtual and location-based multiplayer gaming
US9573064B2 (en) * 2010-06-24 2017-02-21 Microsoft Technology Licensing, Llc Virtual and location-based multiplayer gaming
WO2011163063A2 (en) 2010-06-24 2011-12-29 Microsoft Corporation Virtual and location-based multiplayer gaming
CN102958573A (en) * 2010-06-24 2013-03-06 微软公司 Virtual and location-based multiplayer gaming
EP2586000A2 (en) * 2010-06-24 2013-05-01 Microsoft Corporation Virtual and location-based multiplayer gaming
US8500031B2 (en) 2010-07-29 2013-08-06 Bank Of America Corporation Wearable article having point of sale payment functionality
US9177307B2 (en) * 2010-07-29 2015-11-03 Bank Of America Corporation Wearable financial indicator
WO2012026936A1 (en) * 2010-08-26 2012-03-01 Sony Ericsson Mobile Communications Ab A game engine module and method for playing an electronic game using location information
US10016684B2 (en) 2010-10-28 2018-07-10 At&T Intellectual Property I, L.P. Secure geographic based gaming
US8577718B2 (en) 2010-11-04 2013-11-05 Dw Associates, Llc Methods and systems for identifying, quantifying, analyzing, and optimizing the level of engagement of components within a defined ecosystem or context
US20120157210A1 (en) * 2010-12-15 2012-06-21 At&T Intellectual Property I Lp Geogame for mobile device
US11869160B2 (en) 2011-04-08 2024-01-09 Nant Holdings Ip, Llc Interference based augmented reality hosting platforms
US11854153B2 (en) 2011-04-08 2023-12-26 Nant Holdings Ip, Llc Interference based augmented reality hosting platforms
US8996359B2 (en) 2011-05-18 2015-03-31 Dw Associates, Llc Taxonomy and application of language analysis and processing
US10120438B2 (en) 2011-05-25 2018-11-06 Sony Interactive Entertainment Inc. Eye gaze to alter device behavior
US8773467B2 (en) 2011-06-13 2014-07-08 International Business Machines Corporation Enhanced asset management and planning system
US9698996B2 (en) 2011-06-27 2017-07-04 At&T Intellectual Property I, L.P. Information acquisition using a scalable wireless geocast protocol
US9973881B2 (en) 2011-06-27 2018-05-15 At&T Intellectual Property I, L.P. Information acquisition using a scalable wireless geocast protocol
US10279261B2 (en) 2011-06-27 2019-05-07 At&T Intellectual Property I, L.P. Virtual reality gaming utilizing mobile gaming
US11202961B2 (en) 2011-06-27 2021-12-21 At&T Intellectual Property I, L.P. Virtual reality gaming utilizing mobile gaming
US9161158B2 (en) 2011-06-27 2015-10-13 At&T Intellectual Property I, L.P. Information acquisition using a scalable wireless geocast protocol
US9319842B2 (en) 2011-06-27 2016-04-19 At&T Intellectual Property I, L.P. Mobile device configured point and shoot type weapon
US8952796B1 (en) 2011-06-28 2015-02-10 Dw Associates, Llc Enactive perception device
US9360961B2 (en) 2011-09-22 2016-06-07 Parade Technologies, Ltd. Methods and apparatus to associate a detected presence of a conductive object
WO2013043214A1 (en) * 2011-09-22 2013-03-28 Jonathan Peterson Methods and apparatus to associate a detected presence of a conductive object
US9495870B2 (en) 2011-10-20 2016-11-15 At&T Intellectual Property I, L.P. Vehicular communications using a scalable ad hoc geographic routing protocol
US9269353B1 (en) 2011-12-07 2016-02-23 Manu Rehani Methods and systems for measuring semantics in communications
US10462727B2 (en) 2011-12-15 2019-10-29 At&T Intellectual Property I, L.P. Media distribution via a scalable ad hoc geographic protocol
US10075893B2 (en) 2011-12-15 2018-09-11 At&T Intellectual Property I, L.P. Media distribution via a scalable ad hoc geographic protocol
US9020807B2 (en) 2012-01-18 2015-04-28 Dw Associates, Llc Format for displaying text analytics results
US9667513B1 (en) 2012-01-24 2017-05-30 Dw Associates, Llc Real-time autonomous organization
US11017630B2 (en) 2012-02-28 2021-05-25 Cfph, Llc Gaming through mobile or other devices
US10878636B2 (en) 2012-05-01 2020-12-29 Samsung Electronics Co., Ltd. System and method for selecting targets in an augmented reality environment
US10127735B2 (en) 2012-05-01 2018-11-13 Augmented Reality Holdings 2, Llc System, method and apparatus of eye tracking or gaze detection applications including facilitating action on or interaction with a simulated object
US11417066B2 (en) 2012-05-01 2022-08-16 Samsung Electronics Co., Ltd. System and method for selecting targets in an augmented reality environment
US10388070B2 (en) 2012-05-01 2019-08-20 Samsung Electronics Co., Ltd. System and method for selecting targets in an augmented reality environment
US9656170B2 (en) * 2012-07-03 2017-05-23 Kabushiki Kaisha Square Enix Game apparatus
US20140011585A1 (en) * 2012-07-03 2014-01-09 Kabushiki Kaisha Square Enix (Also Trading As Square Enix Co., Ltd.) Game apparatus
US9794860B2 (en) 2012-07-31 2017-10-17 At&T Intellectual Property I, L.P. Geocast-based situation awareness
US9071451B2 (en) 2012-07-31 2015-06-30 At&T Intellectual Property I, L.P. Geocast-based situation awareness
US9369295B2 (en) 2012-07-31 2016-06-14 At&T Intellectual Property I, L.P. Geocast-based situation awareness
US10130888B1 (en) 2012-07-31 2018-11-20 Niantic, Inc. Game data validation
US9669293B1 (en) * 2012-07-31 2017-06-06 Niantic, Inc. Game data validation
US9210589B2 (en) 2012-10-09 2015-12-08 At&T Intellectual Property I, L.P. Geocast protocol for wireless sensor network
US10511393B2 (en) 2012-12-12 2019-12-17 At&T Intellectual Property I, L.P. Geocast-based file transfer
US9660745B2 (en) 2012-12-12 2017-05-23 At&T Intellectual Property I, L.P. Geocast-based file transfer
US11392636B2 (en) 2013-10-17 2022-07-19 Nant Holdings Ip, Llc Augmented reality position-based service, methods, and systems
US20160250553A1 (en) * 2013-10-17 2016-09-01 Sony Computer Entertainment Inc. Game System, Game Controlling Method, and Game Controlling Program
US10471354B2 (en) * 2013-10-17 2019-11-12 Sony Interactive Entertainment Inc. Game system, game controlling method, and game controlling program
US11478703B2 (en) * 2013-10-17 2022-10-25 Sony Interactive Entertainment Inc. Game system, game controlling method, and game controlling program
US20150339952A1 (en) * 2014-05-24 2015-11-26 Nirit Glazer Method and system for using location services to teach concepts
US11216887B1 (en) * 2014-06-12 2022-01-04 Allstate Insurance Company Virtual simulation for insurance
US11861724B2 (en) 2014-06-12 2024-01-02 Allstate Insurance Company Virtual simulation for insurance
US11195233B1 (en) 2014-06-12 2021-12-07 Allstate Insurance Company Virtual simulation for insurance
US10719123B2 (en) 2014-07-15 2020-07-21 Nant Holdings Ip, Llc Multiparty object recognition
US11295293B2 (en) * 2016-01-07 2022-04-05 Worldpay, Llc Point of interaction device emulation for payment transaction simulation
US11238439B1 (en) 2016-01-07 2022-02-01 Worldpay, Llc Point of interaction device emulation for payment transaction simulation
US11551529B2 (en) 2016-07-20 2023-01-10 Winview, Inc. Method of generating separate contests of skill or chance from two independent events
US11541315B2 (en) 2017-07-22 2023-01-03 Niantic, Inc. Validating a player's real-world location using activity within a parallel-reality game
US10717005B2 (en) * 2017-07-22 2020-07-21 Niantic, Inc. Validating a player's real-world location using activity within a parallel reality game
US20190022530A1 (en) * 2017-07-22 2019-01-24 Niantic, Inc. Validating a player's real-world location using activity within a parallel reality game
US10835809B2 (en) * 2017-08-26 2020-11-17 Kristina Contreras Auditorium efficient tracking in auditory augmented reality
US11308765B2 (en) 2018-10-08 2022-04-19 Winview, Inc. Method and systems for reducing risk in setting odds for single fixed in-play propositions utilizing real time input
US10933317B2 (en) * 2019-03-15 2021-03-02 Sony Interactive Entertainment LLC Near real-time augmented reality video gaming system
US11890536B2 (en) 2019-03-15 2024-02-06 Sony Interactive Entertainment LLC Near real-time augmented reality video gaming system
CN111773658A (en) * 2020-07-03 2020-10-16 Zhuhai Kingsoft Online Game Technology Co., Ltd. Game interaction method and device based on computer vision library
US11420123B2 (en) * 2020-12-30 2022-08-23 Sony Interactive Entertainment Inc. Helper mode in spectated video games
US20220203232A1 (en) * 2020-12-30 2022-06-30 Sony Interactive Entertainment Inc. Helper mode in spectated video games
CN114629761A (en) * 2022-03-22 2022-06-14 Jilin Province Radio and Television Research Institute (Science and Technology Information Center of Jilin Province Radio and Television Bureau) Frequency modulation signal anti-triangulation frequency measurement demodulation method

Similar Documents

Publication | Publication Date | Title
US20070265089A1 (en) Simulated phenomena interaction game
US20050009608A1 (en) Commerce-enabled environment for interacting with simulated phenomena
US20040002843A1 (en) Method and system for interacting with simulated phenomena
JP7364627B2 (en) Verifying the player's real-world position using activities in a parallel reality game
US8275834B2 (en) Multi-modal, geo-tempo communications systems
US6691032B1 (en) System and method for executing user-definable events triggered through geolocational data describing zones of influence
KR101670147B1 (en) Portable device, virtual reality system and method
RU2497566C2 (en) Interactive media-system for simulation of real events
US20090005140A1 (en) Real world gaming framework
US20070190494A1 (en) Multiplayer gaming using gps-enabled portable gaming devices
US20190217200A1 (en) Computer systems and computer-implemented methods for conducting and playing personalized games based on vocal and non-vocal game entries
CN115138075A (en) Verifying player real world locations using landmark image data corresponding to verification paths
TW202300201A (en) Repeatability predictions of interest points
Kurczak et al. Hearing is believing: evaluating ambient audio for location-based games
KR102224182B1 (en) User terminal and golf information system including the same
WO2004101090A2 (en) Commerce-enabled environment for interacting with simulated phenomena
KR101806427B1 (en) Method for game service and apparatus executing the method
Venselaar Towards location- and orientation-aware gaming: Research on Location-based Games with additional compass features
US20240075380A1 (en) Using Location-Based Game to Generate Language Information
KR102342778B1 (en) Golf simulation device providing personalized avatar for user and operating method thereof
US20200396566A1 (en) Interactive dance contest
Woodward et al. A stand-alone proximity-based gaming wearable for remote physical activity monitoring
Samarin Indoor Positioning for Location-Based Applications
Pnevmatikakis et al. Game and multisensory driven ecosystem to an active lifestyle
KR20160008679A (en) The virtual racing system between indoor and outdoor users via location based application for smart device connected to indoor moving machine

Legal Events

Date | Code | Title | Description
AS Assignment

Owner name: GLOVENTURES LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ROBARTS, JAMES O.;ALVAREZ, CESAR A.;REEL/FRAME:016858/0896;SIGNING DATES FROM 20051202 TO 20051205

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION