US20030091970A1 - Method and apparatus for advanced leadership training simulation - Google Patents

Method and apparatus for advanced leadership training simulation

Info

Publication number
US20030091970A1
US20030091970A1 (Application US10/036,107)
Authority
US
United States
Prior art keywords
participants
simulation
simulation content
responses
generating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/036,107
Inventor
Nathaniel Fast
Andrew Gordon
Randall Hill
Nicholas Iuppa
Richard Lindheim
William Swartout
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Altsim Inc; University of Southern California
Original Assignee
Altsim Inc; University of Southern California
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Altsim Inc and University of Southern California
Priority to US10/036,107
Assigned to ALTSIM, INC. Assignors: FAST, NATHANIEL A.; IUPPA, NICHOLAS V.
Assigned to UNIVERSITY OF SOUTHERN CALIFORNIA Assignors: LINDHEIM, RICHARD D.; GORDON, ANDREW S.; HILL, JR., RANDALL W.; SWARTOUT, WILLIAM R.
Priority to PCT/US2002/032175
Priority to CA002466309A
Priority to EP02792186A
Priority to US10/356,462
Publication of US20030091970A1
Status: Abandoned

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 9/00: Simulators for teaching or training purposes
    • G09B 9/003: Simulators for teaching or training purposes for military purposes and tactics

Definitions

  • an experience manager 40 is an artificial intelligence rule engine residing on the story execution server 33 that monitors the progress of participants in the simulation and compares the progress to the pedagogical and dramatic goals of the simulation as expressed in the story representation system 20 . When differences cause specific rules to be triggered, the experience manager 40 generates an alert 41 and recommends modifications to the storyline that help keep the simulation on track. Participants' reactions to the simulation events are expressed through the interactive components, such as audio/video conference, that are part of the participant interface 31 .
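  • One way to picture such a rule engine is as a list of condition/recommendation pairs evaluated against the set of actions participants have taken so far. The following is a minimal Python sketch; all names (Rule, Alert, check_rules) are assumed for illustration and do not come from the patent:

```python
from dataclasses import dataclass
from typing import Callable, List, Set

@dataclass
class Alert:
    message: str
    recommendation: str

@dataclass
class Rule:
    # Fires when logged participant behavior deviates from the
    # expectations expressed in the story representation.
    condition: Callable[[Set[str]], bool]
    alert: Alert

def check_rules(rules: List[Rule], actions: Set[str]) -> List[Alert]:
    # Return the alerts whose deviation condition currently holds.
    return [r.alert for r in rules if r.condition(actions)]

rules = [
    Rule(lambda a: "deploy_second_unit" in a
                   and "check_enemy_locations" not in a,
         Alert("Unit deployed without checking enemy activity",
               "Employ coach to advise Battle Captain")),
]

print(check_rules(rules, {"deploy_second_unit"}))  # one alert fires
```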
  • an instructor interface 50 is a web client that communicates as a special class of participant through the story execution system 30 with the content database 34 and the experience manager 40 in order to present to the instructor an event-by-event description of the simulation as it actually unfolds and to display the participants' expected and actual behaviors.
  • the game manager or game monitor may use the instructor interface 50 in much the same way as an instructor would.
  • a plug-in such as Java Applets or Shockwave Player, manages the communications from the instructor interface 50 through the story execution system 30 in order to update media event records, call routines that would affect properties that influence the experience manager 40 , select alternative media for a participant, or manage the story state.
  • the instructor interface 50 includes a heading 51 , which indicates the name or number of the simulation. Also present on the instructor interface 50 is an experience manager display 52 , a story representation display 53 and a participant display 54 . Alerts 41 and corresponding recommendations generated by the experience manager 40 are displayed in the experience manager display 52 .
  • the story representation display 53 depicts the expected storyline and the way it is affected by the participants' behavior.
  • the participant display 54 along with various access tools 55 , gives the instructor access to all of the participant elements, such as maps, charts, newscasts, tools and so forth. The instructor may preview any or all of these elements and may also modify them as necessary.
  • the instructor interface 50 also includes various other tools, such as an email tool 56 for communicating with participants, a synthetic character development tool 57 for generating and inserting synthetic characters 60 (discussed below), and a clock 58 for keeping track of time in each story state.
  • the instructor interface 50 handles the master state of the story.
  • Present on the instructor interface 50 is a master list of states for all media to be presented in the expected story, along with a set of entries that represent each media element that must be selected in order to transition to the next state.
  • the state of the instructor interface 50 is defined as the totality of media that is currently displayed and that can be triggered in the immediate future by selecting any interactive control on the instructor interface 50 .
  • the transition from one state to the next is the updating of the media on the participant interface 31 by initiating a selection that alters what is seen or what may be selected in the immediate future.
  • an identification tag is sent to the instructor interface 50 to be presented as text and icons in the story representation display 53 and the participant display 54 .
  • each required item in the current state must be accessed while in that current state. Participants may access other media not related to the current state, and these will be transmitted to the instructor interface 50 as well, but without influencing the state transition. Once all of the required media elements are selected, the state then transitions to the next state, and this transition is reflected accordingly on the clock 58 .
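  • The transition rule just described (every required media element must be accessed in the current state before the story advances, while unrelated accesses are reported but do not influence the transition) can be sketched as a small state machine; the class and method names here are assumed:

```python
class StoryState:
    # Tracks the required media for the current state and advances
    # only once every required element has been accessed.

    def __init__(self, states):
        # states: list of (state_name, set of required media ids)
        self.states = states
        self.index = 0
        self.accessed = set()

    def access(self, media_id: str) -> bool:
        # Record a media access; return True if it caused a transition.
        # Accesses outside the required set would still be logged and
        # reported to the instructor interface, just not acted on here.
        name, required = self.states[self.index]
        self.accessed.add(media_id)
        if required <= self.accessed and self.index + 1 < len(self.states):
            self.index += 1
            self.accessed = set()
            return True
        return False

machine = StoryState([("briefing", {"map", "orders"}),
                      ("execution", {"radio"})])
machine.access("map")
assert machine.access("orders")  # both required items seen -> transition
```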
  • synthetic characters 60 which are computer-generated speaking images, may be introduced into the simulation for various reasons.
  • a synthetic character may be required to play the role of a character in the story or the role of another participant.
  • a synthetic character may also be required to provide coaching to participants, either automatically or through directives from the instructor via the instructor interface 50. Synthetic characters can play adversaries or friends or other personalities that say or do things that make it necessary for the story to head in the required direction. They can also substitute as participants when sufficient numbers of live simulation participants are unavailable.
  • An automated coaching system 70 is a computer program connected to the story execution system 30 that provides pre-programmed advice and instruction to the participants, either automatically or when prompted by the instructor. It uses artificial intelligence technology to monitor participant performance and recommend appropriate actions to the participants.
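  • A rules-based coach of this kind might simply pair detected deviations with canned advice delivered in the voice of a story character. A toy sketch follows; the persona name and the advice text are invented for illustration:

```python
def coach_message(deviation: str, persona: str = "Sgt. Ruiz") -> str:
    # Turn a detected deviation into in-character advice, so the coach
    # appears as one of the story's personalities rather than a tutor.
    advice = {
        "skipped_enemy_check": "Captain, I'd want eyes on the enemy's "
                               "last known positions before we commit "
                               "that second unit.",
    }
    return f"{persona}: {advice.get(deviation, 'Carry on.')}"

print(coach_message("skipped_enemy_check"))
```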
  • Authoring tools 80 which are applications connected to the story representation system 20 , enable non-programmers to create new simulations based on new or existing storylines.
  • the authoring tools 80 are a collection of applications that allow for the generation and integration of the media that represents the story into the content database 34 . They are image, video, audio, graphic and text editors, interactive tools (such as for simulated radio communications or radar displays), interface template layout editors, or tools that integrate these media elements into the story.
  • the authoring tools 80 enable non-programmers to create new scenarios that take into consideration pedagogical goals and the principles of good drama and storytelling.
  • Immersive audio 90 is connected to the story representation system 20 and may be used to give the experience an especially rich and authentic feel. Immersive audio 90 provides a level of realism that helps propel the participants' emotional states and raise the credibility of the simulation.
  • the system is preferably designed to support a story-based simulation.
  • Story-based simulations depend upon information transferred to the active participants and upon the participants' interaction with that content.
  • the information is presented to the participants in terms of content media.
  • the media may take any form of representation that the participant workstations 32 are able to present to the participants.
  • the media may play out in a multitude of representational contexts. For example, audio may be a recorded speech, the sound of a communications center or a simulated interactive radio call. These three examples could be represented with different participant interfaces, yet they are all audio files or streams.
  • the story execution system 30 obtains the simulation media components from the content database 34 .
  • All simulation-related media and references have record definitions in the content database 34 that define them as media events 100 .
  • Media events 100 are the master records for content that is presented by the story execution system 30 .
  • a media event 100 is a description of information related to the nature of the corresponding media component and the impact it has on the simulation, required content media, positioning and playback control information. Not only can media components be played out from the content database 34 , but they can be created and inserted into the content database 34 during authoring (i.e., internally) or from an external system during the runtime.
  • Information related to the story representation system 20 and required by the experience manager 40 is also expressed as a media event 100 .
  • the media events 100 not only allow for markers for authoring, monitoring and evaluation, but also provide required data to assist the experience manager 40 in processing directives.
  • Media events 100 can present differently to different participants and preferably support polymorphism, because participants' interfaces 31 may differ in display components, alert importance and desired representational form.
  • the records of media events 100 preferably contain one or more simulation event records 102 .
  • Each simulation event record 102 contains information related to action and performance of the simulation event in a particular participant interface.
  • the simulation event records 102 contain the parameters for the individual component they will represent. They also contain the identification symbols for the components and parameters that manage their layout. This data is transferred to and referenced by the participant manager 36 , which acts as the repository of current state information for the experience manager 40 .
  • the simulation event records 102 hold the information that is related to the role of the media in the participants' interfaces 31 . If required, a specific media event 100 may contain a separate simulation event record 102 for each participant. Different participants may utilize different layouts for the media in their interface.
  • a simulation event record 102 is linked to content media 104 through a media operation record 106 .
  • the media operation record 106 is specific to the simulation event record's 102 usage of the media.
  • the content media 104 is a generic media record that is indifferent to playback component requirements. This many-to-one relationship between media operation records 106 and content media 104 facilitates effective polymorphic usage of the media and its application. All participant interaction and simulation milestones are logged into the system activity database 37 , which allows for manual review and re-creation of a simulation.
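  • The record structure just described (a media event 100 owning per-participant simulation event records 102, each linked to shared content media 104 through a media operation record 106) maps naturally onto nested records. A sketch with assumed field names, using the audio-intercept example discussed later in this document:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ContentMedia:
    # Generic media record 104, indifferent to playback components.
    media_id: str
    uri: str                       # e.g. path to an audio file

@dataclass
class MediaOperation:
    # Media operation record 106: how one simulation event record uses
    # the shared media (many operations may point to one content media).
    media: ContentMedia
    playback: str                  # "forced_radio_alert", "menu_item", ...

@dataclass
class SimulationEventRecord:
    # Per-participant record 102: layout and presentation parameters.
    participant_role: str
    layout_id: str
    operation: MediaOperation

@dataclass
class MediaEvent:
    # Master record 100 for content presented by the story execution
    # system; may carry a separate event record per participant.
    event_id: str
    records: List[SimulationEventRecord] = field(default_factory=list)

intercept = ContentMedia("intercept-17", "media/intercept17.mp3")
event = MediaEvent("evt-intercept", records=[
    SimulationEventRecord("intelligence_officer", "alert_panel",
                          MediaOperation(intercept, "forced_radio_alert")),
    SimulationEventRecord("aviation_officer", "message_menu",
                          MediaOperation(intercept, "menu_item")),
])
```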
  • artificial intelligence engines are preferably rules-based systems, wherein a computer is programmed with a set of rules for generating output in response to various inputs corresponding to as many different scenarios as can be thought of.
  • the approach of the present invention can best be described with the term “story-channels,” to replace the traditional notion of a “storyline.”
  • the term is derived from the metaphor of the system of gullies and channels that are formed as rainwater drains into lakes and oceans.
  • the channels may be either linear (a single valley, for example) or may have a branching tree structure, caused when a main valley is fed by multiple sources.
  • the channels can be very wide, such that someone paddling a canoe could choose from a huge range of positions as they navigated along their way.
  • the invention's approach to interactive storytelling is akin to having the interactor direct a canoe upstream in a system of story-channels.
  • the storyline could potentially have significant branching structure, where certain decisions could have drastic effects on the way the story unfolds. However, most decisions will simply serve to bounce the actor from side to side within the boundaries of the channel walls, never allowing the actor to leave the channel system entirely to explore in some unforeseen direction.
  • This metaphor is useful in describing four key parts of the development and use of the invention.
  • the “authoring process” for interactive narrative is to construct the geographical terrain, to describe the (potentially branching) series of mental events that the actors should experience as they play their role in the story.
  • a “tracking process” monitors the position of the canoe, observing the actions of the characters controlled by the actors in order to gather evidence for whether or not the actors' mental states adhere to the designers' expectations.
  • a “containing process” will serve as the walls of the channels, employing a set of explicit narrative strategies to keep the actors on track and moving forward.
  • a “tutoring process” will serve as the actors' experienced canoeing partner, watching the way that they navigate upstream and looking for opportunities to throw an educationally valuable twist in their paths.
  • the simulation delivered to the participants preferably depicts a series of events, characters and places arranged in a specified order and presented via web pages and media components, such as video, audio, text and graphic elements.
  • the media components may include items such as news stories, media clips, fly-over video from reconnaissance aircraft, synthetic representations of characters, maps, electronic mail, database materials, character biographies and dossiers.
  • a specific “story-channel” (or a branching set of storylines) is constructed for the interactive environment, and the events that the participants are expected to experience are explicitly represented in the story representation system 20 .
  • the story execution system 30 initially selects the appropriate simulation elements from the content database 34 according to the story representation system 20 and the task model 22 .
  • the experience manager 40 tracks the participants' actions and reports them to the story execution system 30 for comparison with the story representation system 20 and the task model 22 .
  • Each participant action is identified, for example as being “as expected” or as “different from expectations,” although other types of identifiers may be used.
  • the experience manager 40 analyzes the participants' input and flags performance that does not correspond to expectations. In response to such unexpected performance, the experience manager 40 then generates the alert 41 and sends it to the instructor interface 50 .
  • the alert 41 not only points out when participant behavior deviates from expectations, but also suggests responses that the system or the instructor can make in reaction to the unexpected participant performance. These responses are designed to set the simulation story back on course or to plot out a new direction for the story.
  • Alerts 41 generated by the experience manager 40 pass to the instructor interface 50 for acceptance or rejection by the instructor and then back to the story execution system 30 for forwarding to the experience manager 40 .
  • Changes to events and media initiated by the instructor via the instructor interface 50 also pass to the story execution system 30 for forwarding to the experience manager 40 .
  • the chosen option is converted by the experience manager 40 into a media event 100 and inserted into the content database 34 for immediate or later playback to the participants.
  • when the experience manager 40 determines that it will generate a new media event 100, it creates a record that allows the story execution system 30 to present the media event 100 to the participant.
  • the experience manager 40 is not required to know about the intricacies of the particular participant interface 31 that the participant maintains, only the nature of the media event 100 that must be produced.
  • the participant manager 36 matches the media event 100 to the layout specifications for the participant interface 31 when triggered. Tags are substituted with the aid of the experience manager 40 and the media event 100 will be actualized by the participant workstation 32 .
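  • The round trip described above (an alert raised by the experience manager 40, accepted or rejected at the instructor interface 50, and an accepted option converted into a media event 100 for immediate or later playback) might look like the following sketch; the record shapes are purely illustrative:

```python
from typing import Callable, Dict, Optional

def process_alert(alert: Dict[str, str],
                  instructor_accepts: Callable[[Dict[str, str]], bool]
                  ) -> Optional[Dict[str, str]]:
    # If the instructor accepts the recommendation, convert it into a
    # new media-event record for playback; otherwise drop it.
    if not instructor_accepts(alert):
        return None
    return {
        "event_id": "evt-" + alert["id"],
        "source": "experience_manager",
        "content": alert["recommendation"],
        "schedule": "immediate",
    }

alert = {"id": "41",
         "recommendation": "Employ coach to advise Battle Captain"}
print(process_alert(alert, lambda a: True))
```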
  • in the military example, one of the participants may play the role of the Battle Captain, who runs the operation of the TOC and ensures proper flow of information into, within and out of the TOC.
  • the Battle Captain tracks the missions that are underway, tracks the activities of friendly and enemy forces, and reacts appropriately to unforeseen events.
  • the following goals may be set up as the Battle Captain's goal hierarchy: (i) assist the commanding officer, (ii) assist in unit planning, (iii) set the conditions for the success of the brigade, and (iv) ensure that information flows in the TOC.
  • Each of these goals may have one or more sub-goals, such as (i.a) provide advice and recommendations to the commanding officer, (ii.a) assist in developing troop-leading procedures, (iii.a) synchronize the efforts of the brigade staff, and (iv.a) repeatedly monitor radios, aviation reports, and activities of friendly units.
  • sub-goals may have one or more further sub-goals, and so on.
  • a plan may be devised that hypothesizes the expected plan of a Battle Captain for a typical 12-hour shift. For example: (i) arrive at the TOC, (ii) participate in battle update activity, (iii) collaboratively schedule first staff huddle for current staff, (iv) collaboratively schedule battle update activity for next shift, (v) begin monitoring for triggered sub-plans, (vi) begin the execution of repetitive sub-plans, (vii) terminate execution of repetitive sub-plans, (viii) participate in scheduled battle update activity, (ix) terminate execution of triggered sub-plans, and (x) leave the TOC.
  • a staff battle plan is identified for responding to battle drills. These plans are the military's tool for quickly responding to unforeseen or time-critical situations. For example, the system may simulate an unforeseen communications loss with a subordinate unit, necessitating a quick response from the Battle Captain. Identifying which staff battle drills are appropriate in any given task model generally depends on the storylines that are created for each simulation.
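  • Continuing the hypothetical task-model sketch given earlier (reusing its Goal, PlanStep and TaskModel classes), the Battle Captain's goal hierarchy, a condensed shift plan and a staff battle plan might be encoded as data roughly like this; the drill steps shown are invented for illustration:

```python
battle_captain_goals = Goal("Battle Captain duties", subgoals=[
    Goal("Assist the commanding officer",
         subgoals=[Goal("Provide advice and recommendations")]),
    Goal("Assist in unit planning",
         subgoals=[Goal("Assist in developing troop-leading procedures")]),
    Goal("Set the conditions for the success of the brigade",
         subgoals=[Goal("Synchronize the efforts of the brigade staff")]),
    Goal("Ensure that information flows in the TOC",
         subgoals=[Goal("Monitor radios, aviation reports and "
                        "activities of friendly units")]),
])

expected_shift = [
    PlanStep("Arrive at the TOC"),
    PlanStep("Participate in battle update activity"),
    PlanStep("Monitor radios", repeat_interval=30),        # repetitive
    PlanStep("Respond to communications loss",             # triggered
             trigger=lambda s: s.facts.get("comms_lost", False)),
    PlanStep("Participate in scheduled battle update activity"),
    PlanStep("Leave the TOC"),
]

task_model = TaskModel(
    goal_hierarchy=battle_captain_goals,
    expected_plan=expected_shift,
    staff_battle_plans={
        "communications loss": ["Verify the outage",
                                "Switch to an alternate net",
                                "Notify the commanding officer"],
    },
)
```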
  • Task models such as these may be authored at varying levels of detail and formality, depending on the specific needs that they will serve.
  • the content of a task model 22 preferably comes from doctrinal publications and military training manuals, but also preferably includes assumptions or tacit knowledge obtained from known military stories and anecdotes.
  • Scenarios and elements thereof may also be developed by artists and other creative people with skill in dramatic writing and storytelling, such as screenplay writers and movie makers.
  • if, however, the Battle Captain fails to check the activities and locations of enemy troops before deploying the second unit, the experience manager 40 generates an alert that the participant playing the Battle Captain is not acting as expected and sends the alert to the instructor interface 50 along with suggested responses for the instructor, such as "Employ coach to advise Battle Captain." The instructor may then accept or reject the experience manager's 40 recommendation, depending on the instructor's desire to set the simulation back on track, to plot out a new direction for the simulation, or simply to teach the participant a valuable lesson.
  • a specific media event 100 may contain a separate simulation event record 102 for each participant, and different participants may utilize different layouts for the media in their interface. For example, while the media delivered to a participant acting as a radar sector operator would be the same as the media delivered to a participant acting as a brigade commander, their access and presentation of that media would differ. Also, some media may be treated differently on different participants' interfaces. For example, an updated inventory of aircraft would be of great importance to an aviation officer but would be of passing interest to an intelligence officer. The notice may be visually highlighted in the aviation officer's interface through an alert.
  • the information related to the event must contain not only a layout identifier for the media, but also qualities for different participants in the story that affect the presentational rules for the media.
  • the media may differ from participant to participant.
  • the intelligence officer may receive an audio file of a conversation while the aviation officer may only have access to a text manuscript of the file.
  • the intelligence officer may have a simulated radio communication alert him that an active communication is taking place and force him to listen to it, while the aviation officer may gain access to the file only by navigating a series of menus that present the audio file in the context of the message. While the media file is the same, the display, presentation and impact on the participants differ greatly.
  • the designers of the simulation may anticipate many kinds of variations from the normal progress of the story. These variations can be pre-produced in traditional media forms and exist in the content database 34 for future use in the event that they are called for by the participant performance.
  • the use of these kinds of media, and the new directions in which they take the story, corresponds to the traditional branching storylines that have been used in interactive lessons in the past.
  • These options are preferably presented to the instructor on the instructor interface 50 before they are used in the simulation, although the experience manager 40 , as an artificial intelligence engine, may be programmed to deploy the elements as needed.
  • the instructor has the capability to edit many of the pre-produced options.
  • the automated coaching system 70 contains the artificial intelligence to understand the performance of the participants and judge whether it is correct or incorrect. It can then automatically and immediately articulate advice, examples or criticism to the participants that will help tutor them and guide them to the correct performance according to the pedagogical goals of the exercise. Because the simulation is story-based, the synthetic character 60 that delivers the advice to the participant can play the role of one of the characters in the story. As such, the character will display the personality and style of the character as it imparts information to the appropriate participant. As with the experience manager 40 , the artificial intelligence of the automated coaching system 70 is preferably rules-based. In another preferred embodiment, the artificial intelligence may be knowledge-based.
  • the story execution system 30 displays a media item on the participants' screens that portrays the synthetic character 60 saying the words.
  • this media item has both audio and visual components that cause the participants to believe that the character is a real human being participating in the simulation from an off-site location, using the same video-conferencing tools that are available to the participants.
  • the most believable media that could be presented to the participants is a pre-produced digital video file 120 , capturing an actor delivering a predetermined speech. Special effects may be added to the video file to simulate the effects of latency caused by such things as video-conferencing over the Internet, among other factors.
  • an algorithm could be created to transform textual input into audio output by voice synthesis, accompanied by a static photograph 122 of the speaking character. This enables the instructor to tailor the communications to the particular participants as necessary.
  • the synthetic text-to-speech algorithm could be used with articulation photographs 124 (i.e., photographs of actors articulating specific vowel and consonant sounds) or animated character models.
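  • The three fidelity tiers, a pre-produced video 120, synthesized speech over a static photograph 122, and synthesized speech with articulation photographs 124, suggest a simple fallback dispatch. A sketch follows; the function names and the text-to-speech stub are placeholders, not a real API:

```python
from typing import Dict

def synthesize_speech(text: str) -> bytes:
    # Stub: a real system would invoke a speech-synthesis engine here.
    return text.encode("utf-8")

def render_character(line_id: str, text: str, assets: Dict) -> Dict:
    # Pick the most believable presentation available for a synthetic
    # character's line, falling back through the three tiers above.
    if line_id in assets.get("videos", {}):       # pre-produced video 120
        return {"kind": "video", "file": assets["videos"][line_id]}
    audio = synthesize_speech(text)
    if assets.get("articulation_photos"):         # articulation photos 124
        return {"kind": "lipsync", "audio": audio,
                "frames": assets["articulation_photos"]}
    return {"kind": "still", "audio": audio,      # static photograph 122
            "photo": assets.get("photo", "speaker.png")}

print(render_character("line-7", "Hold your position.", {"photo": "cpt.png"}))
```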

Abstract

A method and apparatus is disclosed for advanced leadership training simulation wherein the simulation teaches skills in leadership and related topics through an Internet-based distance-learning architecture. The distance-learning features link trainees at remote locations into a single collaborative experience via computer networks. Instructional storylines are created and programmed into a computer and then delivered as a simulated but realistic story to one or more participants. The participants' reactions are monitored and compared with expected results. The storyline may be altered in response to the participants' responses, and synthetic characters may be generated to act as automated participants or coaches. Constructive feedback is provided to the participants during or after the simulation.

Description

  • [0001] This invention was made with Government support under Contract No. DAAD19-99-D-0046 awarded by the United States Army Research Office. The Government has certain rights in this invention.
  • FIELD OF THE INVENTION
  • The present invention relates generally to simulation technology, and more particularly to the use of simulation technology to teach skills in leadership and related topics through an Internet-based distance-learning architecture, as well as for general consumer gaming use. The distance-learning features link participants at remote locations into a single collaborative experience via computer networks. [0002]
  • BACKGROUND OF THE INVENTION
  • Recent United States Army studies have indicated that the leadership requirements of the modern war fighting force involve several significant differences from historical experience. Some factors of particular importance to the new generation of military leaders include: (i) the broad variety of people-centered, crisis-based military missions, including counter-terrorism, peacekeeping, operations in urban terrain and the newly emphasized homeland defense, in addition to more conventional warfare; (ii) the command of and dependence on a number of complex weapon, communication and intelligence systems involving advanced technology and specialized tasks; (iii) increased robotic and automated elements present on the battlefield; (iv) distributed forces at all echelons, requiring matching forms of distributed command; and (v) increased emphasis on collaboration in planning and operations. [0003]
  • The demographics of the military leadership corps are changing in several ways, and among the positive features of this change is a high level of sophistication and experience in computer use, including computer communication, gaming and data acquisition. This means that modern training simulations must be as motivating and as well-implemented as commercial gaming and information products in order to capture and hold the attention of the new army generation. [0004]
  • There are currently highly developed aircraft, tank and other ground vehicle virtual simulators that realistically present military terrain and the movement of the vehicles within the terrain. Such simulators are very effective at teaching basic operational skills. Networks of virtual simulators, including SIMNET, CCTT and the CATT family, are also available to teach leader coordination of combined arms weapons systems during conventional and MOUT (Military Operations on Urbanized Terrain) warfare in highly lifelike settings. Likewise, constructive simulations such as BBS, Janus, WARSIM, WARSIM 2000 and others are very effective in focusing on the tactical aspects of leadership—representing movement of material, weapons and personnel—particularly for higher echelon maneuvers. [0005]
  • But the same level of developmental effort has not been directed toward equally effective virtual and/or constructive simulators for training leadership and related cognitive skills in scenarios involving substantial human factor challenges. Driving a tank does not require the background knowledge, the collaboration or the complex political, diplomatic and psychological judgments that must be made in a difficult, people-centered crisis leadership situation. These judgments depend largely on the actual and estimated behavior of human participants, both friend and foe, in the crisis situation. And unfortunately, the complete modeling of complex human behavior is still beyond current technical capabilities. [0006]
  • As a result, these kinds of leadership skills have routinely been taught in the classroom through lectures and exercises featuring handouts and videotapes. It is possible for a good instructor to build the tension needed to approximate a leadership crisis, but sustaining the tension is difficult to do. Showing the heartbreak of the crisis and the gut-wrenching decisions that must be made is not the strong suit of paper-and-pencil materials or low budget, home-grown videos. [0007]
  • Large classroom exercises such as “Army After Next” and “The Crisis Decision Exercise” at the National Defense University have attempted to give some sense of the leaders' experience through week-long exercises that involve months of planning. These exercises are effective, but they cannot be distributed widely. Also, they are not easy to update and modify, and they require a large contingent of designers and developers, as well as on-site operators, to run them after months of planning time. [0008]
  • Story-based simulations, on the other hand, increase participant attention and retention because story-based experiences are more involving and easier to remember. Participants are also able to build judgmental, cognitive and decision-making leadership skills because the simulations provide realistic context in which to model outstanding leadership behavior. Story-based simulations can teach innovation because they are able to challenge participants by providing dramatic encounters with unexpected events and possibilities. Also, story-based simulations overcome the limitations of current constructive and virtual simulations in modeling complex human behavior, which is an increasing part of today's leadership challenges. [0009]
  • A prime consideration in training modern leadership skills is the establishment of a simulation network for collective training that reflects the real world network of distributed command nodes. Today's budgetary constraints, which necessitate the most efficient use of resources, require that collective as well as individualized training simulation be delivered remotely via distance learning as well as in classrooms, to avoid costly travel and subsistence. [0010]
  • Crisis-based leadership training requires an awareness of human factors that has been especially difficult to teach through media or the classroom. Giving complexity to an adversary's personality or turning a political confrontation into a battle of wits and will (things that, in fact, represent so much of today's military decision making) are easier to talk about than to practice or simulate. [0011]
  • From a computational perspective, the greatest challenge in the development of interactive storytelling environments is handling the autonomy and unpredictability of the participants. In non-interactive storytelling genres, the focus of development can be placed entirely on a single storyline that is to be experienced by the audience. However, when the audience itself becomes an actor in the story, the number of potential storylines that could unfold becomes much larger, based on the number of times the actors have the possibility of taking an action, and the number of possible actions that they could take at those times. [0012]
  • Given the autonomy of the actors' characters in the storyline, the story composer is immediately faced with a number of critical problems: How can the composer prevent the actor from taking actions in the imagined world that will move the story in a completely unforeseen direction, or from taking actions that will derail the storyline entirely? How can the composer allow the actors to make critical decisions, devise creative plans, and explore different options without giving up the narrative control that is necessary to deliver a compelling experience? And in the case of interactive tutoring systems, how can the composer understand enough about the beliefs and abilities of the actors to create an experience that has some real educational value, i.e., that improves the quality of the decisions that they would make when faced with similar situations in the real world?[0013]
  • Therefore, what is needed is a method and apparatus for advanced leadership training simulation that allows the participants to make real-time critical decisions, devise creative plans and explore different options without relinquishing the composer's narrative control and while allowing the composer to create an experience that improves the quality of leadership decision-making and delivers a compelling experience. [0014]
  • The present invention proposes to overcome the above limitations and problems through a broad, long-range solution that creates a unique, fully immersive type of leadership training simulation that provides complex, realistic human interactions through a highly innovative and adaptive story-generation technology. The same technology may also be applied to simulations created for consumer gaming. [0015]
  • SUMMARY OF THE INVENTION
  • The present application discloses simulation technology that teaches skills in leadership and related topics through an Internet-based distance-learning architecture. The simulations are extremely compelling and memorable because they employ dramatic, people-centered stories and real-time instructional feedback managed by artificial intelligence software tools. [0016]
  • The advanced leadership training simulation system comprises a story representation system for representing simulation content for use in the training simulation, a story execution system for delivering the simulation content to one or more participants via a computer network, and an experience manager system for monitoring the participants' responses to the simulation content, providing feedback to the participants and adjusting story events to match a change in the story's direction. [0017]
  • The story representation system provides a computer model of a story divided into discrete tasks, actions, goals or contingencies to be achieved by the participants in an engrossing story format. The experience manager monitors the progress of the simulation with respect to the story representation tasks achieved by the participants and reports progress to an instructor interface. An instructor monitoring the instructor interface may intervene in the simulation to adjust the direction of the simulation to maximize the dramatic and educational effectiveness of the simulation. In a gaming application, such a system would serve the needs of the game manager or game monitor. [0018]
  • The instructor may intervene in the simulation by changing the events of the story, by giving direct instruction to the participants, or by introducing a synthetic character into the simulation to change the simulation in a desired manner or to encourage certain responses from the participants. An automated coaching system may also be used as part of or instead of the instructor intervention. [0019]
  • The system may also comprise an immersive audio system for enhancing realistic situations and an authoring tools system for developing new simulation scenarios, as well as tools allowing interoperability with other systems and/or simulations.[0020]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • While the specification concludes with claims specifically pointing out and distinctly claiming the subject matter of the invention, it is believed the invention will be better understood from the following description taken in conjunction with the accompanying drawings wherein like reference characters designate the same or similar elements and wherein: [0021]
  • FIG. 1 is a diagram of the main components of the preferred embodiment as disclosed in the present application; [0022]
  • FIG. 2 is a diagram of certain components of the content delivery process of the preferred embodiment; [0023]
  • FIG. 3 is a diagram of the monitoring process of the preferred embodiment; [0024]
  • FIG. 4 illustrates an example of the monitoring process of the preferred embodiment; [0025]
  • FIG. 5 is a diagram of the media record structure of the preferred embodiment; and [0026]
  • FIG. 6 is a diagram of synthetic character generation for the preferred embodiment.[0027]
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • It is to be understood that the figures and descriptions of the present invention have been simplified to illustrate elements that are relevant for a clear understanding of the invention, while eliminating, for purposes of clarity, other elements that may be well known. Those of ordinary skill in the art will recognize that other elements are desirable and/or required in order to implement the present invention. However, because such elements are well known in the art, and because they do not facilitate a better understanding of the present invention, a discussion of such elements is not provided herein. The detailed description will be provided hereinbelow with reference to the attached drawings. [0028]
  • The present invention's distance-learning and general gaming technology employs a computer-based architecture that operates over the Internet to bring together dispersed participants into a single collaborative activity that simulates a realistic experience. However, the experience is designed to be fully immersive and engaging in many ways, and to have the interactivity of a leading-edge multi-player game in order to appeal to and motivate a new generation of game-savvy participants. [0029]
  • Referring to FIG. 1, the story representation system 20 is a computer program that provides a representation model within the system, i.e., it represents stories, structure and events in the program (akin to a storyboard), allows integration of media and characters into a series of events, and includes a task model 22. Expected participant behavior can be mapped onto the task model 22, which is a list of tasks to be performed and goals to be reached. By turning blocks of expository text into numbered sets of task steps, with preconditions, structured contingencies and action descriptions that are more algorithmic in nature, the task model 22 may be used as an expectation of participant action. By comparing the specific actions of a participant to the task model 22 for the participant's ideal real-world counterpart, the participant's progress may be tracked, and deviations warranting pedagogical or dramatic interventions may be flagged. The task model 22 preferably has three components. First, there is a goal hierarchy 24, which is an outline of all the goals that are to be achieved in the task, where each major goal may be subdivided into a set of sub-goals, which in turn may be subdivided into sub-goals of their own, and so on. Sub-goals may be thought of as necessary conditions for the achievement of the parent goal, but not always sufficient conditions. Second, there is an expected plan 26, which is a recipe for the successful attainment of the goals in the goal hierarchy 24. The expected plan 26 is initially presented as a linear plan of action, which itself begins the execution of a set of repetitive sub-plans and the monitoring for trigger conditions of a set of triggered plans. Thus, the expected plan 26 may branch into a system of plans and sub-plans, wherein the repetitive plans are those that the participant is expected to repeat at certain intervals, such as repeated communications with other officers or repeated checking of maps and charts. Triggered plans, as the name suggests, are triggered by certain events or conditions, such as transferring control to a Tactical Command Center once certain conditions are met. The third component of the task model is a staff battle plan 28, a set of prescribed activities that the participants and other characters are expected to follow in the event of an unforeseen occurrence. The occurrence is unforeseen, but, as with the expected plan 26, the possibilities and the proper activities for handling it are well defined. [0030]
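  • The three-part task model lends itself to a simple hierarchical encoding. Below is a minimal Python sketch under assumed names (Goal, PlanStep, TaskModel and so on, none of which come from the patent); a real implementation would attach much richer data to each node:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Optional

@dataclass
class Goal:
    # A node in the goal hierarchy 24; sub-goals are necessary (but
    # not always sufficient) conditions for achieving the parent goal.
    name: str
    subgoals: List["Goal"] = field(default_factory=list)

@dataclass
class SimulationState:
    # Runtime facts that trigger conditions may inspect (illustrative).
    facts: Dict[str, bool] = field(default_factory=dict)

@dataclass
class PlanStep:
    # One step in the expected plan 26. Repetitive steps recur at an
    # interval (simulated minutes); triggered steps fire when their
    # condition becomes true.
    description: str
    repeat_interval: Optional[int] = None
    trigger: Optional[Callable[[SimulationState], bool]] = None

@dataclass
class TaskModel:
    # The three components of task model 22.
    goal_hierarchy: Goal
    expected_plan: List[PlanStep]
    staff_battle_plans: Dict[str, List[str]]  # occurrence -> activities
```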
  • Referring to FIGS. 1 and 2, a story execution system 30 is a computer that selects the story elements and delivers them to the participants through a participant interface 31 connected to each participant's workstation 32. The story execution system 30 sends the story elements to the participant workstations 32 and records the participants' reactions to these elements, which are inputted into the participant workstations 32 by the participants. Thus, the story execution system 30 provides for both input and output for the run-time operation of the simulated environment. Additionally, participants preferably have video connectivity so that they can see their fellow participants on their computer screens. [0031]
  • The story execution system 30 includes a story execution server 33, which is a web server, such as an Apache Web Server, having additional server-side logic that manages the simulation. A content database 34 is linked to the story execution server 33 and delivers to it the media content for the simulation according to the programmed story execution server logic 35, which is derived from the task model 22 and responds to input from the participants and/or input from the instructor. The story execution server 33 then delivers the media content to the participants' workstations 32 through the participant interface 31, which relies on readily-available web technology to communicate with the story execution server 33. The story execution server 33 also creates and delivers the simulation's web pages in accordance with known web page construction techniques and inserts keyed Hypertext Reference (HREF) anchors into the interactive controls so that the server can track and relate the participants' actions. The participant workstations 32 can then run web browsers that use plug-in components, such as a Shockwave Player, and basic scripting for display and interaction with the media. This also allows the participants to use a variety of existing media presentation components without source modification. FIGS. 1 and 2 show three participant workstations 32, although more or fewer than three may be used as necessary, depending on the number of participants. [0032]
  • The [0033] story execution server 33 preferably includes a participant manager 36, which is a web page publishing engine that creates and maintains all interactions with the participant workstations 32. The participant manager 36 keeps the tables listing the current state of the participant interface and the triggers for the experience manager 40 (discussed below). It also outputs to a system activity database 37, which is the log of all activity of the participants and the system itself.
  • The [0034] story execution server 33 further includes a page output engine 38, which is a server that creates and delivers the formatted output (web pages and media content) to the participant workstations 32. The page output engine 38 utilizes tag substitution, which is managed by the participant manager 36. Tag substitution creates a normalized reference between the display control element on the participant workstations 32 and the related function on the story execution server 33 that the tag will trigger. The participant manager 36 can then pre-process and forward the related command to the story execution server 33 components to influence the simulation's future course. Dynamic tags are thereby generated that are specific to the singular nature of the currently running simulation, rather than relying upon hard-coded tags generated during authoring, which would not support a dynamic experience manager 40. This allows different simulation events to use the same content files in various ways and with various individuals with alternative feedback results.
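The following sketch illustrates tag substitution at page-output time: authored placeholders are replaced with references generated for the currently running simulation. The {{TAG}} syntax and function name are assumptions for illustration, not the patent's notation.

```python
import re

def substitute_tags(template: str, bindings: dict) -> str:
    """Replace authored tags like {{MAP}} with dynamically generated
    references for the currently running simulation; unknown tags are
    left intact so they can be resolved later."""
    def repl(match):
        tag = match.group(1)
        return bindings.get(tag, match.group(0))
    return re.sub(r"\{\{(\w+)\}\}", repl, template)

page = '<img src="{{MAP}}"> <a href="{{RADIO}}">Radio</a>'
print(substitute_tags(page, {"MAP": "/media/sector4.png",
                             "RADIO": "/track?key=ab12&event=7"}))
```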
  • The [0035] participant manager 36 is preferably broad enough to maintain connections to any remote entity that utilizes or communicates with the story execution server 33. This allows for a pass-through design where tagged elements can be normalized with remote simulations that may not be in the same simulation environment. The participant manager 36 provides a common interface through which the simulations may inter-communicate. The participant manager's 36 tag substitution allows alternative tag types for various participant types. Such a structure also allows for automated systems to interact as virtual participants or for media generators to create dynamic new media with the system as necessary. This remote capability frees up the story execution server 33 to support the output and create a platform-independent runtime environment for automated media generation.
  • The output page is created and delivered by dynamically allocating media elements into a set of templates that are specific to the participant. In this way, a unique control set can be created for each participant that is specific to that participant's function. This also allows for support of multiple browsers or client platforms that react in different ways to HTML layout rules. [0036]
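A minimal sketch of per-participant template allocation, using Python's string.Template for brevity; the role names, placeholder fields and markup are hypothetical.

```python
from string import Template

# Hypothetical per-role templates; layout differs by participant function.
TEMPLATES = {
    "battle_captain": "<h1>TOC Status</h1><img src='$MAP'> <a href='$RADIO'>Radio</a>",
    "intel_officer":  "<h1>Intel Feed</h1><img src='$MAP'> <a href='$DOSSIER'>Dossier</a>",
}

def render_page(role: str, media: dict) -> str:
    """Allocate media elements into the template for this participant's
    role; safe_substitute leaves any unresolved placeholder intact."""
    return Template(TEMPLATES[role]).safe_substitute(media)

print(render_page("intel_officer", {"MAP": "/media/sector4.png",
                                    "DOSSIER": "/media/subject17.html"}))
```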
  • Time is often of the essence for the participant's character, but occasionally time may be suspended while the participant receives advice or criticism from the instructor or, in a gaming application, from the game manager or game monitor. Thus, the [0037] story execution server 33 further includes a master clock 39, which can receive external commands that will suspend or halt the story execution server 33, or suspend a participant's time. Time preferably may be halted for the entire simulation, for any set of participants, or for any event. When time is halted for an individual or exclusive group during a simulation, it may be thought of as a suspension, after which the participant or participants will rejoin at the current system time, missing events that have occurred during the suspension period. If desired, default reactions specified during the authoring process may be automatically inserted by the story execution server 33. When the suspended participants re-enter the scenario, their participant interfaces 31 are refreshed to bring them up to date with the current simulation. This mechanism also handles participants who drop their connection to the story execution server 33: the server provides default responses to the scenario, enabling the simulation to play out without adversely affecting the continuity of the experience. Alternatively, the instructor may wish to use the dropped connection as part of the exercise.
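One possible shape for such a clock, sketched under the assumption of a monotonic wall-clock time source; the class and method names are illustrative, not taken from the disclosure.

```python
import time

class MasterClock:
    """Sketch of a simulation clock that can be halted globally or
    suspended for individual participants."""
    def __init__(self):
        self.start = time.monotonic()
        self.halted_at = None      # wall-clock moment of a global halt
        self.suspended = set()     # participant ids currently suspended

    def now(self) -> float:
        """Current simulation time in seconds (frozen while halted)."""
        if self.halted_at is not None:
            return self.halted_at - self.start
        return time.monotonic() - self.start

    def halt(self):
        self.halted_at = time.monotonic()

    def resume(self):
        if self.halted_at is not None:
            # Shift the origin so no simulation time elapsed while halted.
            self.start += time.monotonic() - self.halted_at
            self.halted_at = None

    def suspend_participant(self, pid: str):
        self.suspended.add(pid)

    def rejoin(self, pid: str) -> float:
        """Suspended participants rejoin at the current system time,
        having missed intervening events; their interface is refreshed."""
        self.suspended.discard(pid)
        return self.now()
```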
  • Referring to FIGS. 1, 2 and [0038] 3, an experience manager 40 is an artificial intelligence rule engine residing on the story execution server 33 that monitors the progress of participants in the simulation and compares the progress to the pedagogical and dramatic goals of the simulation as expressed in the story representation system 20. When differences cause specific rules to be triggered, the experience manager 40 generates an alert 41 and recommends modifications to the storyline that help keep the simulation on track. Participants' reactions to the simulation events are expressed through the interactive components, such as audio/video conference, that are part of the participant interface 31.
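As a rules engine, the experience manager can be sketched as a list of condition/recommendation pairs evaluated against tracked participant state. The rule shown anticipates the Battle Captain example developed later in this description; the state keys and names are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    """One experience-manager rule: a condition over tracked participant
    state plus the recommendation to surface when the condition fires."""
    name: str
    condition: Callable[[Dict], bool]
    recommendation: str

def run_rules(state: Dict, rules: List[Rule]) -> List[Dict]:
    """Compare tracked progress against expectations; emit an alert with
    a suggested response for any rule whose condition is met."""
    return [{"alert": r.name, "recommend": r.recommendation}
            for r in rules if r.condition(state)]

# Hypothetical rule anticipating the Battle Captain example given later.
rules = [Rule(
    name="second unit deployed without enemy activity check",
    condition=lambda s: s.get("deployed_second_unit")
                        and not s.get("checked_enemy_activity"),
    recommendation="Employ coach to advise Battle Captain")]

print(run_rules({"deployed_second_unit": True,
                 "checked_enemy_activity": False}, rules))
```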
  • Referring to FIGS. 1 and 4, an [0039] instructor interface 50 is a web client that communicates as a special class of participant through the story execution system 30 with the content database 34 and the experience manager 40 in order to present to the instructor an event-by-event description of the simulation as it actually unfolds and to display the participants' expected and actual behaviors. In a general gaming application, the game manager or game monitor may use the instructor interface 50 in much the same way as an instructor would. A plug-in, such as Java Applets or Shockwave Player, manages the communications from the instructor interface 50 through the story execution system 30 in order to update media event records, call routines that would affect properties that influence the experience manager 40, select alternative media for a participant, or manage the story state. Thus, the instructor may adjust the direction of the simulation to maximize the dramatic and educational effectiveness of the simulation and to interject new elements and information when necessary. The instructor interface 50 includes a heading 51, which indicates the name or number of the simulation. Also present on the instructor interface 50 is an experience manager display 52, a story representation display 53 and a participant display 54. Alerts 41 and corresponding recommendations generated by the experience manager 40 are displayed in the experience manager display 52. The story representation display 53 depicts the expected storyline and the way it is affected by the participants' behavior. The participant display 54, along with various access tools 55, gives the instructor access to all of the participant elements, such as maps, charts, newscasts, tools and so forth. The instructor may preview any or all of these elements and may also modify them as necessary. The instructor interface 50 also includes various other tools, such as an email tool 56 for communicating with participants, a synthetic character development tool 57 for generating and inserting synthetic characters 60 (discussed below), and a clock 58 for keeping track of time in each story state.
  • The [0040] instructor interface 50 handles the master state of the story. Present on the instructor interface 50 is a master list of states for all media to be presented in the expected story, along with a set of entries that represent each media element that must be selected in order to transition to the next state. The state of the instructor interface 50 is defined as the totality of media that is currently displayed and that can be triggered in the immediate future by selecting any interactive control on the instructor interface 50. The transition from one state to the next is the updating of the media on the participant interface 31 by initiating a selection that alters what is seen or what may be selected in the immediate future. As the participants access each media element, an identification tag is sent to the instructor interface 50 to be presented as text and icons in the story representation display 53 and the participant display 54. To progress to the next state in the story, each required item in the current state must be accessed while in that current state. Participants may access other media not related to the current state, and these will be transmitted to the instructor interface 50 as well, but without influencing the state transition. Once all of the required media elements are selected, the state then transitions to the next state, and this transition is reflected accordingly on the clock 58.
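The state-transition bookkeeping described above amounts to a small state machine: a state advances only once every required media element has been accessed within it. A minimal sketch, with hypothetical media identifiers:

```python
class StoryState:
    """Bookkeeping for the master story state: the state advances only
    once every required media element has been accessed within it."""
    def __init__(self, states):
        self.states = states        # list of sets of required media ids
        self.index = 0              # current state
        self.accessed = set()       # media accessed in the current state

    def access(self, media_id: str) -> bool:
        """Record a media access; return True if the state transitioned.
        Accesses outside the required set are still reported upstream
        but do not influence the transition."""
        self.accessed.add(media_id)
        if self.index < len(self.states) and self.states[self.index] <= self.accessed:
            self.index += 1
            self.accessed = set()
            return True
        return False

story = StoryState([{"briefing_video", "sector_map"}, {"radio_call"}])
story.access("sector_map")               # required, but state not yet complete
print(story.access("briefing_video"))    # True: transition to the next state
```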
  • Returning to FIG. 1, [0041] synthetic characters 60, which are computer-generated speaking images, may be introduced into the simulation for various reasons. For example, a synthetic character may be required to play the role of a character in the story or the role of another participant.
  • Alternatively, a synthetic character may be required to provide coaching to participants, either automatically or through directives from the instructor via the [0042] instructor interface 50. Synthetic characters can play adversaries, friends or other personalities that say or do things that steer the story in the required direction. They can also substitute as participants when sufficient numbers of live simulation participants are unavailable.
  • An [0043] automated coaching system 70 is a computer program connected to the story execution system 30 that provides pre-programmed advice and instruction to the participants, either automatically or when prompted by the instructor. It uses artificial intelligence technology to monitor participant performance and recommend appropriate actions to the participants.
  • [0044] Authoring tools 80, which are applications connected to the story representation system 20, enable non-programmers to create new simulations based on new or existing storylines. The authoring tools 80 are a collection of applications that allow for the generation and integration of the media that represents the story into the content database 34. They include image, video, audio, graphic and text editors, interactive tools (such as for simulated radio communications or radar displays), interface template layout editors, and tools that integrate these media elements into the story. These tools enable non-programmers to create new scenarios that take into consideration pedagogical goals and the principles of good drama and storytelling.
  • [0045] Immersive audio 90 is connected to the story representation system 20 and may be used to give the experience an especially rich and authentic feel. Immersive audio 90 provides a level of realism that helps propel the participants' emotional states and raise the credibility of the simulation.
  • The system is preferably designed to support a story-based simulation. Story-based simulations depend upon information transferred to the active participants and upon the participants' interaction with that content. The information is presented to the participants in terms of content media. The media may take any form of representation that the [0046] participant workstations 32 are able to present to the participants. The media may play out in a multitude of representational contexts. For example, audio may be a recorded speech, the sound of a communications center or a simulated interactive radio call. These three examples could be represented with different participant interfaces, yet they are all audio files or streams.
  • Referring to FIGS. 2 and 5, the [0047] story execution system 30 obtains the simulation media components from the content database 34. All simulation-related media and references have record definitions in the content database 34 that define them as media events 100. Media events 100 are the master records for content that is presented by the story execution system 30. A media event 100 is a description of information related to the nature of the corresponding media component and the impact it has on the simulation, required content media, positioning and playback control information. Not only can media components be played out from the content database 34, but they can be created and inserted into the content database 34 during authoring (i.e., internally) or from an external system during the runtime. Information related to the story representation system 20 and required by the experience manager 40 is also expressed as a media event 100. The media events 100 not only allow for markers for authoring, monitoring and evaluation, but also provide required data to assist the experience manager 40 in processing directives.
  • [0048] Media events 100 can appear differently to different participants and preferably support polymorphism, because participants' interfaces 31 may differ in terms of display components, alert importance and desired representational form.
  • The records of [0049] media events 100 preferably contain one or more simulation event records 102. Each simulation event record 102 contains information related to action and performance of the simulation event in a particular participant interface. The simulation event records 102 contain the parameters for the individual component they will represent. They also contain the identification symbols for the components and parameters that manage their layout. This data is transferred to and referenced by the participant manager 36, which acts as the repository of current state information for the experience manager 40.
  • The [0050] simulation event records 102 hold the information that is related to the role of the media in the participants' interfaces 31. If required, a specific media event 100 may contain a separate simulation event record 102 for each participant. Different participants may utilize different layouts for the media in their interface.
  • A [0051] simulation event record 102 is linked to content media 104 through a media operation record 106. The media operation record 106 is specific to the simulation event record's 102 usage of the media. The content media 104 is a generic media record that is indifferent to playback component requirements. This many-to-one relationship between media operation records 106 and content media 104 facilitates effective polymorphic usage of the media and its application. All participant interaction and simulation milestones are logged into the system activity database 37, which allows for manual review and re-creation of a simulation.
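The record relationships just described (a media event holding per-participant simulation event records, each bound through a media operation to shared content media) can be sketched as follows; the field names are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ContentMedia:
    """Generic media record 104, indifferent to playback component."""
    media_id: str
    uri: str

@dataclass
class MediaOperation:
    """Record 106: binds one simulation event's usage to a piece of
    content media; many operations may share one ContentMedia."""
    media: ContentMedia
    playback_params: dict

@dataclass
class SimulationEventRecord:
    """Record 102: per-participant layout and component parameters."""
    participant_id: str
    layout_id: str
    operation: MediaOperation

@dataclass
class MediaEvent:
    """Master record 100 for content presented by the execution system."""
    event_id: int
    simulation_events: List[SimulationEventRecord] = field(default_factory=list)
```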
  • Several of the components disclosed herein rely on artificial intelligence technology. These artificial intelligence engines are preferably rules-based systems, wherein a computer is programmed with a set of rules for generating output in response to various inputs, covering as many different scenarios as can be anticipated. [0052]
  • The approach of the present invention can best be described with the term “story-channels,” to replace the traditional notion of a “storyline.” The term is derived from the metaphor of the system of gullies and channels that are formed as rainwater drains into lakes and oceans. Globally, the channels may be either linear (a single valley, for example) or may have a branching tree structure, caused when a main valley is fed by multiple sources. Locally, the channels can be very wide, such that someone paddling a canoe could choose from a huge range of positions as they navigated along their way. In the same manner, the invention's approach to interactive storytelling is akin to making the interactor direct a canoe upstream in a system of story-channels. The storyline could potentially have significant branching structure, where certain decisions could have drastic effects on the way the story unfolds. However, most decisions will simply serve to bounce the actor from side to side within the boundaries of the channel walls, never allowing the actor to leave the channel system entirely to explore in some unforeseen direction. This metaphor is useful in describing four key parts of the development and use of the invention. First, the “authoring process” for interactive narrative is to construct the geographical terrain, to describe the (potentially branching) series of mental events that the actors should experience as they play their role in the story. Second, during the actual running of the simulation, a “tracking process” monitors the position of the canoe, observing the actions of the characters controlled by the actors in order to gather evidence for whether or not the actors' mental states adhere to the designers' expectations. Third, a “containing process” will serve as the walls of the channels, employing a set of explicit narrative strategies to keep the actors on track and moving forward. Fourth, a “tutoring process” will serve as the actors' experienced canoeing partner, watching the way that they navigate upstream and looking for opportunities to throw an educationally valuable twist in their paths. [0053]
  • The simulation delivered to the participants preferably depicts a series of events, characters and places arranged in a specified order and presented via web pages and media components, such as video, audio, text and graphic elements. The media components may include items such as news stories, media clips, fly-over video from reconnaissance aircraft, synthetic representations of characters, maps, electronic mail, database materials, character biographies and dossiers. Initially, a specific “story-channel” (or a branching set of storylines) is constructed for the interactive environment, and the events that the participants are expected to experience are explicitly represented in the [0054] story representation system 20. The story execution system 30 initially selects the appropriate simulation elements from the content database 34 according to the story representation system 20 and the task model 22.
  • The [0055] experience manager 40 tracks the participants' actions and reports them to the story execution system 30 for comparison with the story representation system 20 and the task model 22. Each participant action is identified, for example as being “as expected” or as “different from expectations,” although other types of identifiers may be used. The experience manager 40 analyzes the participants' input and flags performance that does not correspond to expectations. In response to such unexpected performance, the experience manager 40 then generates the alert 41 and sends it to the instructor interface 50. The alert 41 not only points out when participant behavior deviates from expectations, but also suggests responses that the system or the instructor can make in reaction to the unexpected participant performance. These responses are designed to set the simulation story back on course or to plot out a new direction for the story.
  • [0056] Alerts 41 generated by the experience manager 40 pass to the instructor interface 50 for acceptance or rejection by the instructor and then back to the story execution system 30 for forwarding to the experience manager 40. Changes to events and media initiated by the instructor via the instructor interface 50 also pass to the story execution system 30 for forwarding to the experience manager 40. The chosen option is converted by the experience manager 40 into a media event 100 and inserted into the content database 34 for immediate or later playback to the participants. Thus, when the experience manager 40 determines that it will generate a new media event 100, it will create a record that allows the story execution system 30 to present the media event 100 to the participant. As such, the experience manager 40 is not required to know about the intricacies of the particular participant interface 31 that the participant maintains, only the nature of the media event 100 that must be produced. The participant manager 36 matches the media event 100 to the layout specifications for the participant interface 31 when triggered. Tags are substituted with the aid of the experience manager 40 and the media event 100 will be actualized by the participant workstation 32.
  • By way of example, multiple participants may be placed in the roles of United States Army personnel in a Tactical Operations Center (TOC) during a Stability and Security Operation, and may be presented with a number of challenging decisions that must be addressed. Or, to imagine a simple example in general game-play, the United States Army personnel described below may be replaced with the crew of a 24th Century spacecraft. Actions and decisions that are made by the participants cause changes in the simulated environment, ultimately causing the system to adapt the storyline in ways that achieve certain pedagogical or dramatic goals. [0057]
  • In the military example, one of the participants may play the role of the Battle Captain, who runs the operation of the TOC and ensures proper flow of information into, within and out of the TOC. The Battle Captain tracks the missions that are underway, tracks the activities of friendly and enemy forces, and reacts appropriately to unforeseen events. Thus, the following goals, among many others, may be set up as the Battle Captain's goal hierarchy: (i) assist the commanding officer, (ii) assist in unit planning, (iii) set the conditions for the success of the brigade, and (iv) ensure that information flows in the TOC. Each of these goals may have one or more sub-goals, such as (i.a) provide advice and recommendations to the commanding officer, (ii.a) assist in developing troop-leading procedures, (iii.a) synchronize the efforts of the brigade staff, and (iv.a) repeatedly monitor radios, aviation reports, and activities of friendly units. Each of these sub-goals may have one or more further sub-goals, and so on. [0058]
  • Next, by combining the goal hierarchy with evidence from actual military documents, a plan may be devised that hypothesizes the expected plan of a Battle Captain for a typical 12-hour shift. For example: (i) arrive at the TOC, (ii) participate in battle update activity, (iii) collaboratively schedule first staff huddle for current staff, (iv) collaboratively schedule battle update activity for next shift, (v) begin monitoring for triggered sub-plans, (vi) begin the execution of repetitive sub-plans, (vii) terminate execution of repetitive sub-plans, (viii) participate in scheduled battle update activity, (ix) terminate execution of triggered sub-plans, and (x) leave the TOC. [0059]
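Written in the task-model form sketched earlier, the Battle Captain's goals and expected shift plan reduce to plain data. This illustrative encoding takes its entries directly from the two paragraphs above; only the variable names are assumptions.

```python
# Goal hierarchy for the Battle Captain (goals and sub-goals from above).
battle_captain_goals = {
    "assist the commanding officer": [
        "provide advice and recommendations to the commanding officer"],
    "assist in unit planning": [
        "assist in developing troop-leading procedures"],
    "set the conditions for the success of the brigade": [
        "synchronize the efforts of the brigade staff"],
    "ensure that information flows in the TOC": [
        "repeatedly monitor radios, aviation reports, and friendly units"],
}

# Expected plan for a typical 12-hour shift, as enumerated above.
expected_shift_plan = [
    "arrive at the TOC",
    "participate in battle update activity",
    "collaboratively schedule first staff huddle for current staff",
    "collaboratively schedule battle update activity for next shift",
    "begin monitoring for triggered sub-plans",
    "begin the execution of repetitive sub-plans",
    "terminate execution of repetitive sub-plans",
    "participate in scheduled battle update activity",
    "terminate execution of triggered sub-plans",
    "leave the TOC",
]
```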
  • Next in the example, a staff battle plan is identified for responding to battle drills. These plans are the military's tool for quickly responding to unforeseen or time-critical situations. For example, the system may simulate an unforeseen communications loss with a subordinate unit, necessitating a quick response from the Battle Captain. Identifying which staff battle drills are appropriate in any given task model generally depends on the storylines that are created for each simulation. [0060]
  • Task models such as these may be authored at varying levels of detail and formality, depending on the specific needs that they will serve. The content of a [0061] task model 22 preferably comes from doctrinal publications and military training manuals, but also preferably includes assumptions or tacit knowledge obtained from known military stories and anecdotes.
  • Scenarios and elements thereof may also be developed by artists and other creative people with skill in dramatic writing and storytelling, such as screenplay writers and movie makers. [0062]
  • Continuing with the Battle Captain example, after an unforeseen loss of communications with a subordinate unit, it may be expected that the Battle Captain first checks recent activities and locations of enemy troops and then sends a second unit towards the location of the first unit. [0063]
  • If, however, the Battle Captain fails to check the activities and locations of enemy troops before deploying the second unit, the [0064] experience manager 40 generates an alert that the participant playing the Battle Captain is not acting as expected and sends the alert to the instructor interface 50 along with suggested responses for the instructor, such as “Employ coach to advise Battle Captain.” The instructor may then accept or reject the experience manager's 40 recommendation, depending on the instructor's desire to set the simulation back on track, to plot out a new direction for the simulation, or simply to teach the participant a valuable lesson.
  • As discussed, a [0065] specific media event 100 may contain a separate simulation event record 102 for each participant, and different participants may utilize different layouts for the media in their interface. For example, while the media delivered to a participant acting as a radar sector operator would be the same as the media delivered to a participant acting as a brigade commander, their access to and presentation of that media would differ. Also, some media may be treated differently on different participants' interfaces. For example, an updated inventory of aircraft would be of great importance to an aviation officer but would be of passing interest to an intelligence officer. The notice may be visually highlighted in the aviation officer's interface through an alert. As such, the information related to the event must contain not only a layout identifier for the media, but also qualities for different participants in the story that affect the presentational rules for the media. Also, the media may differ from participant to participant. The intelligence officer may receive an audio file of a conversation while the aviation officer may only have access to a text transcript of the file. On the other hand, the intelligence officer may have a simulated radio communication alert him that an active communication is taking place and force him to listen to it, while the aviation officer may gain access to the file only by navigating a series of menus that present the audio file in the context of the message. While the media file is the same, the display, presentation and impact on the participants differ greatly.
  • The designers of the simulation may anticipate many kinds of variations from the normal progress of the story. These variations can be pre-produced in traditional media forms and stored in the [0066] content database 34 for future use in the event that they are called for by the participant performance. The mapping between these kinds of media and the new directions in which they take the story corresponds to the traditional branching storylines that have been used in interactive lessons in the past. These options are preferably presented to the instructor on the instructor interface 50 before they are used in the simulation, although the experience manager 40, as an artificial intelligence engine, may be programmed to deploy the elements as needed. Moreover, the instructor has the capability to edit many of the pre-produced options.
  • Other options, such as the use of the [0067] synthetic characters 60 as coaches, are not pre-produced but can be generated by the system or the instructor on the spot. The synthetic character engine has the capability to select an appropriate response to the participant action and create that response in real time. However, the generated response is preferably presented to the instructor in the instructor interface 50 so that it can be approved and/or edited by the instructor before it is implemented. Once the response is created and approved, the experience manager 40 sends it to the story execution system 30. Approved options are converted by the experience manager 40 into media event records and inserted into the content database 34.
  • The [0068] automated coaching system 70 contains the artificial intelligence to understand the performance of the participants and judge whether it is correct or incorrect. It can then automatically and immediately articulate advice, examples or criticism to the participants that will help tutor them and guide them to the correct performance according to the pedagogical goals of the exercise. Because the simulation is story-based, the synthetic character 60 that delivers the advice to the participant can play the role of one of the characters in the story. As such, the character will display the personality and style of the character as it imparts information to the appropriate participant. As with the experience manager 40, the artificial intelligence of the automated coaching system 70 is preferably rules-based. In another preferred embodiment, the artificial intelligence may be knowledge-based.
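A coaching response of this kind might be selected from a small table of authored, in-character advice keyed by the flagged deviation. The following sketch is a simplification under that assumption; the deviation keys, character name and advice lines are all hypothetical.

```python
COACH_LINES = {
    # Hypothetical advice keyed by a flagged deviation; the in-story
    # character's persona colors the wording of the advice.
    "second unit deployed without enemy activity check":
        "Captain, I'd want eyes on enemy activity before we commit a second unit.",
}

def coach(deviation: str, character: str = "S2 Officer") -> str:
    """Return in-character advice for a flagged deviation, falling back
    to a generic nudge when no specific line has been authored."""
    line = COACH_LINES.get(
        deviation, "Walk me through your reasoning on that last call.")
    return f"{character}: {line}"

print(coach("second unit deployed without enemy activity check"))
```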
  • Turning to FIG. 6, once the decision has been made by the system and the instructor to deploy a [0069] synthetic character 60 with a specific statement, the story execution system 30 displays a media item on the participants' screens that portrays the synthetic character 60 saying the words. Preferably, this media item has both audio and visual components that cause the participants to believe that the character is a real human being participating in the simulation from an off-site location and using the same video-conferencing tools that are available to the participants.
  • The most believable media that could be presented to the participants is a pre-produced [0070] digital video file 120 capturing an actor delivering a predetermined speech. Special effects may be added to the video file to simulate the effects of latency caused by such things as video-conferencing over the Internet, among other factors. Alternatively, an algorithm could be created to transform textual input into audio output by voice synthesis, accompanied by a static photograph 122 of the speaking character. This enables the instructor to tailor the communications to the particular participants as necessary. As a further alternative, the synthetic text-to-speech algorithm could be used with articulation photographs 124 (i.e., photographs of actors articulating specific vowel and consonant sounds) or animated character models.
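The articulation-photograph alternative amounts to mapping each synthesized sound to the photograph of an actor forming it. The sketch below shows only that mapping step; the phoneme labels, file names and fallback frame are assumptions, and a real system would pair this sequence with the synthesized audio track.

```python
# Hypothetical mapping from phoneme classes to articulation photographs
# (photographs of an actor articulating each class of sound).
ARTICULATION_PHOTOS = {
    "AA": "actor_open_vowel.jpg",     # open-vowel sounds
    "M":  "actor_lips_closed.jpg",    # bilabial consonants
    "F":  "actor_teeth_on_lip.jpg",   # labiodental consonants
}

def photo_sequence(phonemes):
    """Pick the articulation photograph to display for each phoneme,
    falling back to a neutral frame for unmapped sounds."""
    return [ARTICULATION_PHOTOS.get(p, "actor_neutral.jpg") for p in phonemes]

print(photo_sequence(["M", "AA", "F"]))  # frames for a short utterance
```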
  • Although the invention has been described in terms of particular embodiments in an application, one of ordinary skill in the art, in light of the teachings herein, can generate additional embodiments and modifications without departing from the spirit of, or exceeding the scope of, the claimed invention. Nothing in the above description is meant to limit the present invention to any specific materials, geometry, or orientation of elements. Many part/orientation substitutions are contemplated within the scope of the present invention and will be apparent to those skilled in the art. Accordingly, it is understood that the drawings, descriptions and examples herein are proffered only to facilitate comprehension of the invention and should not be construed to limit the scope thereof. [0071]

Claims (39)

What is claimed is:
1. A method of training comprising the steps of
generating simulation content;
delivering the simulation content to one or more participants via a computer network;
monitoring the one or more participants' responses to the simulation content; and
providing feedback to the one or more participants.
2. The method of claim 1, further including the step of generating one or more synthetic characters.
3. The method of claim 2, wherein the feedback is provided by the one or more synthetic characters.
4. The method of claim 2, wherein the one or more synthetic characters are used to alter the simulation content.
5. The method of claim 1, wherein the feedback is provided by an instructor.
6. The method of claim 1, further comprising the steps of
generating a representation of expected responses to the simulation content; and
alerting an instructor of the one or more participants' responses when the one or more participants' responses deviate from the representation of expected responses to the simulation content.
7. The method of claim 1, further comprising the step of altering the simulation content in response to the one or more participants' responses.
8. The method of claim 1, wherein the simulation content depicts military scenarios.
9. The method of claim 1, further comprising the step of delivering immersive audio to the one or more participants.
10. The method of claim 1, wherein the computer network comprises the Internet.
11. A training apparatus comprising
means for generating simulation content;
means for delivering the simulation content to one or more participants via a computer network;
means for monitoring the one or more participants' responses to the simulation content; and
means for providing feedback to the one or more participants.
12. The apparatus of claim 11, further including means for generating one or more synthetic characters.
13. The apparatus of claim 12, wherein the feedback is provided by the one or more synthetic characters.
14. The apparatus of claim 12, wherein the one or more synthetic characters are used to alter the simulation content.
15. The apparatus of claim 11, wherein the feedback is provided by an instructor.
16. The apparatus of claim 11, further comprising
means for generating a representation of expected responses to the simulation content; and
means for alerting an instructor of the one or more participants' responses when the one or more participants' responses deviate from the representation of expected responses to the simulation content.
17. The apparatus of claim 11, further comprising means for altering the simulation content in response to the one or more participants' responses.
18. The apparatus of claim 11, wherein the simulation content depicts military scenarios.
19. The apparatus of claim 11, further comprising a means for delivering immersive audio to the one or more participants.
20. The apparatus of claim 11, wherein the computer network comprises the Internet.
21. A simulation method comprising the steps of
generating simulation content;
generating a representation of expected responses to the simulation content;
delivering the simulation content to one or more participants via a computer network;
monitoring the one or more participants' responses to the simulation content;
comparing the one or more participants' responses with the representation of expected responses to the simulation content; and
altering the simulation content in response to the one or more participants' responses.
22. The method of claim 21, further including the step of generating one or more synthetic characters.
23. The method of claim 21, wherein the simulation content depicts military scenarios.
24. The method of claim 21, further comprising the step of delivering immersive audio to the one or more participants.
25. The method of claim 21, wherein the computer network comprises the Internet.
26. A simulation apparatus comprising
means for generating simulation content;
means for generating a representation of expected responses to the simulation content;
means for delivering the simulation content to one or more participants via a computer network;
means for monitoring the one or more participants' responses to the simulation content;
means for comparing the one or more participants' responses with the representation of expected responses to the simulation content; and
means for altering the simulation content in response to the one or more participants' responses.
27. The apparatus of claim 26, further including a means for generating one or more synthetic characters.
28. The apparatus of claim 26, wherein the simulation content depicts military scenarios.
29. The apparatus of claim 26, further comprising a means for delivering immersive audio to the one or more participants.
30. The apparatus of claim 26, wherein the computer network comprises the Internet.
31. A simulation apparatus comprising
a database containing simulation content;
one or more participant workstations;
a web server for delivering the simulation content to the one or more participant workstations;
an instructor interface for displaying information to an instructor and receiving input from the instructor;
one or more participant interfaces connecting the web server to the respective one or more participant workstations; and
an artificial intelligence engine for analyzing input into the one or more participant workstations and altering the simulation content in response to the input.
32. The apparatus of claim 31, further comprising a means for generating one or more synthetic characters.
33. The apparatus of claim 32, wherein the one or more synthetic characters are represented by digital video.
34. The apparatus of claim 32, wherein the one or more synthetic characters are represented by one or more static photographs.
35. The apparatus of claim 32, wherein the one or more synthetic characters are represented by a plurality of articulation photographs.
36. The apparatus of claim 31, further comprising one or more authoring tools for generating additional simulation content.
37. The apparatus of claim 31, further comprising a means for delivering immersive audio to the one or more participant workstations.
38. The apparatus of claim 31, further comprising a means for providing feedback.
39. The apparatus of claim 31, further comprising a system activity database for logging information generated in response to the simulation content.
US10/036,107 2001-11-09 2001-11-09 Method and apparatus for advanced leadership training simulation Abandoned US20030091970A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US10/036,107 US20030091970A1 (en) 2001-11-09 2001-11-09 Method and apparatus for advanced leadership training simulation
PCT/US2002/032175 WO2003042955A1 (en) 2001-11-09 2002-10-08 Method and apparatus for advanced leadership training simulation
CA002466309A CA2466309A1 (en) 2001-11-09 2002-10-08 Method and apparatus for advanced leadership training simulation
EP02792186A EP1444673A1 (en) 2001-11-09 2002-10-08 Method and apparatus for advanced leadership training simulation
US10/356,462 US7155158B1 (en) 2001-11-09 2003-01-31 Method and apparatus for advanced leadership training simulation and gaming applications

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US10/036,107 US20030091970A1 (en) 2001-11-09 2001-11-09 Method and apparatus for advanced leadership training simulation

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US10/356,462 Continuation-In-Part US7155158B1 (en) 2001-11-09 2003-01-31 Method and apparatus for advanced leadership training simulation and gaming applications

Publications (1)

Publication Number Publication Date
US20030091970A1 true US20030091970A1 (en) 2003-05-15

Family

ID=21886652

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/036,107 Abandoned US20030091970A1 (en) 2001-11-09 2001-11-09 Method and apparatus for advanced leadership training simulation
US10/356,462 Expired - Fee Related US7155158B1 (en) 2001-11-09 2003-01-31 Method and apparatus for advanced leadership training simulation and gaming applications

Family Applications After (1)

Application Number Title Priority Date Filing Date
US10/356,462 Expired - Fee Related US7155158B1 (en) 2001-11-09 2003-01-31 Method and apparatus for advanced leadership training simulation and gaming applications

Country Status (4)

Country Link
US (2) US20030091970A1 (en)
EP (1) EP1444673A1 (en)
CA (1) CA2466309A1 (en)
WO (1) WO2003042955A1 (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030113698A1 (en) * 2001-12-14 2003-06-19 Von Der Geest Michael Method and system for developing teaching and leadership characteristics and skills
US20030125924A1 (en) * 2001-12-28 2003-07-03 Testout Corporation System and method for simulating computer network devices for competency training and testing simulations
US20030154204A1 (en) * 2002-01-14 2003-08-14 Kathy Chen-Wright System and method for a hierarchical database management system for educational training and competency testing simulations
WO2003098376A2 (en) * 2002-05-21 2003-11-27 Tay Kim Huat Abel Paschal Strategic business simulation
US20040180315A1 (en) * 2003-03-14 2004-09-16 Toohey Shane A. Training system & method
US20040234934A1 (en) * 2003-05-23 2004-11-25 Kevin Shin Educational and training system
US20060048092A1 (en) * 2004-08-31 2006-03-02 Kirkley Eugene H Jr Object oriented mixed reality and video game authoring tool system and method
US20070020604A1 (en) * 2005-07-19 2007-01-25 Pranaya Chulet A Rich Media System and Method For Learning And Entertainment
US20070117503A1 (en) * 2005-11-21 2007-05-24 Warminsky Michael F Airflow ceiling ventilation system for an armored tactical personnel and collective training facility
US20070113487A1 (en) * 2005-11-21 2007-05-24 Amec Earth & Environmental, Inc. Re-configurable armored tactical personnel and collective training facility
US20070260968A1 (en) * 2004-04-16 2007-11-08 Howard Johnathon E Editing system for audiovisual works and corresponding text for television news
US20080027692A1 (en) * 2003-01-29 2008-01-31 Wylci Fables Data visualization methods for simulation modeling of agent behavioral expression
US20080050711A1 (en) * 2006-08-08 2008-02-28 Doswell Jayfus T Modulating Computer System Useful for Enhancing Learning
US20080077870A1 (en) * 2004-01-09 2008-03-27 Suzanne Napoleon Method and apparatus for producing structured sgml/xml student compositions
US20080114708A1 (en) * 2006-05-05 2008-05-15 Lockheed Martin Corporation Systems and Methods of Developing Intuitive Decision-Making Trainers
US20080166692A1 (en) * 2007-01-08 2008-07-10 David Smith System and method of reinforcing learning
US20080261185A1 (en) * 2007-04-18 2008-10-23 Aha! Process, Incorporated Simulated teaching environment
US20090282749A1 (en) * 2005-11-21 2009-11-19 Warminsky Michael F Re-configurable armored tactical personnel and collective training facility
US20100227297A1 (en) * 2005-09-20 2010-09-09 Raydon Corporation Multi-media object identification system with comparative magnification response and self-evolving scoring
US20120165102A1 (en) * 2009-08-14 2012-06-28 Wikistrat Ltd. Distributed multi-player strategy game
US20140065576A1 (en) * 2012-05-07 2014-03-06 Eads Deutschland Gmbh Device for Visualizing Military Operations
CN105225564A (en) * 2014-06-10 2016-01-06 河南省电力勘测设计院 Based on the communication of power system anti-accident exercising platform of three-dimensional
CN111308907A (en) * 2019-12-20 2020-06-19 中国航空工业集团公司沈阳飞机设计研究所 Automatic battle-level airplane simulation control method, control plug-in and simulation system

Families Citing this family (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8147334B2 (en) 2003-09-04 2012-04-03 Jean-Marie Gatto Universal game server
US20090035736A1 (en) * 2004-01-16 2009-02-05 Harold Wolpert Real-time training simulation system and method
US7761262B2 (en) * 2004-01-28 2010-07-20 Hntb Corporation Method and system for implementing a training facility
US20060008784A1 (en) * 2004-07-09 2006-01-12 Don Schmincke Leadership training method
US20070111169A1 (en) * 2005-08-15 2007-05-17 University Of Southern California Interactive Story Development System with Automated Goal Prioritization
US20070136672A1 (en) * 2005-12-12 2007-06-14 Michael Cooper Simulation authoring tool
WO2007076513A2 (en) * 2005-12-27 2007-07-05 Bonnie Johnson Virtual counseling practice
US8277315B2 (en) * 2006-03-01 2012-10-02 Hybrid Learning Systems, Inc. Game simulation based on current events
US7599861B2 (en) 2006-03-02 2009-10-06 Convergys Customer Management Group, Inc. System and method for closed loop decisionmaking in an automated care system
US7809663B1 (en) 2006-05-22 2010-10-05 Convergys Cmg Utah, Inc. System and method for supporting the utilization of machine language
US8379830B1 (en) 2006-05-22 2013-02-19 Convergys Customer Management Delaware Llc System and method for automated customer service with contingent live interaction
GB0616107D0 (en) 2006-08-15 2006-09-20 Iti Scotland Ltd Games-based learning
US8245157B2 (en) * 2006-08-23 2012-08-14 Kelly Alan D Decision making and planning system, method, and software utilizing a taxonomic table of irreducible strategies or plays
US8571463B2 (en) 2007-01-30 2013-10-29 Breakthrough Performancetech, Llc Systems and methods for computerized interactive skill training
US8714987B2 (en) * 2007-03-28 2014-05-06 Breakthrough Performancetech, Llc Systems and methods for computerized interactive training
US8622831B2 (en) * 2007-06-21 2014-01-07 Microsoft Corporation Responsive cutscenes in video games
US8368721B2 (en) * 2007-10-06 2013-02-05 Mccoy Anthony Apparatus and method for on-field virtual reality simulation of US football and other sports
US20090138813A1 (en) 2007-11-14 2009-05-28 Lamontagne Entertainment, Llc System and method for providing an objective to a user
US20090124386A1 (en) * 2007-11-14 2009-05-14 Lamontagne Joel David Method and system for randomly altering information and content within web pages to create a new and unique website and online game
US7890534B2 (en) * 2007-12-28 2011-02-15 Microsoft Corporation Dynamic storybook
US8226477B1 (en) * 2008-07-23 2012-07-24 Liveops, Inc. Automatic simulation of call center scenarios
AU2009276721B2 (en) 2008-07-28 2015-06-18 Breakthrough Performancetech, Llc Systems and methods for computerized interactive skill training
US8489887B1 (en) 2008-12-31 2013-07-16 Bank Of America Corporation Biometric authentication for video communication sessions
US20110113315A1 (en) * 2008-12-31 2011-05-12 Microsoft Corporation Computer-assisted rich interactive narrative (rin) generation
US20110113316A1 (en) * 2008-12-31 2011-05-12 Microsoft Corporation Authoring tools for rich interactive narratives
US9092437B2 (en) * 2008-12-31 2015-07-28 Microsoft Technology Licensing, Llc Experience streams for rich interactive narratives
US20120310772A1 (en) * 2009-10-29 2012-12-06 Clayton Richard Morlock Universal registry system and method of use and creation thereof
US20110111385A1 (en) * 2009-11-06 2011-05-12 Honeywell International Inc. Automated training system and method based on performance evaluation
US8812538B2 (en) * 2010-01-29 2014-08-19 Wendy Muzatko Story generation methods, story generation apparatuses, and articles of manufacture
US9589253B2 (en) * 2010-06-15 2017-03-07 Microsoft Technology Licensing, Llc Workflow authoring environment and runtime
EP2729058B1 (en) 2011-07-05 2019-03-13 Saudi Arabian Oil Company Floor mat system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees
US10108783B2 (en) 2011-07-05 2018-10-23 Saudi Arabian Oil Company Systems, computer medium and computer-implemented methods for monitoring health of employees using mobile devices
US9833142B2 (en) 2011-07-05 2017-12-05 Saudi Arabian Oil Company Systems, computer medium and computer-implemented methods for coaching employees based upon monitored health conditions using an avatar
US9844344B2 (en) 2011-07-05 2017-12-19 Saudi Arabian Oil Company Systems and method to monitor health of employee when positioned in association with a workstation
US9710788B2 (en) 2011-07-05 2017-07-18 Saudi Arabian Oil Company Computer mouse system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees
US9492120B2 (en) 2011-07-05 2016-11-15 Saudi Arabian Oil Company Workstation for monitoring and improving health and productivity of employees
US10307104B2 (en) 2011-07-05 2019-06-04 Saudi Arabian Oil Company Chair pad system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees
US10339550B2 (en) * 2012-12-11 2019-07-02 Quest 2 Excel, Inc. Gamified project management system and method
US20140162220A1 (en) * 2012-12-11 2014-06-12 Quest 2 Excel, Inc. System, method and computer program product for gamification of business processes
US9722472B2 (en) 2013-12-11 2017-08-01 Saudi Arabian Oil Company Systems, computer medium and computer-implemented methods for harvesting human energy in the workplace
RU2562096C1 (en) * 2014-06-25 2015-09-10 Федеральное государственное казённое военное образовательное учреждение высшего профессионального образования "Военная академия воздушно-космической обороны им. Маршала Советского Союза Г.К. Жукова" Министерства обороны Российской Федерации Training command post of main rocket attack warning centre
US11250630B2 (en) 2014-11-18 2022-02-15 Hallmark Cards, Incorporated Immersive story creation
US9402054B2 (en) * 2014-12-08 2016-07-26 Blue Jeans Network Provision of video conference services
US10067775B2 (en) * 2015-02-19 2018-09-04 Disney Enterprises, Inc. Guided authoring of interactive content
US9889311B2 (en) 2015-12-04 2018-02-13 Saudi Arabian Oil Company Systems, protective casings for smartphones, and associated methods to enhance use of an automated external defibrillator (AED) device
US10642955B2 (en) 2015-12-04 2020-05-05 Saudi Arabian Oil Company Devices, methods, and computer medium to provide real time 3D visualization bio-feedback
US10475351B2 (en) 2015-12-04 2019-11-12 Saudi Arabian Oil Company Systems, computer medium and methods for management training systems
US10628770B2 (en) 2015-12-14 2020-04-21 Saudi Arabian Oil Company Systems and methods for acquiring and employing resiliency data for leadership development
US10783535B2 (en) * 2016-05-16 2020-09-22 Cerebri AI Inc. Business artificial intelligence management engine
US10607595B2 (en) * 2017-08-07 2020-03-31 Lenovo (Singapore) Pte. Ltd. Generating audio rendering from textual content based on character models
US10824132B2 (en) 2017-12-07 2020-11-03 Saudi Arabian Oil Company Intelligent personal protective equipment
GB2584380A (en) 2018-11-22 2020-12-09 Thales Holdings Uk Plc Methods for generating a simulated enviroment in which the behaviour of one or more individuals is modelled
US11500999B2 (en) 2019-12-20 2022-11-15 International Business Machines Corporation Testing simulation sequence using industry specific parameters

Citations (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4591248A (en) * 1982-04-23 1986-05-27 Freeman Michael J Dynamic audience responsive movie system
US5310349A (en) * 1992-04-30 1994-05-10 Jostens Learning Corporation Instructional management system
US5441415A (en) * 1992-02-11 1995-08-15 John R. Lee Interactive computer aided natural learning method and apparatus
US5465384A (en) * 1992-11-25 1995-11-07 Actifilm, Inc. Automatic polling and display interactive entertainment system
US5544305A (en) * 1994-01-25 1996-08-06 Apple Computer, Inc. System and method for creating and executing interactive interpersonal computer simulations
US5676551A (en) * 1995-09-27 1997-10-14 All Of The Above Inc. Method and apparatus for emotional modulation of a Human personality within the context of an interpersonal relationship
US5701400A (en) * 1995-03-08 1997-12-23 Amado; Carlos Armando Method and apparatus for applying if-then-else rules to data sets in a relational data base and generating from the results of application of said rules a database of diagnostics linked to said data sets to aid executive analysis of financial data
US5721845A (en) * 1993-02-18 1998-02-24 Apple Computer, Inc. Topically organized interface with realistic dialogue
US5727950A (en) * 1996-05-22 1998-03-17 Netsage Corporation Agent based instruction system and method
US5737527A (en) * 1995-08-31 1998-04-07 U.S. Philips Corporation Interactive entertainment apparatus
US5805784A (en) * 1994-09-28 1998-09-08 Crawford; Christopher C. Computer story generation system and method using network of re-usable substories
US5813863A (en) * 1996-05-01 1998-09-29 Sloane; Sharon R. Interactive behavior modification system
US5827070A (en) * 1992-10-09 1998-10-27 Educational Testing Service System and methods for computer based testing
US5918217A (en) * 1997-12-10 1999-06-29 Financial Engines, Inc. User interface for a financial advisory system
US5963953A (en) * 1998-03-30 1999-10-05 Siebel Systems, Inc. Method, and system for product configuration
US5987443A (en) * 1998-12-22 1999-11-16 Ac Properties B. V. System, method and article of manufacture for a goal based educational system
US5999182A (en) * 1998-05-11 1999-12-07 The Board Of Trustees Of The Leland Stanford Junior University Computational architecture for reasoning involving extensible graphical representations
US6029156A (en) * 1998-12-22 2000-02-22 Ac Properties B.V. Goal based tutoring system with behavior to tailor to characteristics of a particular user
US6032141A (en) * 1998-12-22 2000-02-29 Ac Properties B.V. System, method and article of manufacture for a goal based educational system with support for dynamic tailored feedback
US6049332A (en) * 1996-10-07 2000-04-11 Sony Corporation Method and apparatus for the scheduling and ordering of elements in a multimedia environment
US6125358A (en) * 1998-12-22 2000-09-26 Ac Properties B.V. System, method and article of manufacture for a simulation system for goal based education of a plurality of students
US6134539A (en) * 1998-12-22 2000-10-17 Ac Properties B.V. System, method and article of manufacture for a goal based education and reporting system
US6171109B1 (en) * 1997-06-18 2001-01-09 Adin Research, Inc. Method for generating a multi-strata model and an intellectual information processing device
US6296487B1 (en) * 1999-06-14 2001-10-02 Ernest L. Lotecka Method and system for facilitating communicating and behavior skills training
US6308187B1 (en) * 1998-02-09 2001-10-23 International Business Machines Corporation Computer system and method for abstracting and accessing a chronologically-arranged collection of information
US6324678B1 (en) * 1990-04-06 2001-11-27 Lsi Logic Corporation Method and system for creating and validating low level description of electronic design
US6421667B1 (en) * 1996-06-11 2002-07-16 Edgar F. Codd Delta model processing logic representation and execution system
US6427063B1 (en) * 1997-05-22 2002-07-30 Finali Corporation Agent based instruction system and method
US6449603B1 (en) * 1996-05-23 2002-09-10 The United States Of America As Represented By The Secretary Of The Department Of Health And Human Services System and method for combining multiple learning agents to produce a prediction method
US20020146667A1 (en) * 2001-02-14 2002-10-10 Safe Drive Technologies, Llc Staged-learning process and system for situational awareness training using integrated media
US6470482B1 (en) * 1990-04-06 2002-10-22 Lsi Logic Corporation Method and system for creating, deriving and validating structural description of electronic system from higher level, behavior-oriented description, including interactive schematic design and simulation
US6527641B1 (en) * 1999-09-24 2003-03-04 Nokia Corporation System for profiling mobile station activity in a predictive command wireless game system
US6544040B1 (en) * 2000-06-27 2003-04-08 Cynthia P. Brelis Method, apparatus and article for presenting a narrative, including user selectable levels of detail
US6561811B2 (en) * 1999-08-09 2003-05-13 Entertainment Science, Inc. Drug abuse prevention computer game
US6622003B1 (en) * 2000-08-14 2003-09-16 Unext.Com Llc Method for developing or providing an electronic course
US6709335B2 (en) * 2001-09-19 2004-03-23 Zoesis, Inc. Method of displaying message in an interactive computer process during the times of heightened user interest
US6736642B2 (en) * 1999-08-31 2004-05-18 Indeliq, Inc. Computer enabled training of a user to validate assumptions
US20040095378A1 (en) * 2000-06-09 2004-05-20 Michael Vigue Work/training using an electronic infrastructure
US20040103148A1 (en) * 2002-08-15 2004-05-27 Clark Aldrich Computer-based learning system
US6758756B1 (en) * 1997-12-19 2004-07-06 Konami Co., Ltd. Method of controlling video game, video game device, and medium recording video game program
US7156665B1 (en) * 1999-02-08 2007-01-02 Accenture, Llp Goal based educational system with support for dynamic tailored feedback

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5734916A (en) * 1994-06-01 1998-03-31 Screenplay Systems, Inc. Method and apparatus for identifying, predicting, and reporting object relationships
WO1996031829A1 (en) * 1995-04-06 1996-10-10 Avid Technology, Inc. Graphical multimedia authoring system
US6408263B1 (en) * 1998-07-31 2002-06-18 Gary J. Summers Management training simulation method and system
US6074213A (en) * 1998-08-17 2000-06-13 Hon; David C. Fractional process simulator with remote apparatus for multi-locational training of medical teams
US20010049087A1 (en) 2000-01-03 2001-12-06 Hale Janet B. System and method of distance education
US6705869B2 (en) * 2000-06-02 2004-03-16 Darren Schwartz Method and system for interactive communication skill training
US7136791B2 (en) 2000-10-24 2006-11-14 International Business Machines Corporation Story-based organizational assessment and effect system
US20020124048A1 (en) 2001-03-05 2002-09-05 Qin Zhou Web based interactive multimedia story authoring system and method
US6739877B2 (en) * 2001-03-06 2004-05-25 Medical Simulation Corporation Distributive processing simulation method and system for training healthcare teams
US6767213B2 (en) * 2001-03-17 2004-07-27 Management Research Institute, Inc. System and method for assessing organizational leadership potential through the use of metacognitive predictors
US20020182570A1 (en) * 2001-05-31 2002-12-05 Croteau Marguerite Claire Computer-based quality enhancement training program
US20030014400A1 (en) * 2001-06-12 2003-01-16 Advanced Research And Technology Institute System and method for case study instruction
US6709272B2 (en) * 2001-08-07 2004-03-23 Bruce K. Siddle Method for facilitating firearms training via the internet
US6803925B2 (en) 2001-09-06 2004-10-12 Microsoft Corporation Assembling verbal narration for digital display images
EP1436715A4 (en) 2001-09-10 2007-09-12 My2Centences Llc Method and system for creating a collaborative work over a digital network

Patent Citations (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4591248A (en) * 1982-04-23 1986-05-27 Freeman Michael J Dynamic audience responsive movie system
US6470482B1 (en) * 1990-04-06 2002-10-22 Lsi Logic Corporation Method and system for creating, deriving and validating structural description of electronic system from higher level, behavior-oriented description, including interactive schematic design and simulation
US6324678B1 (en) * 1990-04-06 2001-11-27 Lsi Logic Corporation Method and system for creating and validating low level description of electronic design
US5441415A (en) * 1992-02-11 1995-08-15 John R. Lee Interactive computer aided natural learning method and apparatus
US5310349A (en) * 1992-04-30 1994-05-10 Jostens Learning Corporation Instructional management system
US5827070A (en) * 1992-10-09 1998-10-27 Educational Testing Service System and methods for computer based testing
US5465384A (en) * 1992-11-25 1995-11-07 Actifilm, Inc. Automatic polling and display interactive entertainment system
US5721845A (en) * 1993-02-18 1998-02-24 Apple Computer, Inc. Topically organized interface with realistic dialogue
US5544305A (en) * 1994-01-25 1996-08-06 Apple Computer, Inc. System and method for creating and executing interactive interpersonal computer simulations
US5805784A (en) * 1994-09-28 1998-09-08 Crawford; Christopher C. Computer story generation system and method using network of re-usable substories
US5701400A (en) * 1995-03-08 1997-12-23 Amado; Carlos Armando Method and apparatus for applying if-then-else rules to data sets in a relational data base and generating from the results of application of said rules a database of diagnostics linked to said data sets to aid executive analysis of financial data
US5737527A (en) * 1995-08-31 1998-04-07 U.S. Philips Corporation Interactive entertainment apparatus
US5676551A (en) * 1995-09-27 1997-10-14 All Of The Above Inc. Method and apparatus for emotional modulation of a Human personality within the context of an interpersonal relationship
US5813863A (en) * 1996-05-01 1998-09-29 Sloane; Sharon R. Interactive behavior modification system
US5727950A (en) * 1996-05-22 1998-03-17 Netsage Corporation Agent based instruction system and method
US6449603B1 (en) * 1996-05-23 2002-09-10 The United States Of America As Represented By The Secretary Of The Department Of Health And Human Services System and method for combining multiple learning agents to produce a prediction method
US6421667B1 (en) * 1996-06-11 2002-07-16 Edgar F. Codd Delta model processing logic representation and execution system
US6049332A (en) * 1996-10-07 2000-04-11 Sony Corporation Method and apparatus for the scheduling and ordering of elements in a multimedia environment
US6427063B1 (en) * 1997-05-22 2002-07-30 Finali Corporation Agent based instruction system and method
US6171109B1 (en) * 1997-06-18 2001-01-09 Adin Research, Inc. Method for generating a multi-strata model and an intellectual information processing device
US5918217A (en) * 1997-12-10 1999-06-29 Financial Engines, Inc. User interface for a financial advisory system
US6758756B1 (en) * 1997-12-19 2004-07-06 Konami Co., Ltd. Method of controlling video game, video game device, and medium recording video game program
US6308187B1 (en) * 1998-02-09 2001-10-23 International Business Machines Corporation Computer system and method for abstracting and accessing a chronologically-arranged collection of information
US5963953A (en) * 1998-03-30 1999-10-05 Siebel Systems, Inc. Method, and system for product configuration
US5999182A (en) * 1998-05-11 1999-12-07 The Board Of Trustees Of The Leland Stanford Junior University Computational architecture for reasoning involving extensible graphical representations
US6029156A (en) * 1998-12-22 2000-02-22 Ac Properties B.V. Goal based tutoring system with behavior to tailor to characteristics of a particular user
US5987443A (en) * 1998-12-22 1999-11-16 Ac Properties B. V. System, method and article of manufacture for a goal based educational system
US6134539A (en) * 1998-12-22 2000-10-17 Ac Properties B.V. System, method and article of manufacture for a goal based education and reporting system
US6125358A (en) * 1998-12-22 2000-09-26 Ac Properties B.V. System, method and article of manufacture for a simulation system for goal based education of a plurality of students
US6032141A (en) * 1998-12-22 2000-02-29 Ac Properties B.V. System, method and article of manufacture for a goal based educational system with support for dynamic tailored feedback
US7156665B1 (en) * 1999-02-08 2007-01-02 Accenture, Llp Goal based educational system with support for dynamic tailored feedback
US6296487B1 (en) * 1999-06-14 2001-10-02 Ernest L. Lotecka Method and system for facilitating communicating and behavior skills training
US6561811B2 (en) * 1999-08-09 2003-05-13 Entertainment Science, Inc. Drug abuse prevention computer game
US6736642B2 (en) * 1999-08-31 2004-05-18 Indeliq, Inc. Computer enabled training of a user to validate assumptions
US6527641B1 (en) * 1999-09-24 2003-03-04 Nokia Corporation System for profiling mobile station activity in a predictive command wireless game system
US20040095378A1 (en) * 2000-06-09 2004-05-20 Michael Vigue Work/training using an electronic infrastructure
US6544040B1 (en) * 2000-06-27 2003-04-08 Cynthia P. Brelis Method, apparatus and article for presenting a narrative, including user selectable levels of detail
US6622003B1 (en) * 2000-08-14 2003-09-16 Unext.Com Llc Method for developing or providing an electronic course
US20020146667A1 (en) * 2001-02-14 2002-10-10 Safe Drive Technologies, Llc Staged-learning process and system for situational awareness training using integrated media
US6709335B2 (en) * 2001-09-19 2004-03-23 Zoesis, Inc. Method of displaying message in an interactive computer process during the times of heightened user interest
US20040103148A1 (en) * 2002-08-15 2004-05-27 Clark Aldrich Computer-based learning system

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030113698A1 (en) * 2001-12-14 2003-06-19 Von Der Geest Michael Method and system for developing teaching and leadership characteristics and skills
US7200545B2 (en) * 2001-12-28 2007-04-03 Testout Corporation System and method for simulating computer network devices for competency training and testing simulations
US20030125924A1 (en) * 2001-12-28 2003-07-03 Testout Corporation System and method for simulating computer network devices for competency training and testing simulations
US20030154204A1 (en) * 2002-01-14 2003-08-14 Kathy Chen-Wright System and method for a hierarchical database management system for educational training and competency testing simulations
US7523127B2 (en) 2002-01-14 2009-04-21 Testout Corporation System and method for a hierarchical database management system for educational training and competency testing simulations
WO2003098376A2 (en) * 2002-05-21 2003-11-27 Tay Kim Huat Abel Paschal Strategic business simulation
US20060064352A1 (en) * 2002-05-21 2006-03-23 Paschal Tay Kim Huat A Strategic business simulation
WO2003098376A3 (en) * 2002-05-21 2008-05-02 Tay Kim Huat Abel Paschal Strategic business simulation
US7630874B2 (en) * 2003-01-29 2009-12-08 Seaseer Research And Development Llc Data visualization methods for simulation modeling of agent behavioral expression
US20080027692A1 (en) * 2003-01-29 2008-01-31 Wylci Fables Data visualization methods for simulation modeling of agent behavioral expression
US20040180315A1 (en) * 2003-03-14 2004-09-16 Toohey Shane A. Training system & method
US20040234934A1 (en) * 2003-05-23 2004-11-25 Kevin Shin Educational and training system
US20080077870A1 (en) * 2004-01-09 2008-03-27 Suzanne Napoleon Method and apparatus for producing structured sgml/xml student compositions
US7836389B2 (en) * 2004-04-16 2010-11-16 Avid Technology, Inc. Editing system for audiovisual works and corresponding text for television news
US20070260968A1 (en) * 2004-04-16 2007-11-08 Howard Johnathon E Editing system for audiovisual works and corresponding text for television news
US20060048092A1 (en) * 2004-08-31 2006-03-02 Kirkley Eugene H Jr Object oriented mixed reality and video game authoring tool system and method
US20070020604A1 (en) * 2005-07-19 2007-01-25 Pranaya Chulet A Rich Media System and Method For Learning And Entertainment
US20100227297A1 (en) * 2005-09-20 2010-09-09 Raydon Corporation Multi-media object identification system with comparative magnification response and self-evolving scoring
US8186109B2 (en) 2005-11-21 2012-05-29 Uxb International, Inc. Re-configurable armored tactical personnel and collective training facility
US20070117503A1 (en) * 2005-11-21 2007-05-24 Warminsky Michael F Airflow ceiling ventilation system for an armored tactical personnel and collective training facility
US20090282749A1 (en) * 2005-11-21 2009-11-19 Warminsky Michael F Re-configurable armored tactical personnel and collective training facility
US20070113487A1 (en) * 2005-11-21 2007-05-24 Amec Earth & Environmental, Inc. Re-configurable armored tactical personnel and collective training facility
US8126838B2 (en) * 2006-05-05 2012-02-28 Lockheed Martin Corporation Systems and methods of developing intuitive decision-making trainers
US20080114708A1 (en) * 2006-05-05 2008-05-15 Lockheed Martin Corporation Systems and Methods of Developing Intuitive Decision-Making Trainers
US20080050711A1 (en) * 2006-08-08 2008-02-28 Doswell Jayfus T Modulating Computer System Useful for Enhancing Learning
US20080166692A1 (en) * 2007-01-08 2008-07-10 David Smith System and method of reinforcing learning
US20080261185A1 (en) * 2007-04-18 2008-10-23 Aha! Process, Incorporated Simulated teaching environment
US20120165102A1 (en) * 2009-08-14 2012-06-28 Wikistrat Ltd. Distributed multi-player strategy game
US20140065576A1 (en) * 2012-05-07 2014-03-06 Eads Deutschland Gmbh Device for Visualizing Military Operations
US9530330B2 (en) * 2012-05-07 2016-12-27 Airbus Defence and Space GmbH Device for visualizing military operations
CN105225564A (en) * 2014-06-10 2016-01-06 河南省电力勘测设计院 Three-dimensional power system communication anti-accident drill platform
CN111308907A (en) * 2019-12-20 2020-06-19 中国航空工业集团公司沈阳飞机设计研究所 Automatic simulation control method, control plug-in, and simulation system for battle-level aircraft

Also Published As

Publication number Publication date
EP1444673A1 (en) 2004-08-11
US7155158B1 (en) 2006-12-26
CA2466309A1 (en) 2003-05-22
WO2003042955A1 (en) 2003-05-22

Similar Documents

Publication Publication Date Title
US20030091970A1 (en) Method and apparatus for advanced leadership training simulation
Kenny et al. Building interactive virtual humans for training environments
Swartout et al. Toward the holodeck: Integrating graphics, sound, character and story
Mavor et al. Modeling human and organizational behavior: Application to military simulations
Hill et al. Pedagogically structured game-based training: Development of the ELECT BiLAT simulation
Sims Reusable, lifelike virtual humans for mentoring and role-playing
Rudestam et al. Handbook of online learning
Soliman et al. Intelligent pedagogical agents in immersive virtual learning environments: A review
Molenda et al. Creating
Pagano Immersive learning: Designing for authentic practice
Saleeb Closing the chasm between virtual and physical delivery for innovative learning spaces using learning analytics
Klabbers Social problem solving: Beyond method
Hill Jr et al. Guided Conversations about Leadership: Mentoring with Movies and Interactive Characters.
Osborn An agent-based architecture for generating interactive stories
Adams et al. Media and Literacy: Learning in the Information Age--Issues, Ideas, and Teaching Strategies
AU2002357651A1 (en) Method and apparatus for advanced leadership training simulation
Richards et al. Impacts of visualisation, interaction and immersion on learning using an agent-based training simulation
Graves et al. A comparison of interactive multimedia instruction designs addressing soldiers' learning needs
Macknight Changing educational paradigms
Wasfy et al. Virtual reality enhanced online learning environments as a substitute for classroom instruction
Popovici et al. Motivate them to communicate
Smith Amazing Races Spanning from Outdoor Instruction All the Way to Virtual Reality
Gossman et al. Command group training in the Objective Force
Birchfield et al. Sound and interaction for K-12 mediated education
Dell'Aquila et al. Traditional Settings and New Technologies for Role-Play Implementation

Legal Events

Date Code Title Description
AS Assignment

Owner name: ALTSIM, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:IUPPA, NICHOLAS V.;FAST, NATHANIEL A.;REEL/FRAME:012799/0585;SIGNING DATES FROM 20020215 TO 20020222

Owner name: SOUTHERN CALIFORNIA, UNIVERSITY OF, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GORDON, ANDREW S.;HILL, JR., RANDALL W.;LINDHEIM, RICHARD D.;AND OTHERS;REEL/FRAME:012799/0615;SIGNING DATES FROM 20020315 TO 20020324

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION