WO2008091711A2 - System and method for editing web-based video - Google Patents

System and method for editing web-based video

Info

Publication number
WO2008091711A2
Authority
WO
WIPO (PCT)
Prior art keywords
media
presentation
data
production
server
Prior art date
Application number
PCT/US2008/001130
Other languages
French (fr)
Other versions
WO2008091711A3 (en)
Inventor
Andrew Gavin
Scott Shumaker
Original Assignee
Flektor, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Flektor, Inc. filed Critical Flektor, Inc.
Priority claimed from US12/011,770 external-priority patent/US8286069B2/en
Publication of WO2008091711A2 publication Critical patent/WO2008091711A2/en
Publication of WO2008091711A3 publication Critical patent/WO2008091711A3/en

Classifications

    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034 Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs

Definitions

  • the present invention relates to web-based video editing systems and methods, and more particularly, to a web-based video editing system and method for web-based video incorporating temporal, non-temporal and audio data.
  • FIG. 1 is a block diagram of a web-based video editing system according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram of an embodiment of an editing system of the web-based video editing system of FIG. 1.
  • FIG. 3 is a flowchart of a method of operation of the editing system according to some embodiments of the present invention.
  • FIG. 4 is a communication device according to an embodiment of the present invention.
  • FIG. 5 is an embodiment of a screenshot of a graphical user interface of a communication device of FIG. 4.
  • the editing system includes one or more communication devices, a server 120 having a connection manager 130 and an editing system 140 operating on the server, and a network 150 over which the one or more communication devices and the server communicate.
  • the communication devices may include, but are not limited to, a personal computer 110a, a mobile telephone 110b, a PDA 110c or any other communication device configured to operate as a client computer to the server.
  • Each of the communication devices is configured with a graphical user interface (GUI) or in communication with a GUI.
  • GUI: graphical user interface
  • a user operates the communication device to display interactive web-based video on the graphical user interface.
  • the network to which the server and devices are coupled may be a wireless or a wireline network and may range in size from a local area network to a wide area network to the Internet.
  • a dedicated open socket connection exists between the connection manager and the communication devices.
  • the socket connection may allow the creation of a device network including devices/clients and the server components.
  • an HTTP-based proxy mechanism may be used in lieu of the socket connection.
  • the HTTP-based proxy mechanism may be used when a direct socket is not feasible.
  • one or more client computers are configured to transmit information to and receive information from the server.
  • each of the client computers is configured to send a query for information and the server is configured to respond to the query by sending the requested information to the client computer.
  • one or more of the client computers is configured to transmit commands to the server and the server is configured to perform functions in response to the command.
  • each of the client computers is configured with an application for displaying multimedia on the graphical user interface of the client computer.
  • the application may be Adobe Flash® or any other application capable of displaying multimedia.
  • the connection manager is configured to determine the condition of the server and perform asynchronous messaging to one or more of the client computers over the dedicated open socket connection.
  • the content of the messages is indicative of the state of the server.
  • the server is configured to receive requests from one or more of the client computers and perform functions in response to the received requests.
  • the server may perform any number of functions typically performed in the server of a web-based video editing system.
  • the server may also provide an editing system for editing video.
  • the editing system 140 is executed on the server. In other embodiments, the editing system is executed on a computer that is remote from but in communication with the server.
  • the editing system may be configured to allow a user to edit a web-based video. The user may edit a web-based video by creating a new video or by modifying an existing web-based video.
  • FIG. 2 is a block diagram of an embodiment of an editing system 140 of the web-based video editing system of FIG. 1.
  • the editing system includes a processor 212, memory 214 and computer code product including instructions stored on the memory and adapted to cause the processor, and thereby cause the editing system, to receive and process user interactive video editing requests and transmit to a communication device information indicative of the processed user interactive video editing request.
  • the memory also stores information indicative of the user interactive video editing requests.
  • the memory may be any type of read-write memory, including, but not limited to, random access memory.
  • receiving the user interactive video editing request includes receiving the identity of the user, the selected data that the user desires to include in a web-based video and the location of the data.
  • the location of the data is relative to other data that is selected. In other embodiments, the location of the data is relative to a location on a timeline for the web-based video that the communication device is configured to display to the user.
  • the web-based video produced as a result of the user interactive video editing requests is made of various virtual elements.
  • the virtual elements may be static, such as elements including, but not limited to, a photo or a video within the web-based video production. Static elements often contain additional information for styling, cropping, and otherwise modifying the content of the web-based video production.
  • Another type of virtual element is a linkage element that defines interrelationships between static elements. In many embodiments, the linkage elements are part of a linked list.
  • a first photo followed by a second photo in a web-based video would have a first linkage to connect the first and the second photos.
  • the first linkage would be destroyed, the third photo would be added to the web-based video production, a second linkage would be added between the first photo and the third photo, and a third linkage would be added between the third photo and the second photo.
  • the linkages of elements of the web-based video production may be kept separate from the elements themselves.
  • different types of linkages between static elements may be added as linkage elements. Such linkages are determined by the system designer.
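The linkage scheme above, in which static elements are kept separate from the linkage elements that order them, can be sketched as a small linked-list structure. This is an illustrative reconstruction under stated assumptions, not code from the patent; all class and method names are invented for the sketch.

```python
class StaticElement:
    """A static virtual element, e.g. a photo or a video (illustrative)."""
    def __init__(self, name):
        self.name = name

class Linkage:
    """A linkage element connecting a predecessor static element to a successor."""
    def __init__(self, prev_el, next_el):
        self.prev_el = prev_el
        self.next_el = next_el

class Production:
    """Holds linkage elements separately from the static elements themselves."""
    def __init__(self):
        self.linkages = []

    def ordered_elements(self):
        # Walk the linked list of linkages to recover element order.
        if not self.linkages:
            return []
        order = [self.linkages[0].prev_el]
        for link in self.linkages:
            order.append(link.next_el)
        return order

    def insert_between(self, new_el, prev_el, next_el):
        # As in the photo-insertion example above: destroy the old
        # linkage and add two new linkages around the new element.
        for i, link in enumerate(self.linkages):
            if link.prev_el is prev_el and link.next_el is next_el:
                self.linkages[i:i + 1] = [Linkage(prev_el, new_el),
                                          Linkage(new_el, next_el)]
                return
```

For example, inserting a third photo between a first and second photo replaces one linkage with two, without touching the static elements.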
  • processing the user interactive video editing request includes creating information indicative of a linked list.
  • the information is indicative of one or more data selected by the user and the relative ordering of the selected data.
  • transmitting to the communication device information indicative of the processed user interactive video editing request includes transmitting to the communication device the information indicative of the linked list.
  • the server also transmits to the communication device actual data selected by the user.
  • the selected data may be stored in the editing system or at a remote location coupled to the network of FIG. 1.
  • the data may be data provided by the web-based video editing system or data generated by the user.
  • the data may include video, audio, a still picture, a photo album, a poll or the like.
  • the data may also include elements composed of coding that performs a selected function, such as a transition.
  • an element may include, but is not limited to, a wipe transition effect or a fade transition effect.
  • the data may be indicative of an advertisement.
  • the data may also be temporal, non-temporal or audio data.
  • Temporal data includes, but is not limited to, video and effects.
  • Temporal data is active for a duration indicative of the time that the data is active. For example, video is active during a duration indicative of the time length of a selected video file.
  • Non-temporal data includes, but is not limited to, a chat, a poll or a photo album.
  • Non-temporal data is active for an indeterminate amount of time.
  • a system designer assigns an initial duration of time during which it is active. The initial duration of time is modified by the duration of time that actually elapses before the user takes an action that terminates the current display of the non-temporal data.
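The duration rule above, where the designer-assigned initial duration acts as a maximum that user action can cut short, can be sketched as a single function. This is a minimal illustration; the function name and signature are assumptions.

```python
def effective_duration(initial_duration, user_action_time=None):
    """Duration, in seconds, that a non-temporal datum stays active.

    The designer-assigned initial duration serves as a maximum; if the
    user acts sooner (e.g. advances a photo album or enters a chat
    comment), the elapsed time up to that action replaces it.
    """
    if user_action_time is None:
        return initial_duration
    return min(initial_duration, user_action_time)
```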
  • FIG. 3 is a flowchart of a method of operation of the editing system according to some embodiments of the present invention.
  • the editing system receives 310 one or more user interactive video editing requests from a user.
  • the one or more requests include information indicative of the identity of the user, selected data that the user desires to include in an interactive video and the location of the selected data.
  • the system creates 320 information indicative of a linked list. The information is indicative of one or more data requested by the user and the relative ordering of the data.
  • the system transmits 330 the information to the communication device.
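Steps 310-330 above can be sketched as a single server-side handler: receive the request, create information indicative of a linked list, and return the payload to transmit. All field names are assumptions for illustration, not from the patent.

```python
def handle_edit_request(request):
    """Sketch of steps 310-330 of FIG. 3 (illustrative field names)."""
    user = request["user_id"]          # 310: identity of the user
    items = request["selected_data"]   # 310: data to include, in order
    # 320: information indicative of a linked list. Each entry names
    # the selected data and its predecessor, capturing relative ordering.
    linked_list = [{"data": item, "prev": items[i - 1] if i else None}
                   for i, item in enumerate(items)]
    # 330: the payload transmitted to the communication device.
    return {"user": user, "linked_list": linked_list}
```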
  • FIG. 4 is a communication device according to an embodiment of the present invention.
  • the communication device is in communication with a GUI.
  • the device 10 may be configured with the GUI or merely in communication with a GUI such that the device may cause the GUI to display information.
  • the communication device includes a processor 412, memory 414 and a computer code product including instructions stored on the memory and adapted to cause the processor, and thereby the communication device, to receive and process information from the editing system and display on the graphical user interface (GUI) indicators of the received information.
  • the memory also stores information indicative of the user interactive video editing requests.
  • the memory may be any type of read-write memory, including, but not limited to, random access memory.
  • the received information includes the information indicative of the linked list.
  • processing the information received from the editing system includes examining 450 the current state of the interactive video and generating 460 information indicative of the examined current state of the interactive video.
  • examining 450 the current state of the interactive video includes determining 470 the current data and the relative ordering of the data within the video.
  • each data is categorized as temporal, non-temporal or audio. In other embodiments, each data is categorized in only one category.
  • Audio data is categorized in an audio category in some embodiments.
  • examining the current state of the interactive video includes determining 480 one or more changes between a prior version of the video and the version of the video indicative of the linked list received from the editing system.
  • determining one or more changes between the prior version and the version of the linked list received from the editing system includes determining whether there are changes in the data or relative ordering of the data for the interactive video.
  • determining one or more changes between the prior version and the version of the linked list is performed according to one or more of the methods described in U.S. provisional patent application entitled "Real Time Online Video Editing System and Method," (F462/58744), the entire content of which is expressly incorporated herein by reference.
  • generating information indicative of the examined current state of the interactive video includes generating 481 information indicative of a time arrangement of the data.
  • the interactive video will be displayed or previewed on the GUI according to the time arrangement of the data.
  • one or more data is placed in a time arrangement wherein the data overlaps in time with one or more other data.
  • audio data overlaps with one or more other data.
  • the one or more other data may be other audio data, temporal data or non-temporal data.
  • generating information indicative of the examined current state of the interactive video includes estimating 482 the duration for each data, generating 483 a set of data indicative of a timeline having discrete points in time during which a data may be active or inactive, assigning 484 an activation range and a clock range for each data and generating 485 a clock for each data categorized as non-temporal and for each data categorized as audio.
  • estimating 482 the duration for each data depends on the category of the data. For temporal data or audio data, the device reads information indicative of the data and determines the duration of the file containing the data.
  • the device assigns an estimated time dependent on the system design.
  • the estimated time for non-temporal data is arbitrarily assigned.
  • the estimated time for non-temporal data is updated to include the actual duration of the data.
  • the actual duration of the data is the time when the data becomes active to the time when the user advances to the next data.
  • the estimated time is updated while the video is played, such as when the user takes an action that ends a particular non-temporal data, thereby essentially turning that particular instance of non-temporal data into temporal data.
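The estimation rule of step 482 can be sketched as follows: temporal and audio durations are read from the file, while non-temporal data receives an arbitrary designer-assigned estimate that is later replaced by the actual elapsed duration. The dictionary field names and default value are assumptions.

```python
def estimate_duration(element, default_non_temporal=10.0):
    """Step 482 sketch: estimate the duration of a datum by category."""
    if element["category"] in ("temporal", "audio"):
        # Temporal/audio: the duration of the file containing the data.
        return element["file_duration"]
    # Non-temporal: the actual elapsed duration once playback has
    # measured one, otherwise the arbitrary initial estimate.
    return element.get("actual_duration", default_non_temporal)
```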
  • generating a set of data indicative of a timeline includes generating data indicative of points in time.
  • the points in time may begin at time zero or at any other time.
  • the points in time increase with the timeline.
  • All data is placed on the timeline in accordance with the order of the time arrangement of the data.
  • a first data that is before a second data in the time arrangement will be placed at an earlier timeline position than the second data.
  • the data indicative of the timeline may be used to allow a user to scrub to a point in time in the video.
  • a block of time on the timeline is reserved for each data in the video.
  • the block of time is indicative of the estimated duration of the data.
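The timeline construction described above, placing each datum's reserved block in arrangement order starting from time zero, can be sketched as a short function. The input and output shapes are assumptions for illustration.

```python
def build_timeline(elements):
    """Timeline sketch: reserve a block of time for each datum in
    arrangement order; each block's length is the estimated duration.
    Returns (start, end) ranges keyed by element name."""
    timeline, t = {}, 0.0   # points in time begin at time zero
    for name, duration in elements:
        timeline[name] = (t, t + duration)
        t += duration       # the points in time increase with the timeline
    return timeline
```

The (start, end) ranges could then be used, for example, to let a user scrub to a point in time in the video.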
  • assigning an activation range for each data includes determining the start time when any form of the data is presented and determining the end time when no form of the data is presented.
  • assigning a clock range for each data includes determining the start time when the full logic of the data is presented and determining the end time when the full logic of the data ceases to be presented.
  • a data may be displayed or previewed as a game with targets falling from the sky during its activation range, although the full logic of the game, including moving the target objects to prevent them from being blown up, occurs during the clock range of the data.
  • a fade in effect displays or previews a video before the video animation begins at the beginning of the activation range, although the full logic of the video, including animation, begins at the beginning of the clock range.
  • the clock range starts after the activation range starts and ends before the activation range ends on the timeline.
  • a first data overlaps in time with a second data when the first has an activation range that overlaps for any amount of time with the activation range of the second data.
  • generating a clock for each data categorized as non-temporal and for each data categorized as audio includes assigning the activation range and the clock range for each data.
  • the starting and ending times of the activation range and the starting and ending times of the clock range are arbitrarily assigned.
  • each data is associated with an electronic channel indicative of its category. The data in its category is ordered within the channel based on the start and stop times of the activation range. This allows data in different categories, such as video, polls, chat, and audio, to be displayed independent of each other or with only minimal dependence on each other.
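The per-category channel scheme above, with each datum placed on a channel named for its category and ordered by the start of its activation range, can be sketched as follows. The data layout is an assumption for illustration.

```python
def assign_channels(elements):
    """Group each datum onto an electronic channel indicative of its
    category (e.g. temporal, non-temporal, audio), ordered within the
    channel by activation-range start time."""
    channels = {}
    for el in elements:
        channels.setdefault(el["category"], []).append(el)
    for items in channels.values():
        items.sort(key=lambda el: el["activation"][0])
    return channels
```

Keeping each category on its own channel is what lets video, polls, chat, and audio be presented largely independently of one another.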
  • the communication device provides information for presenting the data.
  • the information provided is the data on the one or more electronic channels.
  • a master clock in the device maintains a record of time to begin the activation range of any data. Data from one or more channels is provided to be presented to the user while the record of the time is within the activation range for the one or more data.
  • An additional clock is created for each of the channels for non-temporal data and each of the channels for audio data. Each of the additional clocks maintains a record of the time for presenting the non-temporal data or the audio data during a period of time approximately equal to the duration of the activation range.
  • each of the additional clocks is a standalone clock that is not linked to the master clock.
  • data on a temporal channel may be presented concurrently with data from an audio channel.
  • data on a non-temporal channel may be presented concurrently with data from an audio channel.
  • multiple data may be presented from the non-temporal channel in successive order.
  • the data may be presented by providing a visual display of the data and/or by providing the sound of the audio data.
  • the sound of the audio data may be provided from a device configured to receive an audio file, convert the audio file into sound and transmit the sound.
  • the sound may be transmitted from a device configured to provide audible sound to a user.
  • multiple clocks for the channels for temporal, non-temporal and audio data allow a user to perform scrubbing across a video that includes temporal and non-temporal data.
  • the clock for the channel for the temporal data is the master clock.
  • the master clock stops during the activation range of non-temporal data; however, the clock associated with the non-temporal data executes during the activation range of the non-temporal data.
  • scrubbing to a point in time during which a non-temporal data is presented accurately reflects the point in time because the communication device notes the time from the master clock and adds the additional time from the clock associated with the non-temporal data.
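The clock arrangement above can be sketched as a small class: the master clock advances only through temporal content, each active non-temporal datum accrues time on its own standalone clock, and the scrub position is the sum of the two readings. This is an illustrative model; the class and its interface are assumptions.

```python
class PlaybackClocks:
    """Sketch of the master-clock / per-datum-clock scheme."""
    def __init__(self):
        self.master = 0.0        # master clock: temporal content only
        self.non_temporal = 0.0  # clock of the active non-temporal datum

    def tick(self, dt, non_temporal_active):
        if non_temporal_active:
            self.non_temporal += dt  # master clock stops meanwhile
        else:
            self.master += dt

    def scrub_position(self):
        # The device notes the master-clock time and adds the time
        # accrued on the non-temporal datum's clock.
        return self.master + self.non_temporal
```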
  • Other scrubbing may be performed according to U.S. Provisional Patent Application entitled "Video Downloading and Scrubbing System and Method" (F462/58745), the entire content of which is expressly incorporated herein by reference.
  • FIG. 5 is an embodiment of a screenshot of a graphical user interface of a communication device of FIG. 4.
  • the screenshot includes a timeline 510, an image 520 indicative of a first data on the timeline, an image 530 indicative of a second data on the timeline, an image 540 indicative of a third data on the timeline, an image 550 indicative of a fourth data on the timeline and a first indicator 560 indicative of a first type of channel, a second indicator 570 indicative of a second type of channel, and a third indicator 580 indicative of a third type of channel.
  • the screenshot also includes images 591, 592 indicative of data that a user may select to include in an interactive video.
  • the user may select the data by any number of suitable methods well known to those skilled in the art including, but not limited to, clicking on the image or dragging and dropping the image in a location in the timeline.
  • the screenshot may also include an indicator 593 of a wastebasket.
  • One or more images in a timeline may be placed in the wastebasket to remove the corresponding data from the interactive video.
  • the user may drag and drop the image in the region of the screen indicated by the wastebasket.
  • the timeline provides a visual display of the positioning of data in the time arrangement and the amount of time that has elapsed and is outstanding for the display of data.
  • the display screen 590 is configured to provide a preview of the data currently being presented.
  • the display screen is also configured to receive inputs from a user. Inputs may be entered by touch or by any other suitable method well known to those skilled in the art.
  • images 520-550 are images indicative of temporal data.
  • Images 520, 530, 540 and 550 refer to a Flektor system advertisement, a first video, a transition element and a second video, respectively.
  • the third indicator 580 is indicative of a channel for temporal data.
  • 555 is an image indicative of audio data.
  • the indicator 560 is indicative of a channel for audio data.
  • the indicator 560 is shown overlapping with the image 530 and, accordingly, audio data is presented concurrently with temporal data at the corresponding points in time on the timeline.
  • Image 556 is indicative of non-temporal data. Accordingly, indicator 570 is indicative of a channel for non-temporal data.
  • the GUI also displays one or more of: user interface buttons for previewing the interactive video, one or more data to select for incorporation in an interactive video, and a trash can user interface for removing one or more data from an interactive video.
  • the user may preview the interactive video by requesting to preview the video.
  • the user can request to preview the video by transmitting a request to the device by depressing a preview button 557 on the screen or by any number of other suitable methods for transmitting a request to the device.
  • the user may preview a video by any other number of methods.
  • the video will be presented via display in the display screen 590.

Abstract

A web-based video editing system configured to edit an interactive video is provided. The web-based video editing system includes an editing system configured to receive and process one or more user interactive video editing requests, wherein the editing system is configured to process the requests by generating information indicative of a linked list; a communication device configured to: receive the generated information; examine a current state of the interactive video; and generate information indicative of the examined current state of the video; and a graphical user interface in communication with the communication device and configured to display the edited interactive video.

Description

SYSTEM AND METHOD FOR EDITING WEB-BASED VIDEO
FIELD OF THE INVENTION
[0001] The present invention relates to web-based video editing systems and methods, and more particularly, to a web-based video editing system and method for web-based video incorporating temporal, non-temporal and audio data.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] FIG. 1 is a block diagram of a web-based video editing system according to a first embodiment of the present invention.
[0003] FIG. 2 is a block diagram of an embodiment of an editing system of the web-based video editing system of FIG. 1.
[0004] FIG. 3 is a flowchart of a method of operation of the editing system according to some embodiments of the present invention.
[0005] FIG. 4 is a communication device according to an embodiment of the present invention.
[0006] FIG. 5 is an embodiment of a screenshot of a graphical user interface of a communication device of FIG. 4.
DETAILED DESCRIPTION OF THE INVENTION
[0007] FIG. 1 is a block diagram of a web-based video editing system according to a first embodiment of the present invention. The editing system includes one or more communication devices, a server 120 having a connection manager 130 and an editing system 140 operating on the server, and a network 150 over which the one or more communication devices and the server communicate. The communication devices may include, but are not limited to, a personal computer 110a, a mobile telephone 110b, a PDA 110c or any other communication device configured to operate as a client computer to the server. Each of the communication devices is configured with a graphical user interface (GUI) or in communication with a GUI. In some embodiments, a user operates the communication device to display interactive web-based video on the graphical user interface. The network to which the server and devices are coupled may be a wireless or a wireline network and may range in size from a local area network to a wide area network to the Internet. In this embodiment, a dedicated open socket connection exists between the connection manager and the communication devices. The socket connection may allow the creation of a device network including devices/clients and the server components.
[0008] In some embodiments, an HTTP-based proxy mechanism may be used in lieu of the socket connection. The HTTP-based proxy mechanism may be used when a direct socket is not feasible.
[0009] In some embodiments of the system, one or more client computers are configured to transmit information to and receive information from the server. In some embodiments, each of the client computers is configured to send a query for information and the server is configured to respond to the query by sending the requested information to the client computer. In some embodiments, one or more of the client computers is configured to transmit commands to the server and the server is configured to perform functions in response to the command.
[0010] In some embodiments, each of the client computers is configured with an application for displaying multimedia on the graphical user interface of the client computer. The application may be Adobe Flash® or any other application capable of displaying multimedia.
[0011] The connection manager is configured to determine the condition of the server and perform asynchronous messaging to one or more of the client computers over the dedicated open socket connection. In some embodiments, the content of the messages is indicative of the state of the server.
[0012] The server is configured to receive requests from one or more of the client computers and perform functions in response to the received requests. The server may perform any number of functions typically performed in the server of a web-based video editing system. The server may also provide an editing system for editing video.
[0013] In some embodiments, the editing system 140 is executed on the server. In other embodiments, the editing system is executed on a computer that is remote from but in communication with the server. The editing system may be configured to allow a user to edit a web-based video. The user may edit a web-based video by creating a new video or by modifying an existing web-based video.
[0015] FIG. 2 is a block diagram of an embodiment of an editing system 140 of the web-based video editing system of FIG. 1. In some embodiments, the editing system includes a processor 212, memory 214 and computer code product including instructions stored on the memory and adapted to cause the processor, and thereby cause the editing system, to receive and process user interactive video editing requests and transmit to a communication device information indicative of the processed user interactive video editing request. The memory also stores information indicative of the user interactive video editing requests. The memory may be any type of read-write memory, including, but not limited to, random access memory.
[0016] In some embodiments, receiving the user interactive video editing request includes receiving the identity of the user, the selected data that the user desires to include in a web-based video and the location of the data. In some embodiments, the location of the data is relative to other data that is selected. In other embodiments, the location of the data is relative to a location on a timeline for the web-based video that the communication device is configured to display to the user.
[0017] In many embodiments, the web-based video produced as a result of the user interactive video editing requests is made of various virtual elements. The virtual elements may be static, such as elements including, but not limited to, a photo or a video within the web-based video production. Static elements often contain additional information for styling, cropping, and otherwise modifying the content of the web-based video production. Another type of virtual element is a linkage element that defines interrelationships between static elements. In many embodiments, the linkage elements are part of a linked list. By way of example, but not limitation, a first photo followed by a second photo in a web-based video would have a first linkage to connect the first and the second photos.
By way of a further example, to add a third photo between the first photo and the second photo, the first linkage would be destroyed, the third photo would be added to the web-based video production, a second linkage would be added between the first photo and the third photo, and a third linkage would be added between the third photo and the second photo. In this way the linkages of elements of the web-based video production may be kept separate from the elements themselves. In various embodiments, different types of linkages between static elements may be added as linkage elements. Such linkages are determined by the system designer. Examples of such elements include timelink elements, timeline linkage elements, span elements that span across time, span link elements, and modifier elements (i.e., transitions). In various embodiments, new types may be added to the representation to represent relationships between static elements and even other linkage elements.
[0018] In some embodiments, processing the user interactive video editing request includes creating information indicative of a linked list. The information is indicative of one or more data selected by the user and the relative ordering of the selected data.
[0019] In some embodiments, transmitting to the communication device information indicative of the processed user interactive video editing request includes transmitting to the communication device the information indicative of the linked list. In some embodiments, the server also transmits to the communication device actual data selected by the user. The selected data may be stored in the editing system or at a remote location coupled to the network of FIG. 1. The data may be data provided by the web-based video editing system or data generated by the user.
[0020] The data may include video, audio, a still picture, a photo album, a poll or the like. The data may also include elements composed of coding that performs a selected function, such as a transition. By way of example, an element may include, but is not limited to, a wipe transition effect or a fade transition effect. In some embodiments, the data may be indicative of an advertisement.
[0021] The data may also be temporal, non-temporal or audio data. Temporal data includes, but is not limited to, video and effects. Temporal data is active for a duration indicative of the time that the data is active. For example, video is active during a duration indicative of the time length of a selected video file. Non-temporal data includes, but is not limited to, a chat, a poll or a photo album. Non-temporal data is active for an indeterminate amount of time. In various embodiments, for a particular set of non-temporal data, a system designer assigns an initial duration of time during which it is active. The initial duration of time is modified by the duration of time that actually elapses before the user takes an action that terminates the current display of the non-temporal data. In this sense, the initial duration of time could be considered a maximum duration of time in most instances. For example, a duration of time for a photo album is terminated when the user selects a next photograph in the photo album for viewing. By way of another example, a duration of time for a chat is terminated when the user enters a chat comment. The duration of audio data is the time length of a selected audio file. [0022] FIG. 3 is a flowchart of a method of operation of the editing system according to some embodiments of the present invention. The editing system receives 310 one or more user interactive video editing requests from a user. In some embodiments, the one or more requests include information indicative of the identity of the user, selected data that the user desires to include in an interactive video and the location of the selected data. [0023] The system creates 320 information indicative of a linked list. The information is indicative of one or more data requested by the user and the relative ordering of the data. The system transmits 330 the information to the communication device.
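The duration semantics described above can be illustrated with a minimal Python sketch (all names are hypothetical): a temporal element's duration is fixed by its file length, while a non-temporal element's designer-assigned initial duration acts as a maximum that is revised down when the user takes a terminating action:

```python
class TemporalElement:
    """Active for the fixed time length of its media file (e.g. a video)."""
    def __init__(self, file_length):
        self.duration = file_length

class NonTemporalElement:
    """Active for an indeterminate time (e.g. a chat, poll or photo album);
    a designer-assigned initial duration serves as the maximum."""
    def __init__(self, initial_duration):
        self.duration = initial_duration

    def user_action(self, elapsed):
        # A terminating action (next photograph, chat comment) ends the
        # display, so the duration becomes the time that actually elapsed.
        self.duration = min(elapsed, self.duration)

video = TemporalElement(12.0)     # a 12-second video file
album = NonTemporalElement(30.0)  # photo album, 30-second assigned maximum
album.user_action(7.5)            # user selects the next photograph at 7.5 s
```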
[0024] FIG. 4 is a communication device according to an embodiment of the present invention.
In some embodiments, the communication device is in communication with a GUI. The device 10 may be configured with the GUI or merely in communication with a GUI such that the device may cause the GUI to display information.
[0025] In some embodiments, the communication device includes a processor 412, memory
414 and computer code product including instructions stored on the memory and adapted to cause the processor, and thereby cause the communication device system, to receive and process information from the editing system and display on the graphical user interface (GUI) indicators of the received information. The memory also stores information indicative of the user interactive video editing requests. The memory may be any type of read-write memory, including, but not limited to, random access memory.
[0026] With reference to FIG. 4A, in some embodiments, the received information includes the information indicative of the linked list. In some embodiments, processing the information received from the editing system includes examining 450 the current state of the interactive video and generating 460 information indicative of the examined current state of the interactive video.
[0027] With reference to FIGs. 4A, 4B, and 4C, in some embodiments, examining 450 the current state of the interactive video includes determining 470 the current data and the relative ordering of the data within the video. In some embodiments, each data is categorized as temporal, non-temporal or audio. In other embodiments, each data is categorized in only one category.
Audio data is categorized in an audio category in some embodiments. [0028] Returning to FIGS. 4A, 4B, and 4C, examining the current state of the interactive video includes determining 480 one or more changes between a prior version of the video and the version of the video indicative of the linked list received from the editing system. In some embodiments, determining one or more changes between the prior version and the version of the linked list received from the editing system includes determining whether there are changes in the data or relative ordering of the data for the interactive video. In other embodiments, determining one or more changes between the prior version and the version of the linked list is performed according to one or more of the methods described in U.S. provisional patent application entitled "Real Time Online Video Editing System and Method," (F462/58744), the entire content of which is expressly incorporated herein by reference.
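A minimal, purely illustrative way to determine such changes between the prior version and the received linked-list version (comparing both membership and relative ordering of the data) is sketched below; the function name is hypothetical:

```python
def determine_changes(prior, received):
    """Compare two orderings of element identifiers.

    Returns the elements added, the elements removed, and whether the
    relative ordering of the elements common to both versions changed."""
    added = [e for e in received if e not in prior]
    removed = [e for e in prior if e not in received]
    common_prior = [e for e in prior if e in received]
    common_received = [e for e in received if e in prior]
    reordered = common_prior != common_received
    return added, removed, reordered

# e.g. a poll is added and the two clips swap positions
added, removed, reordered = determine_changes(
    ["intro", "clip_a", "clip_b"],
    ["intro", "clip_b", "clip_a", "poll"])
```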
[0029] In some embodiments, such as shown in FIG. 4B, generating information indicative of the examined current state of the interactive video includes generating 481 information indicative of a time arrangement of the data. The interactive video will be displayed or previewed on the GUI according to the time arrangement of the data.
[0030] In many embodiments, one or more data is placed in a time arrangement wherein the data overlaps in time with one or more other data. In many embodiments, audio data overlaps with one or more other data. The one or more other data may be other audio data, temporal data or non- temporal data.
[0031] In other embodiments, such as shown in FIG. 4C, generating information indicative of the examined current state of the interactive video includes estimating 482 the duration for each data, generating 483 a set of data indicative of a timeline having discrete points in time during which a data may be active or inactive, assigning 484 an activation range and a clock range for each data and generating 485 a clock for each data categorized as non-temporal and for each data categorized as audio. [0032] In some embodiments, estimating 482 the duration for each data depends on the category of the data. For temporal data or audio data, the device reads information indicative of the data and determines the duration of the file containing the data. For non-temporal data, the device assigns an estimated time dependent on the system design. In some embodiments, the estimated time for non-temporal data is arbitrarily assigned. In some embodiments, the estimated time for non-temporal data is updated to include the actual duration of the data. The actual duration of the data is the time when the data becomes active to the time when the user advances to the next data. In these embodiments, the estimated time is updated while the video is played such as when the user takes action that ends a particular non-temporal data, thereby essentially turning that particular instance of non-temporal data into temporal data.
[0033] In some embodiments, generating a set of data indicative of a timeline includes generating data indicative of points in time. The points in time may begin at time zero or at any other time. The points in time increase with the timeline. All data is placed on the timeline in accordance with the order of the time arrangement of the data. A first data that is before a second data in the time arrangement will be placed at an earlier timeline position than the second data. Accordingly, the data indicative of the timeline may be used to allow a user to scrub to a point in time in the video. [0034] A block of time on the timeline is reserved for each data in the video. The block of time is indicative of the estimated duration of the data. In some embodiments, for non-temporal data, the reserved time on the timeline is updated according to the actual time that elapses before the user advances to the next data. [0035] In some embodiments, assigning an activation range for each data includes determining the start time when any form of the data is presented and determining the end times when no form of the data is presented. Assigning a clock range for each data includes determining the start time when full logic of the data is presented and determining the end times when full logic of the data ceases to be presented. By way of example, a data is displayed or previewed as a game with targets falling from the sky during its activation range, although full logic of the game, including moving objects of the targets to prevent the objects from being blown up, occurs during the clock range of the data. By way of another example, a fade in effect displays or previews a video before the video animation begins at the beginning of the activation range, although the full logic of the video, including animation, begins at the beginning of the clock range.
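The timeline-generation step above (reserving a block of time for each data, in time-arrangement order, beginning at time zero) might be sketched as follows; the function name and the sample durations are hypothetical:

```python
def build_timeline(arranged_data):
    """Reserve a block of time on the timeline for each data, in the order
    of the time arrangement; each block spans the estimated duration."""
    blocks, t = {}, 0.0  # the points in time begin at time zero here
    for name, estimated_duration in arranged_data:
        blocks[name] = (t, t + estimated_duration)
        t += estimated_duration
    return blocks

# e.g. an advertisement, a video clip, then a photo album estimated at 30 s
blocks = build_timeline([("ad", 5.0), ("clip", 12.0), ("album", 30.0)])
```

For non-temporal data such as the album, the reserved block would later be updated according to the time that actually elapses before the user advances to the next data.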
Accordingly, in one embodiment, the clock range starts after the activation range starts and ends before the activation range ends on the timeline.
[0036] In some embodiments, a first data overlaps in time with a second data when the first data has an activation range that overlaps for any amount of time with the activation range of the second data.
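That overlap condition reduces to an ordinary interval-intersection test on the activation ranges, sketched here for illustration only:

```python
def activation_overlap(a, b):
    """True when two activation ranges (start, end) share any amount of time."""
    return a[0] < b[1] and b[0] < a[1]

first = (0.0, 5.0)
second = (4.0, 9.0)  # starts one second before the first range ends
third = (5.0, 9.0)   # merely touches the first range, with no shared time
```

Here `activation_overlap(first, second)` holds, while `activation_overlap(first, third)` does not, since the latter pair shares no amount of time.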
[0037] In some embodiments, generating a clock for each data categorized as non-temporal and for each data categorized as audio includes assigning the activation range and the clock range for each data. In one embodiment, for non-temporal data, the starting and ending times of the activation range and the starting and ending times of the clock range is arbitrarily assigned. [0038] In some embodiments, each data is associated with an electronic channel indicative of its category. The data in its category is ordered within the channel based on the start and stop times of the activation range. This allows data in different categories, such as video, polls, chat, and audio, to be displayed independent of each other or with only minimal dependence on each other. [0039] The communication device provides information for presenting the data. In some embodiments, the information provided is the data on the one or more electronic channels. In some embodiments, a master clock in the device maintains a record of time to begin the activation range of any data. Data from one or more channels is provided to be presented to the user while the record of the time is within the activation range for the one or more data. An additional clock is created for each of the channels for non-temporal data and each of the channels for audio data. Each of the additional clocks maintains a record of the time for presenting the non-temporal data or the audio data during a period of time approximately equal to the duration of the activation range. In some embodiments, each of the additional clocks is a standalone clock that is not linked to the master clock.
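The channel-and-clock arrangement described above might be sketched as follows; `Channel`, `Clock` and the sample data are hypothetical, and a standalone additional clock is created only for the non-temporal and audio channels:

```python
class Clock:
    """A standalone clock, not linked to the master clock."""
    def __init__(self):
        self.time = 0.0

    def tick(self, dt):
        self.time += dt

class Channel:
    """Data of one category, ordered by activation-range start time."""
    def __init__(self, category, items, needs_clock=False):
        # items: (name, activation_start, activation_end)
        self.category = category
        self.items = sorted(items, key=lambda item: item[1])
        self.clock = Clock() if needs_clock else None  # additional clock

    def active(self, master_time):
        # Data is provided for presentation while the master clock's record
        # of time lies within the data's activation range.
        return [name for name, start, end in self.items
                if start <= master_time < end]

video_channel = Channel("temporal", [("clip_b", 5.0, 17.0), ("clip_a", 0.0, 5.0)])
audio_channel = Channel("audio", [("song", 2.0, 20.0)], needs_clock=True)
```

At a master-clock time of 6.0 seconds, `video_channel.active(6.0)` yields the second clip and `audio_channel.active(6.0)` yields the song, so temporal and audio data are presented concurrently.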
[0040] Accordingly, for example, data on a temporal channel may be presented concurrently with data from an audio channel. Additionally, data on a non-temporal channel may be presented concurrently with data from an audio channel. Further, multiple data may be presented from the non-temporal channel in successive order.
[0041] The data may be presented by providing a visual display of the data and/or by providing the sound of the audio data. The sound of the audio data may be provided from a device configured to receive an audio file, convert the audio file into sound and transmit the sound. The sound may be transmitted from a device configured to provide audible sound to a user. [0042] Additionally, multiple clocks for the channels for temporal, non-temporal and audio data allow a user to perform scrubbing across a video that includes temporal and non-temporal data. In some embodiments, the clock for the channel for the temporal data is the master clock. In this embodiment, the master clock stops during the activation range of non-temporal data; however, the clock associated with the non-temporal data executes during the activation range of the non-temporal data. In some embodiments when the master clock is used to designate a point in time to which the user would like to scrub, scrubbing to a point in time during which a non-temporal data is presented accurately reflects the point in time because the communication device notes the time from the master clock and adds the additional time from the clock associated with the non-temporal data. Other scrubbing may be performed according to U.S. Provisional Patent Application entitled "Video Downloading and Scrubbing System and Method" (F462/58745), the entire content of which is expressly incorporated herein by reference. [0043] FIG. 5 is an embodiment of a screenshot of a graphical user interface of a communication device of FIG. 4. Referring to FIG.
5, the screenshot includes a timeline 510, an image 520 indicative of a first data on the timeline, an image 530 indicative of a second data on the timeline, an image 540 indicative of a third data on the timeline, an image 550 indicative of a fourth data on the timeline and a first indicator 560 indicative of a first type of channel, a second indicator 570 indicative of a second type of channel, and a third indicator 580 indicative of a third type of channel. [0044] In some embodiments, the screenshot also includes images 591, 592 indicative of data that a user may select to include in an interactive video. The user may select the data by any number of suitable methods well known to those skilled in the art including, but not limited to, clicking on the image or dragging and dropping the image in a location in the timeline. [0045] In some embodiments, the screenshot may also include an indicator 593 of a wastebasket. One or more images in a timeline may be placed in the wastebasket to remove the corresponding data from the interactive video. The user may drag and drop the image in the region of the screen indicated by the wastebasket.
[0046] The timeline provides a visual display of the positioning of data in the time arrangement and the amount of time that has elapsed and is outstanding for the display of data. The display screen 590 is configured to provide a preview of the data currently being presented. The display screen is also configured to receive inputs from a user. Inputs may be entered by touch or by any other suitable method well known to those skilled in the art.
[0047] In this embodiment, images 520-550 are images indicative of temporal data. For example, 520, 530, 540, 550 refer to a Flektor system advertisement, a first video, a transition element and a second video, respectively. Accordingly, the third indicator 580 is indicative of a channel for temporal data. In this embodiment, 555 is an image indicative of audio data. Accordingly, the indicator 560 is indicative of a channel for audio data. Further, while the corresponding image is not shown, the indicator 560 is shown overlapping with the image 530 and accordingly, audio data is presented concurrently with temporal data during the time corresponding to these points in time on the timeline.
[0048] Image 556 is indicative of non-temporal data. Accordingly, indicator 580 is indicative of a channel for non-temporal data.
[0049] In other embodiments, the GUI also displays one or more of: user interface buttons for previewing the interactive video, one or more data to select for incorporation in an interactive video, and a trash can user interface for removing one or more data from an interactive video.
[0050] The user may preview the interactive video by requesting to preview the video. In some embodiments, the user can request to preview the video by transmitting a request to the device by depressing a preview button 557 on the screen or by any number of other suitable methods for transmitting a request to the device. In other embodiments, the user may preview a video by any other number of methods. The video will be presented via display in the display screen 590.

Claims

WHAT IS CLAIMED IS:
1. A system for editing a multimedia production comprising: a server connected to a network; a client communications device connected to the network, wherein the client communications device is in communication with the server; media storage connected to the server, the media storage including media elements that are used in a multimedia production; production storage connected to the server, the production storage including a set of production data that defines a multimedia production that includes a plurality of media channels for simultaneous presentation, each media channel including an ordered set of media elements, the set of production data including, for each of a plurality of the media channels, a linked list of presentation data items, each presentation data item indicative of one or more media elements in the media channel and presentation parameters for presenting the one or more media elements in the multimedia presentation, wherein the server is configured to: identify each of a plurality of the media elements as either a temporal media element, when a duration of presentation of the media element within the multimedia presentation is determinant from the presentation parameters for that media element, or a non-temporal media element, when a duration of presentation of the media element within the multimedia presentation is not determinant from the presentation parameters for that media element; wherein the presentation parameters for at least one non-temporal media element includes criteria for determining the duration of presentation of that media element within the multimedia presentation; and wherein the presentation parameters for at least one media element includes a time for beginning the presentation of that media element that is determined by the duration of presentation determined for a non-temporal media element from the criteria included in the presentation parameters for that non-temporal media element; wherein the server is further
configured to: receive a multimedia production editing command from the client communications device; determine whether the multimedia production editing command makes the criteria for any non-temporal media element indeterminant and when the multimedia production editing command makes the criteria for any non-temporal media element indeterminant, send a communication to the client communications device that replacement criteria for criteria made indeterminant must be supplied for the production editing command to be executed.
2. The system of claim 1 wherein the server includes the media storage.
3. The system of claim 1 wherein the server includes the production storage.
4. The system of claim 1 wherein the server includes the media storage and the production storage.
5. The system of claim 1 wherein the client communications device comprises a graphical user interface.
6. The system of claim 5 wherein the client communications device comprises at least one selected from the group consisting of: a personal computer, a laptop computer, a handheld computer, a phone, and a video player.
7. The system of claim 1 wherein the network is the Internet.
8. A method for editing a multimedia production using a server connected to a network, media storage connected to the server, the media storage including media elements that are used in a multimedia production, production storage connected to the server, the production storage including a set of production data that defines a multimedia production that includes a plurality of media channels for simultaneous presentation, each media channel including an ordered set of media elements, and a client communications device connected to the network wherein the client communications device is in communication with the server, the method comprising: for each of a plurality of the media channels in a set of production data, maintaining a linked list of presentation data items, each presentation data item indicative of one or more media elements in the media channel and presentation parameters for presenting the one or more media elements in the multimedia presentation, identifying each of a plurality of the media elements as either a temporal media element, when a duration of presentation of the media element within the multimedia presentation is determinant from the presentation parameters for that media element, or a non-temporal media element, when a duration of presentation of the media element within the multimedia presentation is not determinant from the presentation parameters for that media element; after identifying a media element as a non-temporal media element, establishing criteria for determining the duration of presentation of that non-temporal media element within the multimedia presentation; and for at least one media element, setting a time for beginning the presentation of that media element that is determined by the duration of presentation determined for a non-temporal media element from the criteria established for that non-temporal media element; receiving, by the server, a multimedia production editing command from the client communications
device; determining, by the server, whether the multimedia production editing command makes the criteria for any non-temporal media element indeterminant and when the multimedia production editing command makes the criteria for any non-temporal media element indeterminant, sending a communication to the client communications device that replacement criteria for criteria made indeterminant must be supplied for the production editing command to be executed.
9. The method of claim 8 wherein the server includes the media storage.
10. The method of claim 8 wherein the server includes the production storage.
11. The method of claim 8 wherein the server includes the media storage and the production storage.
12. The method of claim 8 wherein the client communications device comprises a graphical user interface.
13. The method of claim 12 wherein the client communications device comprises at least one selected from the group consisting of: a personal computer, a laptop computer, a handheld computer, a phone, and a video player.
14. The method of claim 8 wherein the network is the Internet.
PCT/US2008/001130 2007-01-26 2008-01-28 System and method for editing web-based video WO2008091711A2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US89754407P 2007-01-26 2007-01-26
US60/897,544 2007-01-26
US12/011,770 2008-01-28
US12/011,770 US8286069B2 (en) 2007-01-26 2008-01-28 System and method for editing web-based video

Publications (2)

Publication Number Publication Date
WO2008091711A2 true WO2008091711A2 (en) 2008-07-31
WO2008091711A3 WO2008091711A3 (en) 2008-10-02

Family

ID=39645096

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2008/001130 WO2008091711A2 (en) 2007-01-26 2008-01-28 System and method for editing web-based video

Country Status (1)

Country Link
WO (1) WO2008091711A2 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060156219A1 (en) * 2001-06-27 2006-07-13 Mci, Llc. Method and system for providing distributed editing and storage of digital media over a network



Also Published As

Publication number Publication date
WO2008091711A3 (en) 2008-10-02


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08724907

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08724907

Country of ref document: EP

Kind code of ref document: A2

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: COMMUNICATION UNDER RULE 112(1) EPC, EPO FORM 1205A DATED 01/12/09
