US10127943B1 - Systems and methods for modifying videos based on music - Google Patents


Info

Publication number
US10127943B1
Authority
US
United States
Prior art keywords
music
visual effects
video content
frequency range
pulses
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US15/447,738
Inventor
Jean Patry
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
GoPro Inc
Original Assignee
GoPro Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US15/447,738
Application filed by GoPro Inc
Assigned to GOPRO, INC. Assignment of assignors interest (see document for details). Assignors: PATRY, JEAN
Assigned to JPMORGAN CHASE BANK, N.A., as administrative agent. Security interest (see document for details). Assignors: GOPRO, INC.
Application granted
Priority to US16/188,679 (US10679670B2)
Publication of US10127943B1
Assigned to JPMORGAN CHASE BANK, N.A., as administrative agent. Security interest (see document for details). Assignors: GOPRO, INC.
Assigned to GOPRO, INC. Assignment of assignors interest (see document for details). Assignors: OULES, GUILLAUME
Assigned to JPMORGAN CHASE BANK, N.A., as administrative agent. Corrective assignment to correct the schedule to remove application 15387383 and replace with 15385383, previously recorded on reel 042665, frame 0065. Assignor(s) hereby confirms the security interest. Assignors: GOPRO, INC.
Priority to US16/884,904 (US10991396B2)
Assigned to GOPRO, INC. Release of patent security interest. Assignors: JPMORGAN CHASE BANK, N.A., as administrative agent
Priority to US17/239,590 (US11443771B2)
Assigned to FARALLON CAPITAL MANAGEMENT, L.L.C., as agent. Security interest (see document for details). Assignors: GOPRO, INC.
Assigned to WELLS FARGO BANK, NATIONAL ASSOCIATION, as agent. Security interest (see document for details). Assignors: GOPRO, INC.
Legal status: Active
Anticipated expiration

Abstract

Video information defining video content may be accessed. Music information defining a music track providing an accompaniment for video content may be accessed. The music track may have pulses and one or more music events. Individual music events may correspond to different moments within the music track. One or more music events may be individually classified into one or more categories based on intensities of one or more pulses occurring within the music event. One or more visual effects may be selected for different moments within the music track based on the categories of the music events. One or more visual effects may be applied to the video content. One or more visual effects may be applied to one or more moments within the video content aligned to one or more different moments within the music track.

Description

FIELD
This disclosure relates to modifying video content based on music that provides an accompaniment for the video content.
BACKGROUND
Video editing applications may allow users to manually edit video clips to introduce visual effects. Music tracks may accompany playback of video clips. Manually editing video clips to introduce visual effects based on accompanying music tracks may be time consuming and may discourage users from modifying video clips based on music tracks.
SUMMARY
This disclosure relates to modifying videos based on music. Video information defining video content may be accessed. Music information defining a music track may be accessed. The music track may provide an accompaniment for the video content. The music track may have pulses and one or more music events. Individual music events may correspond to different moments within the music track. One or more music events may be individually classified into one or more categories based on intensities of one or more pulses occurring within the music event. One or more visual effects may be selected for different moments within the music track based on the categories of the music events. One or more visual effects may be applied to the video content. One or more visual effects may be applied to one or more moments within the video content aligned to one or more different moments within the music track.
A system that modifies videos based on music may include one or more physical processors, and/or other components. The processor(s) may be configured by machine-readable instructions. Executing the machine-readable instructions may cause the processor(s) to facilitate modifying videos based on music. The machine-readable instructions may include one or more computer program components. The computer program components may include one or more of an access component, a visual effects selection component, a visual effects application component, and/or other computer program components. In some implementations, the computer program components may include a music track analysis component.
The access component may be configured to access the video information defining one or more video content and/or other information. The access component may access video information from one or more storage locations. The access component may be configured to access video information defining one or more video content during acquisition of the video information and/or after acquisition of the video information by one or more image sensors.
The access component may be configured to access music information defining one or more music tracks and/or other information. The access component may access music information from one or more storage locations. The access component may access particular music information based on user selection, system information, video content, and/or other information.
A music track may provide an accompaniment for the video content. The music track may have pulses. The music track may have one or more music events. Individual music events may correspond to different moments within the music track. One or more music events may be individually classified into one or more categories based on intensities of one or more pulses occurring within the music event and/or other information. In some implementations, one or more music events may be classified into one or more categories based on the intensities of the one or more pulses within a low frequency range, a middle frequency range, and a high frequency range. In some implementations, one or more categories may include a weak category, an average category, a strong category, an intense category, and/or other categories.
In some implementations, one or more consecutive pulses within the music track may be grouped based on similarity of the intensities within the low frequency range, the middle frequency range, and the high frequency range. In some implementations, one or more consecutive pulses within the music track may be grouped based on a hidden Markov model and/or other information.
The music track analysis component may be configured to analyze the music track to classify the music event(s) within a music track into one or more categories. The music track analysis component may classify the music event(s) based on the intensities of one or more pulses occurring within the music event and/or other information. The music track analysis component may classify the music event(s) based on the intensities of the one or more pulses within a low frequency range, a middle frequency range, and a high frequency range. In some implementations, the music track analysis component may classify the music event(s) into a weak category, an average category, a strong category, an intense category, and/or other categories.
The visual effects selection component may be configured to select one or more visual effects for one or more of the different moments within the music track. The visual effects selection component may select one or more visual effects based on the categories of the one or more music events corresponding to the different moments within the music track and/or other information. In some implementations, the visual effects selection component may select one or more visual effects based on grouping(s) of consecutive pulses within the music track. In some implementations, the visual effects selection component may select one or more visual effects based on a user selection. In some implementations, the visual effects selection component may select one or more visual effects randomly from a list of visual effects.
A visual effect may refer to a change in presentation of the video content on a display. A visual effect may change the presentation of the video content for a video frame, for multiple frames, for a point in time, and/or for a duration of time. In some implementations, a visual effect may include one or more changes in perceived speed at which the video content is presented during playback. In some implementations, a visual effect may include one or more visual transformations of the video content.
The visual effects application component may be configured to apply one or more visual effects to the video content. The visual effects application component may apply one or more visual effects to one or more moments within the video content aligned to one or more of the different moments that correspond to the music event(s) within the music track. The visual effects application component may apply the visual effect(s) upon a request for playback of the video content. The visual effects application component may generate one or more files describing the visual effects and/or one or more files containing video content altered based on the application of the visual effects.
These and other objects, features, and characteristics of the system and/or method disclosed herein, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. As used in the specification and in the claims, the singular form of “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates a system that modifies videos based on music.
FIG. 2 illustrates a method for modifying videos based on music.
FIG. 3A illustrates pulses within an example music track.
FIG. 3B illustrates music events within an example music track.
FIGS. 4A-4B illustrate example speed ramps.
FIG. 5A illustrates example speed peaks applied to video content.
FIG. 5B illustrates example speed changes applied to video content.
FIG. 5C illustrates example visual transformations applied to video content.
DETAILED DESCRIPTION
FIG. 1 illustrates system 10 for modifying videos based on music. System 10 may include one or more of processor 11, electronic storage 12, interface 13 (e.g., bus, wireless interface), and/or other components. Video information 22 defining video content may be accessed by processor 11. Music information 24 defining a music track may be accessed. The music track may provide an accompaniment for the video content. The music track may have pulses and one or more music events. Individual music events may correspond to different moments within the music track. One or more music events may be individually classified into one or more categories based on intensities of one or more pulses occurring within the music event. One or more visual effects may be selected for different moments within the music track based on the categories of the music events. One or more visual effects may be applied to the video content. One or more visual effects may be applied to one or more moments within the video content aligned to one or more different moments within the music track.
Electronic storage 12 may be configured to include electronic storage medium that electronically stores information. Electronic storage 12 may store software algorithms, information determined by processor 11, information received remotely, and/or other information that enables system 10 to function properly. For example, electronic storage 12 may store information relating to video information, video content, music information, music track, pulses, intensities of the pulses, music event categories, visual effects, and/or other information.
Electronic storage 12 may store video information 22 defining one or more video content. Video content may refer to media content that may be consumed as one or more videos. Video content may include one or more videos stored in one or more formats/containers, and/or other video content. A video may include a video clip captured by a video capture device, multiple video clips captured by a video capture device, and/or multiple video clips captured by separate video capture devices. A video may include multiple video clips captured at the same time and/or multiple video clips captured at different times. A video may include a video clip processed by a video application, multiple video clips processed by a video application, and/or multiple video clips processed by separate video applications.
Video content may have a progress length. A progress length may be defined in terms of time durations and/or frame numbers. For example, video content may include a video having a time duration of 60 seconds. Video content may include a video having 1800 video frames. Video content having 1800 video frames may have a play time duration of 60 seconds when viewed at 30 frames/second. Other time durations and frame numbers are contemplated.
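The relationship above between a progress length in frame numbers, the viewing frame rate, and the play time duration can be sketched as follows (a minimal illustration; the function name is hypothetical):

```python
def progress_length_seconds(frame_count: int, frames_per_second: float) -> float:
    """Convert a progress length defined in frame numbers to a play time
    duration in seconds at a given viewing frame rate."""
    return frame_count / frames_per_second

# 1800 video frames viewed at 30 frames/second play for 60 seconds.
duration = progress_length_seconds(1800, 30)
```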
Electronic storage 12 may store music information 24 defining one or more music tracks. Music tracks may refer to media content that may be consumed as one or more audios. Music tracks may include recorded music, synthesized music, and/or otherwise produced music. Music tracks may have progress lengths. Progress lengths may be defined in terms of time durations.
Referring to FIG. 1, processor 11 may be configured to provide information processing capabilities in system 10. As such, processor 11 may comprise one or more of a digital processor, an analog processor, a digital circuit designed to process information, a central processing unit, a graphics processing unit, a microcontroller, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information. Processor 11 may be configured to execute one or more machine-readable instructions 100 to facilitate modifying videos based on music. Machine-readable instructions 100 may include one or more computer program components. Machine-readable instructions 100 may include one or more of access component 102, visual effects selection component 104, visual effects application component 106, and/or other computer program components. In some implementations, machine-readable instructions 100 may include music track analysis component 108.
Access component 102 may be configured to access video information defining one or more video content and/or other information. Access component 102 may access video information from one or more storage locations. A storage location may include electronic storage 12, electronic storage of one or more image sensors (not shown in FIG. 1), and/or other locations. For example, access component 102 may access video information 22 stored in electronic storage 12. Access component 102 may be configured to access video information defining one or more video content during acquisition of the video information and/or after acquisition of the video information by one or more image sensors. For example, access component 102 may access video information defining a video while the video is being captured by one or more image sensors. Access component 102 may access video information defining a video after the video has been captured and stored in memory (e.g., electronic storage 12).
Access component 102 may be configured to access music information defining one or more music tracks and/or other information. Access component 102 may access music information from one or more storage locations. A storage location may include electronic storage 12, electronic storage of one or more computing devices (not shown in FIG. 1), and/or other locations. For example, access component 102 may access music information 24 stored in electronic storage 12.
Access component 102 may access particular music information (e.g., music information 24) based on user selection, system information, video content, and/or other information. For example, access component 102 may access particular music information defining a particular music track based on a user's selection of the music track to use as accompaniment for the video content. Access component 102 may access particular music information defining a particular music track based on a system setting (e.g., last selected music track, next music track, default music track, video summary template specifying a particular music track). Access component 102 may access particular music information defining a particular music track based on what is captured (e.g., activity, object, scene, movement, person, emotion) within the video content. Access of music information based on other parameters is contemplated.
A music track may provide an accompaniment for the video content. A music track may provide a song, music, and/or other sounds for play with the playback of one or more portions of or the entire video content. The music track may be included in a file separate from the file containing the video content or may be included in the same file as the video content. The music track may be encoded with the video content.
A music track may have pulses. A pulse may refer to the beginning of a musical note or other sounds. In some implementations, pulses may occur at a periodic interval from other pulses. Repetition of pulses at a periodic duration may be perceived as "beats" in a (repeating) series. The pulses that occur at a periodic interval may be grouped based on accents/emphasis of the pulses. For example, pulses consisting of a strong/stressed pulse, a weak/unstressed pulse, and a weak/unstressed pulse may form a pulse group. A reoccurring pulse group may form a meter of the music track. For example, FIG. 3A illustrates an example music track 300. Music track 300 may have pulses 311 that occur at a periodic interval from other pulses. Pulses 311 may include pulses 311A, 311B, 311C, 311D, 311E, 311F, 311G, 311H. In some implementations, pulses 311 may include strong/stressed pulses of pulse groups. Other regular/irregular pulses are contemplated.
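The pulse grouping described above (a strong/stressed pulse followed by weak/unstressed pulses, forming a reoccurring pulse group) can be sketched as follows. The function name and the assumption that the first pulse of each group carries the accent are hypothetical simplifications; real accent detection would analyze the audio itself:

```python
def label_pulse_groups(pulse_times, group_size=3):
    """Label each pulse as strong/stressed or weak/unstressed, assuming
    the first pulse of every reoccurring group of `group_size` pulses
    carries the accent (hypothetical simplification)."""
    return [(t, "strong" if i % group_size == 0 else "weak")
            for i, t in enumerate(pulse_times)]
```

With `group_size=3` this reproduces the strong-weak-weak pulse group of the example above.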
A music track may have one or more music events. A music event may refer to one or more occurrences of particular sound(s) in the music track. A music event may refer to occurrences of regular patterns or irregular patterns of sounds in the music track. A music event may refer to one or more changes in the sound(s) in the music track. The particular sound(s)/changes in sound(s) in the music track may be of interest to a viewer/user of the video content with the music track providing accompaniment. Music events may be determined based on/indicate occurrence of one or more of a beat, a tempo, a rhythm, an instrument, a volume, a vocal, a chorus, a frequency, a style, a start, an end, and/or other sounds occurring within the music track. For example, referring to FIG. 3B, music event(s) 312 (312A, 312B, 312C, 312D, 312E, 312F, 312G, 312H) may refer to occurrences of pulses 311 (311A, 311B, 311C, 311D, 311E, 311F, 311G, 311H) within music track 300. Music events may be determined based on/indicate occurrence of change in one or more of a beat, a tempo, a rhythm, an instrument, a volume, a vocal, a chorus, a frequency, a style, a start, an end, and/or other sounds occurring within the music track. For example, music events may be determined based on/indicate occurrence of start/end, parts (e.g., chorus, verse), transitions between parts (e.g., drops/releases/ramps), phrases (e.g., musical phrases, instrument phrases), hits (e.g., intense percussion sounds related/unrelated to the rhythm grid), bars, beats, strong beats, semi-quavers, quavers, and/or other sounds. Music events may be determined based on/indicate occurrence of timing events (e.g., beats, hits) or time-range events (e.g., parts, phrases). In some implementations, music events may be determined based on user input (e.g., manually specified music events). Other types of music events are contemplated.
Individual music events may correspond to different moments within the music track. A moment within the music track may include a point in time within the music track or a duration of time within the music track. A music event may correspond to a point in time within the music track or a duration of time within the music track. For example, referring to FIG. 3B, one or more of music events 312 may correspond to moments within music track 300 corresponding to the occurrences of one or more of pulses 311. One or more of music events 312 may correspond to moments within music track 300 corresponding to occurrences of pulse groups in which pulses 311 are strong/stressed pulses.
One or more music events may be individually classified into one or more categories based on intensities of one or more pulses occurring within the music event and/or other information. Intensities of one or more pulses may include one or more of energy and/or amplitude of the pulses. In some implementations, one or more music events may be classified into one or more categories based on user input (e.g., manual classification of music events). In some implementations, the pulses may be classified into one or more categories based on their intensities and the music events may be classified into one or more categories based on the classification of the pulses. One or more categories may be associated with different values (e.g., ranges) of intensities of the pulses and one or more music events/pulses may be classified based on the values (e.g., individual, summed total) of the intensities of the pulses. In some implementations, categories of music events/pulses may include a weak category, an average category, a strong category, and an intense category. Other categories are contemplated.
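One way to associate categories with value ranges of summed pulse intensities, as described above, is sketched below. The threshold values are hypothetical (the disclosure associates categories with ranges without fixing them), and intensities are assumed normalized to [0, 1]:

```python
def classify_music_event(pulse_intensities):
    """Classify a music event into a category from the summed intensities
    (e.g., energy, amplitude) of the pulses occurring within it.
    Thresholds are purely illustrative."""
    total = sum(pulse_intensities)
    if total < 0.25:
        return "weak"
    if total < 0.5:
        return "average"
    if total < 0.75:
        return "strong"
    return "intense"
```

The same scheme works with averaged rather than summed intensities by swapping the aggregation.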
In some implementations, one or more music events may be classified into one or more categories based on the intensities of one or more pulses within multiple frequency ranges. In some implementations, multiple frequency ranges may include a low frequency range, a middle frequency range, a high frequency range, and/or other frequency ranges. For example, a music track may be converted into a frequency space for analysis. Intensities of the pulses may be analyzed within multiple frequency ranges. For example, frequency ranges may include a low frequency range between 100-600 Hz; a middle frequency range between 1000-5000 Hz; and a high frequency range above 5000 Hz. As another example, frequency ranges may include a sub frequency range between 20-40 Hz; a low-end frequency range between 40-160 Hz; a low-mid frequency range between 160-300 Hz; a mid-end frequency range between 300-1000 Hz; a mid-high frequency range between 1000-5000 Hz; and a high-end frequency range between 5000-20000 Hz. Other frequency ranges are contemplated.
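Converting a pulse into frequency space and measuring its intensity within multiple frequency ranges might look like the following sketch, using an FFT magnitude spectrum. The function name is hypothetical, and the default bands are taken from the first example above (100-600 Hz, 1000-5000 Hz, above 5000 Hz):

```python
import numpy as np

def band_intensities(samples, sample_rate,
                     bands=((100, 600), (1000, 5000), (5000, 20000))):
    """Sum the FFT magnitude spectrum of a pulse's audio samples within
    each frequency range (low, middle, and high by default)."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return [float(spectrum[(freqs >= lo) & (freqs < hi)].sum())
            for lo, hi in bands]
```

For a pulse dominated by a 440 Hz tone, the low-range intensity would dominate the middle and high ranges.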
The intensities of the pulses within the individual frequency ranges may be used to classify the pulses/music events. The intensities of multiple pulses may be combined (e.g., summed, averaged) to determine into which category the pulses/music events will be classified. The categories may indicate the energy states of the music track/music events based on the analysis of the pulses.
In some implementations, one or more consecutive pulses within the music track may be grouped based on similarity of the intensities within multiple frequency ranges (e.g., the low frequency range, the middle frequency range, and the high frequency range). The similarity of the intensities within the frequency ranges may indicate that the consecutive pulses correspond to a same part of the music track. In some implementations, one or more consecutive pulses within the music track may be grouped based on a hidden Markov model and/or other information.
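Grouping consecutive pulses by similarity of their per-band intensities can be sketched with a simple distance threshold; this hypothetical simplification stands in for the hidden Markov model mentioned above:

```python
def group_consecutive_pulses(band_vectors, threshold=0.2):
    """Group consecutive pulses whose intensities within the low, middle,
    and high frequency ranges are similar, as a proxy for pulses that
    belong to the same part of the music track. Returns lists of pulse
    indices."""
    if not band_vectors:
        return []
    groups = [[0]]
    for i in range(1, len(band_vectors)):
        # L1 distance between the per-band intensity vectors of
        # neighboring pulses.
        distance = sum(abs(a - b)
                       for a, b in zip(band_vectors[i - 1], band_vectors[i]))
        if distance <= threshold:
            groups[-1].append(i)  # similar: extend the current group
        else:
            groups.append([i])    # dissimilar: start a new group
    return groups
```

An HMM-based variant would instead infer a hidden "part" state per pulse, which tolerates brief outliers better than a hard threshold.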
In some implementations, one or more music events may be individually classified into one or more categories based on occurrences of other sound(s) within or near the music events. One or more music events may be classified based on the structure of the music track. One or more music events may be classified based on whether the music event is within or near (e.g., at the beginning of/introduces, at the end of/terminates) an introduction section, a verse section, a pre-chorus section, a chorus section, a refrain section, a bridge section, an instrumental section, a solo section, a conclusion section, and/or other sections. One or more music events may be classified based on other characteristics (e.g., volume, tempo) of sounds that occur within the music track.
Music track analysis component 108 may be configured to analyze the music track to classify the music event(s) within a music track into one or more categories. Music track analysis component 108 may classify the music event(s) based on the intensities of one or more pulses occurring within the music event and/or other information. In some implementations, the pulses may occur at a periodic interval from other pulses. Music track analysis component 108 may analyze and quantify the intensities (e.g., energy, amplitude) of the pulses. In some implementations, music track analysis component 108 may classify the music event(s) based on the intensities of the one or more pulses within multiple frequency ranges (e.g., a low frequency range, a middle frequency range, and a high frequency range). The intensities of multiple pulses may be combined (e.g., summed, averaged) to determine into which category the pulses/music events may be classified.
In some implementations, music track analysis component 108 may classify the pulses/music event(s) into a weak category, an average category, a strong category, an intense category, and/or other categories. Other categories are contemplated. In some implementations, the categorization of the pulses/music events may be stored in electronic storage (e.g., electronic storage 12). In some implementations, the categorization of the pulses/music events may be stored with the music track or separately from the music track.
Visual effects selection component 104 may be configured to select one or more visual effects. A visual effect may refer to a change in presentation of the video content on a display. A visual effect may change the presentation of the video content for a video frame, for multiple frames, for a point in time, and/or for a duration of time.
In some implementations, a visual effect may include one or more changes in perceived speed at which the video content is presented during playback. For example, the video content may normally be played at a perceived speed of 1× (e.g., a video captured at 30 frames/second is displayed at 30 frames/second). A visual effect may change the perceived speed of the video content playback (e.g., increase perceived speed of playback from 1× to 2×; decrease perceived speed of playback from 1× to 0.3×) for one or more portions of the video content.
A visual effect that changes the perceived speed of video content playback may include one or more speed ramps. A speed ramp may change the perceived speed of video content playback for a portion of the video content and then return the perceived playback speed to the original perceived playback speed. For example, FIGS. 4A and 4B illustrate example speed ramps. In FIG. 4A, the speed ramp (speed peak 410) may, for a portion of the video content, increase the perceived playback speed above 1× speed and then return the perceived playback speed back to 1× speed. In FIG. 4B, the speed ramp (speed dip 420) may, for a portion of the video content, decrease the perceived playback speed below 1× speed and then return the perceived playback speed back to 1× speed. In some implementations, a speed ramp may change the perceived playback speed to a value different from the original perceived playback speed. Other speed ramps are contemplated.
In some implementations, a visual effect may include one or more visual transformations of the video content. A visual transformation may include one or more visual changes in how the video content is presented during playback. A visual change may be applied for a moment within the playback of the video content or for a duration within the playback of the video content. In some implementations, a visual transformation may include one or more of a visual zoom, a visual filter, a visual rotation, a visual overlay (e.g., text and/or graphics overlay), and/or a visual vibration (e.g., visual shaking).
Visual effects selection component 104 may select one or more visual effects for one or more of the different moments within the music track. The different moments within the music track may correspond to different music events. For example, referring to FIG. 3B, the different moments within music track 300 for which visual effect(s) are selected may correspond to moments corresponding to one or more of music events 312.
Visual effects selection component 104 may select one or more visual effects based on the categories of the music event(s) corresponding to the different moment(s) within the music track and/or other information. For example, referring to FIG. 3B, visual effects selection component 104 may select one or more visual effects based on categories of music events 312. Selecting visual effects based on categories of the music events (e.g., 312) may enable visual effects selection component 104 to select visual effects based on intensities of one or more pulses occurring within the music event and/or other information. For example, visual effects selection component 104 may select visual effects based on intensities of one or more pulses within multiple frequency ranges (e.g., a low frequency range, a middle frequency range, and a high frequency range).
Music events may form an ordered set: any music event may be compared with other music event(s) on an intensity scale. For example, in a given time range, higher intensity music events may be identified for provision of visual effects as disclosed herein. In some implementations, visual effects selection component 104 may distinguish between low-level patterns and high-level patterns of music events and select different types of visual effects for different levels.
In some implementations, selecting one or more visual effects based on categories of music events/intensities of pulses may include changing the amount of impact of the visual effect on the video content. For example, one or more visual effects may include changes in perceived speed at which the video content is presented during playback, and visual effects selection component 104 may determine the amount of changes in the perceived speed based on the categories of music events/intensities of pulses (e.g., larger speed peaks/dips and/or speed changes based on higher categories/intensities, smaller speed peaks/dips and/or speed changes based on lower categories/intensities). One or more visual effects may include visual transformation of the video content, and visual effects selection component 104 may determine the amount of changes effected by the visual transformation based on the categories of music events/intensities of pulses (e.g., more dramatic/greater visual changes based on higher categories/intensities, less dramatic/smaller visual changes based on lower categories/intensities).
In some implementations, visual effects selection component 104 may select one or more visual effects based on grouping(s) of consecutive pulses within the music track. For example, referring to FIG. 3B, pulses 311 may be grouped into three groups based on similarity of the intensities: pulses 311A, 311B, 311C may be grouped into a high energy group; pulses 311D, 311E, 311F may be grouped into a low energy group; and pulses 311G, 311H may be grouped into an average energy group. Visual effects selection component 104 may select one or more visual effects for pulses 311A, 311B, 311C (music events 312A, 312B, 312C) based on their grouping within the high energy group. Visual effects selection component 104 may select one or more visual effects for pulses 311D, 311E, 311F (music events 312D, 312E, 312F) based on their grouping within the low energy group. Visual effects selection component 104 may select one or more visual effects for pulses 311G, 311H (music events 312G, 312H) based on their grouping within the average energy group.
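The grouping of consecutive pulses by intensity similarity might be sketched as follows. The running-average comparison and the `tolerance` value are assumptions for illustration; the disclosure also contemplates other grouping techniques (e.g., a hidden Markov model, as in claim 18).

```python
def group_pulses(intensities, tolerance=0.15):
    """Group consecutive pulse intensities whose values stay within a
    tolerance of the current group's running average (illustrative sketch)."""
    groups = []
    for x in intensities:
        if groups and abs(x - sum(groups[-1]) / len(groups[-1])) <= tolerance:
            groups[-1].append(x)  # similar intensity: extend current group
        else:
            groups.append([x])    # dissimilar intensity: start a new group
    return groups
```

Applied to intensities resembling FIG. 3B, three high pulses, three low pulses, and two average pulses would fall into three groups, mirroring the high/low/average energy groups described above.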
For example, one or more visual effects may include a change in the perceived playback speed of the video content. The visual effects selected for the high energy group may include increasing the perceived playback speed (e.g., above 1× speed). The visual effect selected for the low energy group may include decreasing the perceived playback speed (e.g., below 1× speed). The visual effect selected for the average energy group may include restoring the perceived playback speed (e.g., to default speed 1×).
As another example, one or more visual effects may include visual transformations of the video content. The visual effects may be selected to match the changes in energy state between the groupings of pulses. For example, one or more visual transformations (e.g., mix, dissolve, crossfade, fade, wipe, rotation, shake) may be selected for transition between different energy groups (e.g., between low, average, high, intense). Other selections of visual effects based on groupings are contemplated.
In some implementations, visual effects selection component 104 may select one or more visual effects based on the location of the groupings within the music track. For example, visual effects selection component 104 may select one or more visual effects based on whether a grouping is located within/near an introduction section, a verse section, a pre-chorus section, a chorus section, a refrain section, a bridge section, an instrumental section, a solo section, a conclusion section, and/or other sections, and/or whether a grouping is located within/near the transition between different sections of the music track. Other selections of visual effects based on location of groupings are contemplated.
In some implementations, visual effects selection component 104 may select one or more visual effects based on a user selection. A user selection may include selection of visual effect(s) or selection of one or more criteria for selection of visual effect(s). For example, a user may select particular visual effects for particular moments within the music track or change the visual effects selected by visual effects selection component 104 for particular moments within the music track. A user may select a list of visual effects from which visual effects selection component 104 may make the selection and/or select when/how visual effects selection component 104 selects the visual effect(s) or when/how the visual effects are triggered.
In some implementations, visual effects selection component 104 may select one or more visual effects randomly from a list of visual effects. For example, visual effects selection component 104 may have access to a list of visual effects and may choose the visual effect(s) at random for different moments within the music track. Visual effects selection component 104 may remove already selected visual effects from the list or may reduce the priority of reselecting already selected visual effects. Such removal/reduction may enable visual effects selection component 104 to avoid selecting the same visual effect repeatedly for a music track.
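A minimal sketch of random selection that de-prioritizes already selected effects, so the same effect is not chosen repeatedly for a track. The `pick_effect` helper and its remove-then-fall-back strategy are assumptions for the example.

```python
import random

def pick_effect(effects, used, rng=random):
    """Pick a visual effect at random, preferring effects not yet used.

    effects: list of effect names; used: set of already selected effects.
    Falls back to the full list only when every effect has been used.
    """
    fresh = [e for e in effects if e not in used]
    choice = rng.choice(fresh if fresh else effects)
    used.add(choice)
    return choice
```

Calling this once per moment within the music track yields all distinct effects until the list is exhausted, after which repeats become possible again.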
In some implementations, visual effects selection component 104 may select one or more visual effects based on the duration (e.g., measured in frames/time) of the video content. For example, visual effects selection component 104 may select visual transformations rather than changes in the perceived playback speed based on the video content having a short duration. Visual effects selection component 104 may select changes of perceived playback speeds rather than speed ramps based on the video content having a long duration. In some implementations, visual effects selection component 104 may select one or more visual effects based on the duration of the music track and/or the duration of a slot within a video summary template into which the video content will be placed.
Visual effects application component 106 may be configured to apply one or more visual effects to the video content. Visual effects application component 106 may apply one or more visual effects to one or more moments within the video content aligned to one or more of the different moments that correspond to the music event(s) within the music track. For example, FIG. 5A illustrates example speed peaks 510 applied to the video content. Speed peaks 510 may be applied to individual moments within the video content aligned to different moments that correspond to music events 312A, 312B, 312C, 312D, 312E, 312F, 312G, 312H within music track 300.
FIG. 5B illustrates example speed changes 520 applied to the video content. Different perceived playback speeds within speed changes 520 may be applied to individual moments within the video content aligned to different moments that correspond to music events 312A, 312B, 312C, 312D, 312E, 312F, 312G, 312H. For example, based on music events 312A, 312B, 312C being categorized in a strong category, a perceived playback speed greater than 1× speed may be applied to moments within the video content aligned to moments corresponding to music events 312A, 312B, 312C. Based on music events 312D, 312E, 312F being categorized in a weak category, a perceived playback speed less than 1× speed may be applied to moments within the video content aligned to moments corresponding to music events 312D, 312E, 312F. Based on music events 312G, 312H being categorized in an average category, a perceived playback speed of 1× speed may be applied to moments within the video content aligned to moments corresponding to music events 312G, 312H.
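The category-to-speed mapping described for FIG. 5B can be sketched as below. The specific speed multipliers and the `speed_timeline` helper are illustrative assumptions; any speeds greater than, less than, and equal to 1× for the strong, weak, and average categories respectively would fit the description.

```python
# Assumed speed multipliers per category: >1x for strong, <1x for weak, 1x for average.
CATEGORY_SPEED = {"strong": 1.5, "weak": 0.5, "average": 1.0}

def speed_timeline(events):
    """Map categorized music-event moments to perceived playback speeds.

    events: list of (moment_seconds, category) pairs aligned to the video.
    Returns a list of (moment_seconds, speed_multiplier) pairs.
    """
    return [(t, CATEGORY_SPEED.get(cat, 1.0)) for t, cat in events]
```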
FIG. 5C illustrates example visual transformations (transition effects 530, movement effects 540) applied to the video content. For example, based on changes in energy states of pulses and/or changes in categories of music events 312 within music track 300, transition effects 530 may be applied to individual moments within the video content aligned to different moments that correspond to music events 312A, 312D, 312G. Transition effects 530 may include visual effects that emphasize transitions between energy states of a music track and/or between different portions (e.g., introduction, verse, chorus) of a music track. For example, transition effects 530 may include one or more of mix, dissolve, crossfade, fade, wipe, and/or other transition effects.
Based on categories of music events 312B, 312C, 312E, 312F, 312H within music track 300, movement effects 540 may be applied to individual moments within the video content aligned to different moments that correspond to music events 312B, 312C, 312E, 312F, 312H. Movement effects 540 may include visual effects that emphasize occurrences of particular sounds within a music track. For example, movement effects 540 may include one or more of rotation, shake, and/or other movement effects. The magnitude of the movement effects 540 may be determined based on the intensities of the pulses/categories of the music events 312.
In some implementations, visual effects application component 106 may apply the visual effect(s) to the video content responsive to a user command (e.g., command to create video edits). In some implementations, visual effects application component 106 may apply the visual effect(s) to a preview of the video content (e.g., using lower resolution and/or lower framerate). In some implementations, visual effects application component 106 may apply the visual effect(s) to the video content responsive to a request for playback of the video content and/or at other times. For example, responsive to the request for playback of the video content, visual effects application component 106 may apply the visual effects during the playback/in real time.
Application of the visual effects to the video content may/may not change the original file containing the video content. Visual effects application component 106 may generate one or more files describing the visual effects. For example, visual effects application component 106 may generate a file that identifies the visual effects and the portions of the video content to which the visual effects are to be applied. Such files may be used at a subsequent time to apply the visual effects to the video content. Visual effects application component 106 may generate one or more files containing video content altered based on the application of the visual effects. For example, visual effects application component 106 may encode the video content to include the visual effects (alter the video content). Visual effects application component 106 may encode the altered video content with the music track.
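One way such a descriptive file could look is sketched below: a small JSON document listing which effect applies at which moment, leaving the original video file unchanged. The schema (`version`, `effects`, and the per-entry keys) is invented for illustration and is not a format disclosed by this patent.

```python
import json

def write_effects_file(path, effects):
    """Write a non-destructive effects description to a JSON file.

    effects: list of dicts such as {"moment": 3.2, "effect": "speed", "value": 1.5},
    where "moment" is a time within the video aligned to a music event.
    """
    with open(path, "w") as f:
        json.dump({"version": 1, "effects": effects}, f, indent=2)
```

A player or encoder could later read this file and either apply the effects during playback in real time or encode an altered copy of the video content, matching the two application paths described above.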
The systems/methods disclosed herein may increase the visual impact of video content and provide for synchronization between audio impact and video impact. The types and locations of visual effects may be determined based on the content of the music tracks. The systems/methods disclosed herein may provide for music driven video time mapping to alter the video content based on what is happening within the music track. The visual intensity/look/feel of the video content may be matched to the music track.
In some implementations, video content may include one or more of spherical video content, virtual reality content, and/or other video content. Spherical video content may refer to a video capture of multiple views from a single location. Spherical video content may include a full spherical video capture (360 degrees of capture) or a partial spherical video capture (less than 360 degrees of capture). Spherical video content may be captured through the use of one or more cameras/image sensors to capture images/videos from a location. The captured images/videos may be stitched together to form the spherical video content.
Virtual reality content may refer to content that may be consumed via a virtual reality experience. Virtual reality content may associate different directions within the virtual reality content with different viewing directions, and a user may view a particular direction within the virtual reality content by looking in a particular direction. For example, a user may use a virtual reality headset to change the user's direction of view. The user's direction of view may correspond to a particular direction of view within the virtual reality content. For example, a forward looking direction of view for a user may correspond to a forward direction of view within the virtual reality content.
Spherical video content and/or virtual reality content may have been captured at one or more locations. For example, spherical video content and/or virtual reality content may have been captured from a stationary position (e.g., a seat in a stadium). Spherical video content and/or virtual reality content may have been captured from a moving position (e.g., a moving bike). Spherical video content and/or virtual reality content may include video capture from a path taken by the capturing device(s) in the moving position. For example, spherical video content and/or virtual reality content may include video capture from a person walking around in a music festival.
While the present disclosure may be directed to video content, one or more other implementations of the system may be configured for other types of media content. Other types of media content may include one or more of audio content (e.g., music, podcasts, audio books, and/or other audio content), multimedia presentations, images, slideshows, visual content (one or more images and/or videos), and/or other media content.
Implementations of the disclosure may be made in hardware, firmware, software, or any suitable combination thereof. Aspects of the disclosure may be implemented as instructions stored on a machine-readable medium, which may be read and executed by one or more processors. A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computing device). For example, a tangible computer readable storage medium may include read only memory, random access memory, magnetic disk storage media, optical storage media, flash memory devices, and others, and machine-readable transmission media may include forms of propagated signals, such as carrier waves, infrared signals, digital signals, and others. Firmware, software, routines, or instructions may be described herein in terms of specific exemplary aspects and implementations of the disclosure, and as performing certain actions.
Although processor 11 and electronic storage 12 are shown to be connected to interface 13 in FIG. 1, any communication medium may be used to facilitate interaction between any components of system 10. One or more components of system 10 may communicate with each other through hard-wired communication, wireless communication, or both. For example, one or more components of system 10 may communicate with each other through a network. For example, processor 11 may wirelessly communicate with electronic storage 12. By way of non-limiting example, wireless communication may include one or more of radio communication, Bluetooth communication, Wi-Fi communication, cellular communication, infrared communication, or other wireless communication. Other types of communications are contemplated by the present disclosure.
Although processor 11 is shown in FIG. 1 as a single entity, this is for illustrative purposes only. In some implementations, processor 11 may comprise a plurality of processing units. These processing units may be physically located within the same device, or processor 11 may represent processing functionality of a plurality of devices operating in coordination. Processor 11 may be configured to execute one or more components by software; hardware; firmware; some combination of software, hardware, and/or firmware; and/or other mechanisms for configuring processing capabilities on processor 11.
It should be appreciated that although computer components are illustrated in FIG. 1 as being co-located within a single processing unit, in implementations in which processor 11 comprises multiple processing units, one or more of computer program components may be located remotely from the other computer program components.
The description of the functionality provided by the different computer program components described herein is for illustrative purposes, and is not intended to be limiting, as any of computer program components may provide more or less functionality than is described. For example, one or more of computer program components 102, 104, 106, and/or 108 may be eliminated, and some or all of its functionality may be provided by other computer program components. As another example, processor 11 may be configured to execute one or more additional computer program components that may perform some or all of the functionality attributed to one or more of computer program components 102, 104, 106, and/or 108 described herein.
The electronic storage media of electronic storage 12 may be provided integrally (i.e., substantially non-removable) with one or more components of system 10 and/or removable storage that is connectable to one or more components of system 10 via, for example, a port (e.g., a USB port, a Firewire port, etc.) or a drive (e.g., a disk drive, etc.). Electronic storage 12 may include one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EPROM, EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media. Electronic storage 12 may be a separate component within system 10, or electronic storage 12 may be provided integrally with one or more other components of system 10 (e.g., processor 11). Although electronic storage 12 is shown in FIG. 1 as a single entity, this is for illustrative purposes only. In some implementations, electronic storage 12 may comprise a plurality of storage units. These storage units may be physically located within the same device, or electronic storage 12 may represent storage functionality of a plurality of devices operating in coordination.
FIG. 2 illustrates method 200 for modifying videos based on music. The operations of method 200 presented below are intended to be illustrative. In some implementations, method 200 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. In some implementations, two or more of the operations may occur substantially simultaneously.
In some implementations, method 200 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, a central processing unit, a graphics processing unit, a microcontroller, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information). The one or more processing devices may include one or more devices executing some or all of the operations of method 200 in response to instructions stored electronically on one or more electronic storage mediums. The one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 200.
Referring to FIG. 2 and method 200, at operation 201, video information defining video content may be accessed. The video information may be stored in physical storage media. In some implementations, operation 201 may be performed by a processor component the same as or similar to access component 102 (shown in FIG. 1 and described herein).
At operation 202, music information defining a music track may be accessed. The music track may provide an accompaniment for the video content. The music track may have pulses and one or more music events. Individual music events may correspond to different moments within the music track. One or more music events may be individually classified into one or more categories based on intensities of one or more pulses occurring within the music event. In some implementations, operation 202 may be performed by a processor component the same as or similar to access component 102 (shown in FIG. 1 and described herein).
At operation 203, one or more visual effects may be selected for one or more of the different moments within the music track. One or more visual effects may be selected based on the categories of the music event(s). In some implementations, operation 203 may be performed by a processor component the same as or similar to visual effects selection component 104 (shown in FIG. 1 and described herein).
At operation 204, one or more visual effects may be applied to the video content. One or more visual effects may be applied to one or more moments within the video content aligned to one or more of the different moments within the music track. In some implementations, operation 204 may be performed by a processor component the same as or similar to visual effects application component 106 (shown in FIG. 1 and described herein).
Although the system(s) and/or method(s) of this disclosure have been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred implementations, it is to be understood that such detail is solely for that purpose and that the disclosure is not limited to the disclosed implementations, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present disclosure contemplates that, to the extent possible, one or more features of any implementation can be combined with one or more features of any other implementation.

Claims (18)

What is claimed is:
1. A system for modifying videos based on music, the system comprising:
one or more physical processors configured by machine-readable instructions to:
access video information defining video content;
access music information defining a music track, the music track providing an accompaniment for the video content, the music track having pulses and one or more music events, the individual music events corresponding to different moments within the music track, wherein the one or more music events are individually classified into one or more categories based on intensities of one or more pulses occurring within the music event, wherein the one or more music events are classified into the one or more categories based on the intensities of the one or more pulses within a low frequency range, a middle frequency range, and a high frequency range;
select one or more visual effects for one or more of the different moments within the music track based on the categories of the one or more music events; and
apply the one or more visual effects to the video content, the one or more visual effects applied to one or more moments within the video content aligned to the one or more of the different moments within the music track.
2. The system of claim 1, wherein the one or more categories include a weak category, an average category, a strong category, and an intense category.
3. The system of claim 1, wherein one or more consecutive pulses are grouped based on similarity of the intensities within the low frequency range, the middle frequency range, and the high frequency range.
4. The system of claim 3, wherein the one or more visual effects are selected further based on the grouping of the consecutive pulses.
5. The system of claim 1, wherein the one or more visual effects are selected further based on a user selection.
6. The system of claim 1, wherein the one or more visual effects include one or more changes in a perceived speed at which the video content is presented during playback.
7. The system of claim 1, wherein the one or more visual effects include one or more visual transformations of the video content.
8. The system of claim 1, wherein the one or more physical processors are further configured by machine-readable instructions to analyze the music track to classify the one or more music events into the one or more categories.
9. A method for modifying videos based on music, the method comprising:
accessing video information defining video content;
accessing music information defining a music track, the music track providing an accompaniment for the video content, the music track having pulses and one or more music events, the individual music events corresponding to different moments within the music track, wherein the one or more music events are individually classified into one or more categories based on intensities of one or more pulses occurring within the music event, wherein the one or more music events are classified into the one or more categories based on the intensities of the one or more pulses within a low frequency range, a middle frequency range, and a high frequency range;
selecting one or more visual effects for one or more of the different moments within the music track based on the categories of the one or more music events; and
applying the one or more visual effects to the video content, the one or more visual effects applied to one or more moments within the video content aligned to the one or more of the different moments within the music track.
10. The method of claim 9, wherein the one or more categories include a weak category, an average category, a strong category, and an intense category.
11. The method of claim 9, wherein one or more consecutive pulses are grouped based on similarity of the intensities within the low frequency range, the middle frequency range, and the high frequency range.
12. The method of claim 11, wherein the one or more visual effects are selected further based on the grouping of the consecutive pulses.
13. The method of claim 9, wherein the one or more visual effects are selected further based on a user selection.
14. The method of claim 9, wherein the one or more visual effects include one or more changes in a perceived speed at which the video content is presented during playback.
15. The method of claim 9, wherein the one or more visual effects include one or more visual transformations of the video content.
16. The method of claim 9, further comprising analyzing the music track to classify the one or more music events into the one or more categories.
17. A system for modifying videos based on music, the system comprising:
one or more physical processors configured by machine-readable instructions to:
access video information defining video content;
access music information defining a music track, the music track providing an accompaniment for the video content, the music track having pulses and one or more music events, the individual music events corresponding to different moments within the music track, wherein the one or more music events are individually classified into a weak category, an average category, a strong category, or an intense category based on intensities within a low frequency range, a middle frequency range, and a high frequency range of one or more pulses occurring within the music event;
select one or more visual effects for one or more of the different moments within the music track based on the categories of the one or more music events; and
apply the one or more visual effects to the video content, the one or more visual effects applied to one or more moments within the video content aligned to the one or more of the different moments within the music track.
18. The system of claim 17, wherein:
one or more consecutive pulses are grouped by similarity of the intensities within the low frequency range, the middle frequency range, and the high frequency range based on a hidden Markov model; and
the one or more visual effects are selected further based on the grouping of the consecutive pulses.
US15/447,738 | 2017-03-02 | 2017-03-02 | Systems and methods for modifying videos based on music | Active | US10127943B1 (en)

Priority Applications (4)

Application Number | Publication | Priority Date | Filing Date | Title
US15/447,738 | US10127943B1 (en) | 2017-03-02 | 2017-03-02 | Systems and methods for modifying videos based on music
US16/188,679 | US10679670B2 (en) | 2017-03-02 | 2018-11-13 | Systems and methods for modifying videos based on music
US16/884,904 | US10991396B2 (en) | 2017-03-02 | 2020-05-27 | Systems and methods for modifying videos based on music
US17/239,590 | US11443771B2 (en) | 2017-03-02 | 2021-04-24 | Systems and methods for modifying videos based on music

Applications Claiming Priority (1)

Application Number | Publication | Priority Date | Filing Date | Title
US15/447,738 | US10127943B1 (en) | 2017-03-02 | 2017-03-02 | Systems and methods for modifying videos based on music

Related Child Applications (1)

Application Number | Relation | Publication | Priority Date | Filing Date | Title
US16/188,679 | Continuation | US10679670B2 (en) | 2017-03-02 | 2018-11-13 | Systems and methods for modifying videos based on music

Publications (1)

Publication Number | Publication Date
US10127943B1 (en) | 2018-11-13

Family

ID=64050952

Family Applications (4)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US15/447,738 | Active | US10127943B1 (en) | 2017-03-02 | 2017-03-02 | Systems and methods for modifying videos based on music
US16/188,679 | Active | US10679670B2 (en) | 2017-03-02 | 2018-11-13 | Systems and methods for modifying videos based on music
US16/884,904 | Active | US10991396B2 (en) | 2017-03-02 | 2020-05-27 | Systems and methods for modifying videos based on music
US17/239,590 | Active | US11443771B2 (en) | 2017-03-02 | 2021-04-24 | Systems and methods for modifying videos based on music

Family Applications After (3)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US16/188,679 | Active | US10679670B2 (en) | 2017-03-02 | 2018-11-13 | Systems and methods for modifying videos based on music
US16/884,904 | Active | US10991396B2 (en) | 2017-03-02 | 2020-05-27 | Systems and methods for modifying videos based on music
US17/239,590 | Active | US11443771B2 (en) | 2017-03-02 | 2021-04-24 | Systems and methods for modifying videos based on music

Country Status (1)

Country | Link
US (4) | US10127943B1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
US10629173B2 (en)* | 2016-03-30 | 2020-04-21 | Pioneer DJ Coporation | Musical piece development analysis device, musical piece development analysis method and musical piece development analysis program
CN113055738A (en)* | 2019-12-26 | 2021-06-29 | 北京字节跳动网络技术有限公司 | Video special effect processing method and device
WO2022005442A1 (en) | 2020-07-03 | 2022-01-06 | Назар Юрьевич ПОНОЧЕВНЫЙ | System (embodiments) for harmoniously combining video files and audio files and corresponding method
WO2023144279A1 (en)* | 2022-01-27 | 2023-08-03 | Soclip! | Dynamic visual intensity rendering
US11955142B1 (en)* | 2021-03-15 | 2024-04-09 | Gopro, Inc. | Video editing using music characteristics

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
US9558029B2 (en)* | 2015-05-17 | 2017-01-31 | Nicira, Inc. | Logical processing for containers
US10871981B2 (en) | 2015-11-01 | 2020-12-22 | Nicira, Inc. | Performing logical network functionality within data compute nodes
US10915566B2 (en) | 2019-03-01 | 2021-02-09 | Soundtrack Game LLC | System and method for automatic synchronization of video with music, and gaming applications related thereto
WO2022187057A1 (en) | 2021-03-05 | 2022-09-09 | Applied Materials, Inc. | Detecting an excursion of a cmp component using time-based sequence of images
US12376822B2 (en) | 2021-05-26 | 2025-08-05 | The Regents Of The University Of California | Three-dimensional mapping of deep tissue modulus by stretchable ultrasonic arrays
US20230009672A1 (en)* | 2021-07-09 | 2023-01-12 | Soclip! | Automatic modulation of display timing based on beat

US20080313541A1 (en)2007-06-142008-12-18Yahoo! Inc.Method and system for personalized segmentation and indexing of media
US7483618B1 (en)2003-12-042009-01-27Yesvideo, Inc.Automatic editing of a visual recording to eliminate content of unacceptably low quality and/or very little or no interest
WO2009040538A1 (en)2007-09-252009-04-02British Telecommunications Public Limited CompanyMultimedia content assembling for viral marketing purposes
US20090213270A1 (en)2008-02-222009-08-27Ryan IsmertVideo indexing and fingerprinting for video enhancement
US20090263100A1 (en)*2001-09-152009-10-22Apple Inc.Dynamic variation of output media signal in response to input media signal
US20090274339A9 (en)1998-08-102009-11-05Cohen Charles JBehavior recognition system
US20090327856A1 (en)2008-06-282009-12-31Mouilleseaux Jean-Pierre MAnnotation of movies
US20100045773A1 (en)2007-11-062010-02-25Ritchey Kurtis JPanoramic adapter system and method with spherical field-of-view coverage
US20100064219A1 (en)2008-08-062010-03-11Ron GabriskoNetwork Hosted Media Production Systems and Methods
US20100086216A1 (en)2008-10-082010-04-08Samsung Electronics Co., Ltd.Apparatus and method for ultra-high resolution video processing
US20100104261A1 (en)*2008-10-242010-04-29Zhu LiuBrief and high-interest video summary generation
US20100183280A1 (en)2008-12-102010-07-22Muvee Technologies Pte Ltd.Creating a new video production by intercutting between multiple video clips
US20100231730A1 (en)2009-03-132010-09-16Yuka IchikawaImage sensing device and camera
US20100245626A1 (en)2009-03-302010-09-30David Brian WoycechowskyDigital Camera
US20100251295A1 (en)2009-03-312010-09-30At&T Intellectual Property I, L.P.System and Method to Create a Media Content Summary Based on Viewer Annotations
US20100278504A1 (en)2009-04-302010-11-04Charles LyonsTool for Grouping Media Clips for a Media Editing Application
US20100281386A1 (en)2009-04-302010-11-04Charles LyonsMedia Editing Application with Candidate Clip Management
US20100281375A1 (en)2009-04-302010-11-04Colleen PendergastMedia Clip Auditioning Used to Evaluate Uncommitted Media Content
US20100278509A1 (en)2007-12-102010-11-04Kae NaganoElectronic Apparatus, Reproduction Method, and Program
US20100287476A1 (en)2006-03-212010-11-11Sony Corporation, A Japanese CorporationSystem and interface for mixing media content
US20100299630A1 (en)2009-05-222010-11-25Immersive Media CompanyHybrid media viewing application including a region of interest within a wide field of view
US20100318660A1 (en)2009-06-152010-12-16Qualcomm IncorporatedResource management for a wireless device
US20100321471A1 (en)2009-06-222010-12-23Casolara MarkMethod and system for performing imaging
US20110025847A1 (en)2009-07-312011-02-03Johnson Controls Technology CompanyService management using video processing
US20110069189A1 (en)2008-05-202011-03-24Pelican Imaging CorporationCapturing and processing of images using monolithic camera array with heterogeneous imagers
US20110069148A1 (en)2009-09-222011-03-24Tenebraex CorporationSystems and methods for correcting images in a multi-sensor system
US20110075990A1 (en)2009-09-252011-03-31Mark Kenneth EyerVideo Bookmarking
US20110093798A1 (en)2009-10-152011-04-21At&T Intellectual Property I, L.P.Automated Content Detection, Analysis, Visual Synthesis and Repurposing
US20110134240A1 (en)2009-12-082011-06-09Trueposition, Inc.Multi-Sensor Location and Identification
US20110173565A1 (en)2010-01-122011-07-14Microsoft CorporationViewing media in the context of street-level images
US20110206351A1 (en)2010-02-252011-08-25Tal GivoliVideo processing system and a method for editing a video asset
US20110211040A1 (en)2008-11-052011-09-01Pierre-Alain LindemannSystem and method for creating interactive panoramic walk-through applications
US20110258049A1 (en)2005-09-142011-10-20Jorey RamerIntegrated Advertising System
US20110293250A1 (en)2010-05-252011-12-01Deever Aaron TDetermining key video snippets using selection criteria
US20110320322A1 (en)2010-06-252011-12-29Symbol Technologies, Inc.Inventory monitoring using complementary modes for item identification
US20120014673A1 (en)2008-09-252012-01-19Igruuv Pty LtdVideo and audio content system
US20120027381A1 (en)2010-07-302012-02-02Kabushiki Kaisha ToshibaRecording/reading apparatus, method of generating tag list for recording/reading apparatus, and control unit for recording/reading apparatus
US20120030029A1 (en)2004-05-202012-02-02Manyworlds, Inc.System and Method for Adaptive Videos
US20120057852A1 (en)2009-05-072012-03-08Christophe DevleeschouwerSystems and methods for the autonomous production of videos from multi-sensored data
US20120123780A1 (en)2010-11-152012-05-17Futurewei Technologies, Inc.Method and system for video summarization
US20120127169A1 (en)2010-11-242012-05-24Google Inc.Guided Navigation Through Geo-Located Panoramas
US20120206565A1 (en)2011-02-102012-08-16Jason VillmerOmni-directional camera and related viewing software
US20120311448A1 (en)2011-06-032012-12-06Maha AchourSystem and methods for collaborative online multimedia production
US20130024805A1 (en)2011-07-192013-01-24Seunghee InMobile terminal and control method of mobile terminal
US20130044108A1 (en)2011-03-312013-02-21Panasonic CorporationImage rendering device, image rendering method, and image rendering program for rendering stereoscopic panoramic images
US20130058532A1 (en)2007-03-052013-03-07Sportvision, Inc.Tracking An Object With Multiple Asynchronous Cameras
US20130063561A1 (en)2011-09-142013-03-14Karel Paul StephanVirtual advertising platform
US20130078990A1 (en)2011-09-222013-03-28Mikyung KimMobile device and method for controlling reproduction of contents in mobile device
US8446433B1 (en)2009-06-122013-05-21Lucasfilm Entertainment Company Ltd.Interactive visual distortion processing
US20130127636A1 (en)2011-11-202013-05-23Cardibo, Inc.Wireless sensor network for determining cardiovascular machine usage
US20130136193A1 (en)2011-11-302013-05-30Samsung Electronics Co. Ltd.Apparatus and method of transmitting/receiving broadcast data
US20130142384A1 (en)2011-12-062013-06-06Microsoft CorporationEnhanced navigation through multi-sensor positioning
US20130151970A1 (en)2011-06-032013-06-13Maha AchourSystem and Methods for Distributed Multimedia Production
US20130166303A1 (en)2009-11-132013-06-27Adobe Systems IncorporatedAccessing media data using metadata repository
US20130191743A1 (en)2003-01-062013-07-25Glenn ReidMethod and apparatus for controlling volume
US20130197967A1 (en)2012-02-012013-08-01James Joseph Anthony PINTOCollaborative systems, devices, and processes for performing organizational projects, pilot projects and analyzing new technology adoption
US20130195429A1 (en)2012-01-312013-08-01Todor FaySystems and methods for media personalization using templates
US20130208134A1 (en)2012-02-142013-08-15Nokia CorporationImage Stabilization
US20130208942A1 (en)2010-09-302013-08-15British Telecommunications Public Limited CompanyDigital video fingerprinting
US20130215220A1 (en)2012-02-212013-08-22Sen WangForming a stereoscopic video
US20130259399A1 (en)2012-03-302013-10-03Cheng-Yuan HoVideo recommendation system and method thereof
US20130263002A1 (en)2012-03-302013-10-03Lg Electronics Inc.Mobile terminal
US20130283301A1 (en)2012-04-182013-10-24Scorpcast, LlcSystem and methods for providing user generated video reviews
US20130287304A1 (en)2012-04-262013-10-31Sony CorporationImage processing device, image processing method, and program
US20130287214A1 (en)2010-12-302013-10-31Dolby International AbScene Change Detection Around a Set of Seed Points in Media Data
US20130300939A1 (en)2012-05-112013-11-14Cisco Technology, Inc.System and method for joint speaker and scene recognition in a video/audio processing environment
US20130308921A1 (en)2012-05-212013-11-21Yahoo! Inc.Creating video synopsis for use in playback
US20130318443A1 (en)2010-08-242013-11-28Apple Inc.Visual presentation composition
US8611422B1 (en)2007-06-192013-12-17Google Inc.Endpoint based video fingerprinting
US20130343727A1 (en)2010-03-082013-12-26Alex Rav-AchaSystem and method for semi-automatic video editing
US20140026156A1 (en)2012-07-182014-01-23David DeephanphongsDetermining User Interest Through Detected Physical Indicia
US20140064706A1 (en)2012-09-052014-03-06Verizon Patent And Licensing Inc.Tagging video content
US20140072285A1 (en)2012-09-102014-03-13Google Inc.Media Summarization
US20140096002A1 (en)2012-09-282014-04-03Frameblast LimitedVideo clip editing system
US20140093164A1 (en)2012-10-012014-04-03Microsoft CorporationVideo scene detection
US20140105573A1 (en)2012-10-122014-04-17Nederlandse Organisatie Voor Toegepast-Natuurwetenschappelijk Onderzoek TnoVideo access system and method based on action type detection
US8718447B2 (en)2005-10-172014-05-06Samsung Electronics Co., Ltd.Method and apparatus for providing multimedia data using event index
US8730299B1 (en)2013-11-272014-05-20Dmitry KozkoSurround image mode for multi-lens mobile devices
US20140165119A1 (en)2012-04-242014-06-12Tencent Technology (Shenzhen) Company LimitedOffline download method, multimedia file download method and system thereof
US20140161351A1 (en)2006-04-122014-06-12Google Inc.Method and apparatus for automatically summarizing video
US20140169766A1 (en)2012-12-182014-06-19Realtek Semiconductor Corp.Method and computer program product for establishing playback timing correlation between different contents to be playbacked
US8763023B1 (en)2013-03-082014-06-24Amazon Technologies, Inc.Determining importance of scenes based upon closed captioning data
US20140176542A1 (en)2012-12-262014-06-26Makoto ShoharaImage-processing system, image-processing method and program
US20140193040A1 (en)2013-01-092014-07-10Omiimii Ltd.Method and apparatus for determining location
US20140212107A1 (en)2013-01-302014-07-31Felipe Saint-JeanSystems and Methods for Session Recording and Sharing
US20140219634A1 (en)2013-02-052014-08-07Redux, Inc.Video preview creation based on environment
US20140226953A1 (en)2013-02-142014-08-14Rply, Inc.Facilitating user input during playback of content
US20140232819A1 (en)2013-02-192014-08-21Tourwrist, Inc.Systems and methods for generating and sharing panoramic moments
US20140232818A1 (en)2013-02-192014-08-21Disney Enterprises, Inc.Method and device for spherical resampling for video generation
US20140245336A1 (en)2013-02-282014-08-28Verizon and Redbox Digital Entertainment Services, LLCFavorite media program scenes systems and methods
US20140300644A1 (en)2013-04-042014-10-09Sony CorporationMethod and apparatus for generating an image cut-out
US20140328570A1 (en)2013-01-092014-11-06Sri InternationalIdentifying, describing, and sharing salient events in images and videos
US20140341528A1 (en)2013-05-152014-11-20Abb Research Ltd.Recording and providing for display images of events associated with power equipment
US8910046B2 (en)2010-07-152014-12-09Apple Inc.Media-editing application with anchored timeline
US20140366052A1 (en)2013-06-052014-12-11David J. IvesSystem for Social Media Tag Extraction
US20140376876A1 (en)2010-08-262014-12-25Blast Motion, Inc.Motion event recognition and video synchronization system and method
US20150015680A1 (en)2013-07-102015-01-15Htc CorporationMethod and electronic device for generating multiple point of view video
US20150022355A1 (en)2013-07-172015-01-22Honeywell International Inc.Surveillance systems and methods
US20150029089A1 (en)2013-07-252015-01-29Samsung Electronics Co., Ltd.Display apparatus and method for providing personalized service thereof
US20150058709A1 (en)2012-01-262015-02-26Michael Edward ZaletelMethod of creating a media composition and apparatus therefore
US8988509B1 (en)2014-03-202015-03-24Gopro, Inc.Auto-alignment of image sensors in a multi-camera system
US20150085111A1 (en)2013-09-252015-03-26Symbol Technologies, Inc.Identification using video analytics together with inertial sensor data
US9036001B2 (en)2010-12-162015-05-19Massachusetts Institute Of TechnologyImaging system for immersive surveillance
US20150154452A1 (en)2010-08-262015-06-04Blast Motion Inc.Video and motion event integration system
US20150178915A1 (en)2013-12-192015-06-25Microsoft CorporationTagging Images With Emotional State Information
US20150186073A1 (en)2013-12-302015-07-02Lyve Minds, Inc.Integration of a device with a storage network
US9077956B1 (en)2013-03-222015-07-07Amazon Technologies, Inc.Scene identification
US20150220504A1 (en)2014-02-042015-08-06Adobe Systems IncorporatedVisual Annotations for Objects
US9111579B2 (en)2011-11-142015-08-18Apple Inc.Media editing with multi-camera media clips
US20150254871A1 (en)2014-03-042015-09-10Gopro, Inc.Automatic generation of video from spherical content using location-based metadata
US9142253B2 (en)2006-12-222015-09-22Apple Inc.Associating keywords to media
US20150271483A1 (en)2014-03-202015-09-24Gopro, Inc.Target-Less Auto-Alignment Of Image Sensors In A Multi-Camera System
US9151933B2 (en)2009-12-252015-10-06Sony CorporationImage-capturing apparatus, control method for image-capturing apparatus, and program
US20150287435A1 (en)2014-04-042015-10-08Red.Com, Inc.Video camera with capture modes
US20150294141A1 (en)2008-12-052015-10-15Nike, Inc.Athletic Performance Monitoring Systems and Methods in a Team Sports Environment
US20150318020A1 (en)2014-05-022015-11-05FreshTake Media, Inc.Interactive real-time video editor and recorder
US20150339324A1 (en)2014-05-202015-11-26Road Warriors International, Inc.System and Method for Imagery Warehousing and Collaborative Search Processing
US9204039B2 (en)2013-01-072015-12-01Huawei Technologies Co., Ltd.Image processing method and apparatus
US9208821B2 (en)2007-08-062015-12-08Apple Inc.Method and system to process digital audio data
US20150375117A1 (en)2013-05-222015-12-31David S. ThompsonFantasy sports integration with video content
US20150382083A1 (en)2013-03-062015-12-31Thomson LicensingPictorial summary for video
US20160005440A1 (en)2013-03-052016-01-07British Telecommunications Public Limited CompanyProvision of video data
US20160005435A1 (en)2014-07-032016-01-07Gopro, Inc.Automatic generation of video and directional audio from spherical content
US9245582B2 (en)2011-03-292016-01-26Capshore, LlcUser interface for method for creating a custom track
US20160027475A1 (en)2014-07-232016-01-28Gopro, Inc.Video scene classification by activity
US9253533B1 (en)2013-03-222016-02-02Amazon Technologies, Inc.Scene identification
US20160055885A1 (en)2014-07-232016-02-25Gopro, Inc.Voice-Based Video Tagging
US20160088287A1 (en)2014-09-222016-03-24Samsung Electronics Company, Ltd.Image stitching for three-dimensional video
US20160098941A1 (en)2013-05-212016-04-07Double Blue Sports Analytics, Inc.Methods and apparatus for goaltending applications including collecting performance metrics, video and sensor analysis
US9317172B2 (en)2009-04-302016-04-19Apple Inc.Tool for navigating a composite presentation
US20160119551A1 (en)2014-10-222016-04-28Sentry360Optimized 360 Degree De-Warping with Virtual Cameras
US20160217325A1 (en)2010-08-262016-07-28Blast Motion Inc.Multi-sensor event analysis and tagging system
US20160225410A1 (en)2015-02-032016-08-04Garmin Switzerland GmbhAction camera content management system
US20160225405A1 (en)2015-01-292016-08-04Gopro, Inc.Variable playback speed template for video editing application
US20160234345A1 (en)2015-02-052016-08-11Qwire Holdings LlcMedia player distribution and collaborative editing
US9423944B2 (en)2011-09-062016-08-23Apple Inc.Optimized volume adjustment
US9473758B1 (en)2015-12-062016-10-18Sliver VR Technologies, Inc.Methods and systems for game video recording and virtual reality replay
US9479697B2 (en)2012-10-232016-10-25Bounce Imaging, Inc.Systems, methods and media for generating a panoramic view
US20160358603A1 (en)2014-01-312016-12-08Hewlett-Packard Development Company, L.P.Voice input command
US20160366330A1 (en)2015-06-112016-12-15Martin Paul BoliekApparatus for processing captured video data based on capture device orientation
US20160364963A1 (en)*2015-06-122016-12-15Google Inc.Method and System for Detecting an Audio Event for Smart Home Devices
US20170006214A1 (en)2015-06-302017-01-05International Business Machines CorporationCognitive recording and sharing
US9564173B2 (en)2009-04-302017-02-07Apple Inc.Media editing application for auditioning different types of media clips

Family Cites Families (20)

Publication number | Priority date | Publication date | Assignee | Title
CN1251487C (en)*2001-06-222006-04-12Nokia CorporationAutomatic indexing of digital video recordings
US20050182503A1 (en)*2004-02-122005-08-18Yu-Ru LinSystem and method for the automatic and semi-automatic media editing
CA2590234A1 (en)*2004-12-132006-06-22Muvee Technologies Pte LtdA method of automatically editing media recordings
WO2007073347A1 (en)*2005-12-192007-06-28Agency For Science, Technology And ResearchAnnotation of video footage and personalised video generation
US7716572B2 (en)*2006-07-142010-05-11Muvee Technologies Pte Ltd.Creating a new music video by intercutting user-supplied visual data with a pre-existing music video
US8699858B2 (en)*2008-08-292014-04-15Adobe Systems IncorporatedCombined visual and auditory processing
CN102870109B (en)*2010-03-262016-03-02Fujitsu Ltd.Category generating device and category generating method
US9031384B2 (en)*2011-06-022015-05-12Panasonic Intellectual Property Corporation Of AmericaRegion of interest identification device, region of interest identification method, region of interest identification program, and region of interest identification integrated circuit
US9588968B2 (en)*2012-04-252017-03-07Nokia Technologies OyMethod and apparatus for acquiring event information on demand
US9083997B2 (en)*2012-05-092015-07-14YooToo Technologies, LLCRecording and publishing content on social media websites
US8995823B2 (en)*2012-07-172015-03-31HighlightCam, Inc.Method and system for content relevance score determination
US9318116B2 (en)*2012-12-142016-04-19Disney Enterprises, Inc.Acoustic data transmission based on groups of audio receivers
JP5942864B2 (en)*2013-01-182016-06-29Sony CorporationTerminal device, content transmission method, content transmission program, and content reproduction system
KR102195897B1 (en)*2013-06-052020-12-28Samsung Electronics Co., Ltd.Apparatus for detecting acoustic event, operating method thereof, and computer-readable recording medium having embodied thereon a program which when executed by a computer performs the method
US9378768B2 (en)*2013-06-102016-06-28Htc CorporationMethods and systems for media file management
US20150113408A1 (en)*2013-10-182015-04-23Apple Inc.Automatic custom sound effects for graphical elements
TWI486904B (en)*2013-12-042015-06-01Inst Information IndustryMethod for rhythm visualization, system, and computer-readable memory
JP6060989B2 (en)*2015-02-252017-01-18Casio Computer Co., Ltd.Voice recording apparatus, voice recording method, and program
US10430664B2 (en)*2015-03-162019-10-01Rohan SanilSystem for automatically editing video
US9691429B2 (en)*2015-05-112017-06-27Mibblio, Inc.Systems and methods for creating music videos synchronized with an audio track

Patent Citations (169)

Publication number | Priority date | Publication date | Assignee | Title
US5130794A (en)1990-03-291992-07-14Ritchey Kurtis JPanoramic display system
US6337683B1 (en)1998-05-132002-01-08Imove Inc.Panoramic movies which simulate movement through multidimensional space
US6593956B1 (en)1998-05-152003-07-15Polycom, Inc.Locating an audio source
US20090274339A9 (en)1998-08-102009-11-05Cohen Charles JBehavior recognition system
US20080177706A1 (en)1998-11-302008-07-24Yuen Henry CSearch engine for video and graphics
US7222356B1 (en)1999-01-142007-05-22Canon Kabushiki KaishaCommunication apparatus, storage medium, camera and processing method
WO2001020466A1 (en)1999-09-152001-03-22Hotv Inc.Method and apparatus for integrating animation in interactive video
US20040128317A1 (en)2000-07-242004-07-01Sanghoon SullMethods and apparatuses for viewing, browsing, navigating and bookmarking videos and displaying images
US20090263100A1 (en)*2001-09-152009-10-22Apple Inc.Dynamic variation of output media signal in response to input media signal
US20130191743A1 (en)2003-01-062013-07-25Glenn ReidMethod and apparatus for controlling volume
US20050025454A1 (en)2003-07-282005-02-03Nobuo NakamuraEditing system and control method thereof
US7483618B1 (en)2003-12-042009-01-27Yesvideo, Inc.Automatic editing of a visual recording to eliminate content of unacceptably low quality and/or very little or no interest
US20050251532A1 (en)*2004-05-072005-11-10Regunathan RadhakrishnanFeature identification of events in multimedia
US20120030029A1 (en)2004-05-202012-02-02Manyworlds, Inc.System and Method for Adaptive Videos
US20060122842A1 (en)2004-12-032006-06-08Magix AgSystem and method of automatically creating an emotional controlled soundtrack
US20070173296A1 (en)2005-02-152007-07-26Canon Kabushiki KaishaCommunication apparatus having power-saving communication function, and communication method
US20110258049A1 (en)2005-09-142011-10-20Jorey RamerIntegrated Advertising System
US8718447B2 (en)2005-10-172014-05-06Samsung Electronics Co., Ltd.Method and apparatus for providing multimedia data using event index
US20070204310A1 (en)2006-02-272007-08-30Microsoft CorporationAutomatically Inserting Advertisements into Source Video Content Playback Streams
US20100287476A1 (en)2006-03-212010-11-11Sony Corporation, A Japanese CorporationSystem and interface for mixing media content
US20070230461A1 (en)2006-03-292007-10-04Samsung Electronics Co., Ltd.Method and system for video data packetization for transmission over wireless channels
US20140161351A1 (en)2006-04-122014-06-12Google Inc.Method and apparatus for automatically summarizing video
US20080044155A1 (en)2006-08-172008-02-21David KuspaTechniques for positioning audio and video clips
US20080123976A1 (en)2006-09-222008-05-29Reuters LimitedRemote Picture Editing
US20080152297A1 (en)2006-12-222008-06-26Apple Inc.Select Drag and Drop Operations on Video Thumbnails Across Clip Boundaries
US9142253B2 (en)2006-12-222015-09-22Apple Inc.Associating keywords to media
US20080163283A1 (en)2007-01-032008-07-03Angelito Perez TanBroadband video with synchronized highlight signals
US20080183751A1 (en)*2007-01-252008-07-31Cazier Robert PApplying visual effect to image data based on audio data
US20080208791A1 (en)2007-02-272008-08-28Madirakshi DasRetrieving images based on an example image
US20130058532A1 (en)2007-03-052013-03-07Sportvision, Inc.Tracking An Object With Multiple Asynchronous Cameras
US20080253735A1 (en)2007-04-162008-10-16Adobe Systems IncorporatedChanging video playback rate
US20080313541A1 (en)2007-06-142008-12-18Yahoo! Inc.Method and system for personalized segmentation and indexing of media
US8611422B1 (en)2007-06-192013-12-17Google Inc.Endpoint based video fingerprinting
US9208821B2 (en)2007-08-062015-12-08Apple Inc.Method and system to process digital audio data
WO2009040538A1 (en)2007-09-252009-04-02British Telecommunications Public Limited CompanyMultimedia content assembling for viral marketing purposes
US20100045773A1 (en)2007-11-062010-02-25Ritchey Kurtis JPanoramic adapter system and method with spherical field-of-view coverage
US20100278509A1 (en)2007-12-102010-11-04Kae NaganoElectronic Apparatus, Reproduction Method, and Program
US20090213270A1 (en)2008-02-222009-08-27Ryan IsmertVideo indexing and fingerprinting for video enhancement
US20110069189A1 (en)2008-05-202011-03-24Pelican Imaging CorporationCapturing and processing of images using monolithic camera array with heterogeneous imagers
US20090327856A1 (en)2008-06-282009-12-31Mouilleseaux Jean-Pierre MAnnotation of movies
US20100064219A1 (en)2008-08-062010-03-11Ron GabriskoNetwork Hosted Media Production Systems and Methods
US20120014673A1 (en)2008-09-252012-01-19Igruuv Pty LtdVideo and audio content system
US20100086216A1 (en)2008-10-082010-04-08Samsung Electronics Co., Ltd.Apparatus and method for ultra-high resolution video processing
US20100104261A1 (en)*2008-10-242010-04-29Zhu LiuBrief and high-interest video summary generation
US20110211040A1 (en)2008-11-052011-09-01Pierre-Alain LindemannSystem and method for creating interactive panoramic walk-through applications
US20150294141A1 (en)2008-12-052015-10-15Nike, Inc.Athletic Performance Monitoring Systems and Methods in a Team Sports Environment
US20100183280A1 (en)2008-12-102010-07-22Muvee Technologies Pte Ltd.Creating a new video production by intercutting between multiple video clips
US20100231730A1 (en)2009-03-132010-09-16Yuka IchikawaImage sensing device and camera
US20100245626A1 (en)2009-03-302010-09-30David Brian WoycechowskyDigital Camera
US20100251295A1 (en)2009-03-312010-09-30At&T Intellectual Property I, L.P.System and Method to Create a Media Content Summary Based on Viewer Annotations
US20100281386A1 (en)2009-04-302010-11-04Charles LyonsMedia Editing Application with Candidate Clip Management
US9564173B2 (en)2009-04-302017-02-07Apple Inc.Media editing application for auditioning different types of media clips
US20100281375A1 (en)2009-04-302010-11-04Colleen PendergastMedia Clip Auditioning Used to Evaluate Uncommitted Media Content
US9032299B2 (en)2009-04-302015-05-12Apple Inc.Tool for grouping media clips for a media editing application
US20100278504A1 (en)2009-04-302010-11-04Charles LyonsTool for Grouping Media Clips for a Media Editing Application
US9317172B2 (en)2009-04-302016-04-19Apple Inc.Tool for navigating a composite presentation
US20120057852A1 (en)2009-05-072012-03-08Christophe DevleeschouwerSystems and methods for the autonomous production of videos from multi-sensored data
US20100299630A1 (en)2009-05-222010-11-25Immersive Media CompanyHybrid media viewing application including a region of interest within a wide field of view
US8446433B1 (en)2009-06-122013-05-21Lucasfilm Entertainment Company Ltd.Interactive visual distortion processing
US20100318660A1 (en)2009-06-152010-12-16Qualcomm IncorporatedResource management for a wireless device
US20100321471A1 (en)2009-06-222010-12-23Casolara MarkMethod and system for performing imaging
US20110025847A1 (en)2009-07-312011-02-03Johnson Controls Technology CompanyService management using video processing
US20110069148A1 (en)2009-09-222011-03-24Tenebraex CorporationSystems and methods for correcting images in a multi-sensor system
US20110075990A1 (en)2009-09-252011-03-31Mark Kenneth EyerVideo Bookmarking
US20110093798A1 (en)2009-10-152011-04-21At&T Intellectual Property I, L.P.Automated Content Detection, Analysis, Visual Synthesis and Repurposing
US20130166303A1 (en)2009-11-132013-06-27Adobe Systems IncorporatedAccessing media data using metadata repository
US20110134240A1 (en)2009-12-082011-06-09Trueposition, Inc.Multi-Sensor Location and Identification
US9151933B2 (en)2009-12-252015-10-06Sony CorporationImage-capturing apparatus, control method for image-capturing apparatus, and program
US20110173565A1 (en)2010-01-122011-07-14Microsoft CorporationViewing media in the context of street-level images
US20110206351A1 (en)2010-02-252011-08-25Tal GivoliVideo processing system and a method for editing a video asset
US20130343727A1 (en)2010-03-082013-12-26Alex Rav-AchaSystem and method for semi-automatic video editing
US20110293250A1 (en)2010-05-252011-12-01Deever Aaron TDetermining key video snippets using selection criteria
US20110320322A1 (en)2010-06-252011-12-29Symbol Technologies, Inc.Inventory monitoring using complementary modes for item identification
US8910046B2 (en)2010-07-152014-12-09Apple Inc.Media-editing application with anchored timeline
US20120027381A1 (en)2010-07-302012-02-02Kabushiki Kaisha ToshibaRecording/reading apparatus, method of generating tag list for recording/reading apparatus, and control unit for recording/reading apparatus
US20130318443A1 (en)2010-08-242013-11-28Apple Inc.Visual presentation composition
US20150154452A1 (en)2010-08-262015-06-04Blast Motion Inc.Video and motion event integration system
US20140376876A1 (en)2010-08-262014-12-25Blast Motion, Inc.Motion event recognition and video synchronization system and method
US20160217325A1 (en)2010-08-262016-07-28Blast Motion Inc.Multi-sensor event analysis and tagging system
US20130208942A1 (en)2010-09-302013-08-15British Telecommunications Public Limited CompanyDigital video fingerprinting
US20120123780A1 (en)2010-11-152012-05-17Futurewei Technologies, Inc.Method and system for video summarization
US20120127169A1 (en)2010-11-242012-05-24Google Inc.Guided Navigation Through Geo-Located Panoramas
US9036001B2 (en)2010-12-162015-05-19Massachusetts Institute Of TechnologyImaging system for immersive surveillance
US20130287214A1 (en)2010-12-302013-10-31Dolby International AbScene Change Detection Around a Set of Seed Points in Media Data
US20120206565A1 (en)2011-02-102012-08-16Jason VillmerOmni-directional camera and related viewing software
US9245582B2 (en)2011-03-292016-01-26Capshore, LlcUser interface for method for creating a custom track
US20130044108A1 (en)2011-03-312013-02-21Panasonic CorporationImage rendering device, image rendering method, and image rendering program for rendering stereoscopic panoramic images
US20130151970A1 (en)2011-06-032013-06-13Maha AchourSystem and Methods for Distributed Multimedia Production
US20120311448A1 (en)2011-06-032012-12-06Maha AchourSystem and methods for collaborative online multimedia production
US20130024805A1 (en)2011-07-192013-01-24Seunghee InMobile terminal and control method of mobile terminal
US9423944B2 (en)2011-09-062016-08-23Apple Inc.Optimized volume adjustment
US20130063561A1 (en)2011-09-142013-03-14Karel Paul StephanVirtual advertising platform
US20130078990A1 (en)2011-09-222013-03-28Mikyung KimMobile device and method for controlling reproduction of contents in mobile device
US9111579B2 (en)2011-11-142015-08-18Apple Inc.Media editing with multi-camera media clips
US20130127636A1 (en)2011-11-202013-05-23Cardibo, Inc.Wireless sensor network for determining cardiovascular machine usage
US20130136193A1 (en)2011-11-302013-05-30Samsung Electronics Co. Ltd.Apparatus and method of transmitting/receiving broadcast data
US20130142384A1 (en)2011-12-062013-06-06Microsoft CorporationEnhanced navigation through multi-sensor positioning
US20150058709A1 (en)2012-01-262015-02-26Michael Edward ZaletelMethod of creating a media composition and apparatus therefore
US20130195429A1 (en)2012-01-312013-08-01Todor FaySystems and methods for media personalization using templates
US20130197967A1 (en)2012-02-012013-08-01James Joseph Anthony PINTOCollaborative systems, devices, and processes for performing organizational projects, pilot projects and analyzing new technology adoption
US20130208134A1 (en)2012-02-142013-08-15Nokia CorporationImage Stabilization
US20130215220A1 (en)2012-02-212013-08-22Sen WangForming a stereoscopic video
US20130259399A1 (en)2012-03-302013-10-03Cheng-Yuan HoVideo recommendation system and method thereof
US20130263002A1 (en)2012-03-302013-10-03Lg Electronics Inc.Mobile terminal
US20130283301A1 (en)2012-04-182013-10-24Scorpcast, LlcSystem and methods for providing user generated video reviews
US20140165119A1 (en)2012-04-242014-06-12Tencent Technology (Shenzhen) Company LimitedOffline download method, multimedia file download method and system thereof
US20130287304A1 (en)2012-04-262013-10-31Sony CorporationImage processing device, image processing method, and program
US20130300939A1 (en)2012-05-112013-11-14Cisco Technology, Inc.System and method for joint speaker and scene recognition in a video/audio processing environment
US20130308921A1 (en)2012-05-212013-11-21Yahoo! Inc.Creating video synopsis for use in playback
US20140026156A1 (en)2012-07-182014-01-23David DeephanphongsDetermining User Interest Through Detected Physical Indicia
US20140064706A1 (en)2012-09-052014-03-06Verizon Patent And Licensing Inc.Tagging video content
US20140072285A1 (en)2012-09-102014-03-13Google Inc.Media Summarization
US20140096002A1 (en)2012-09-282014-04-03Frameblast LimitedVideo clip editing system
US20140093164A1 (en)2012-10-012014-04-03Microsoft CorporationVideo scene detection
US20140105573A1 (en)2012-10-122014-04-17Nederlandse Organisatie Voor Toegepast-Natuurwetenschappelijk Onderzoek TnoVideo access system and method based on action type detection
US9479697B2 (en)2012-10-232016-10-25Bounce Imaging, Inc.Systems, methods and media for generating a panoramic view
US20140169766A1 (en)2012-12-182014-06-19Realtek Semiconductor Corp.Method and computer program product for establishing playback timing correlation between different contents to be playbacked
US20140176542A1 (en)2012-12-262014-06-26Makoto ShoharaImage-processing system, image-processing method and program
US9204039B2 (en)2013-01-072015-12-01Huawei Technologies Co., Ltd.Image processing method and apparatus
US20140193040A1 (en)2013-01-092014-07-10Omiimii Ltd.Method and apparatus for determining location
US20140328570A1 (en)2013-01-092014-11-06Sri InternationalIdentifying, describing, and sharing salient events in images and videos
US20140212107A1 (en)2013-01-302014-07-31Felipe Saint-JeanSystems and Methods for Session Recording and Sharing
US20140219634A1 (en)2013-02-052014-08-07Redux, Inc.Video preview creation based on environment
US20140226953A1 (en)2013-02-142014-08-14Rply, Inc.Facilitating user input during playback of content
US20140232819A1 (en)2013-02-192014-08-21Tourwrist, Inc.Systems and methods for generating and sharing panoramic moments
US20140232818A1 (en)2013-02-192014-08-21Disney Enterprises, Inc.Method and device for spherical resampling for video generation
US20140245336A1 (en)2013-02-282014-08-28Verizon and Redbox Digital Entertainment Services, LLCFavorite media program scenes systems and methods
US20160005440A1 (en)2013-03-052016-01-07British Telecommunications Public Limited CompanyProvision of video data
US20150382083A1 (en)2013-03-062015-12-31Thomson LicensingPictorial summary for video
US8763023B1 (en)2013-03-082014-06-24Amazon Technologies, Inc.Determining importance of scenes based upon closed captioning data
US9077956B1 (en)2013-03-222015-07-07Amazon Technologies, Inc.Scene identification
US9253533B1 (en)2013-03-222016-02-02Amazon Technologies, Inc.Scene identification
US20140300644A1 (en)2013-04-042014-10-09Sony CorporationMethod and apparatus for generating an image cut-out
US20140341528A1 (en)2013-05-152014-11-20Abb Research Ltd.Recording and providing for display images of events associated with power equipment
US20160098941A1 (en)2013-05-212016-04-07Double Blue Sports Analytics, Inc.Methods and apparatus for goaltending applications including collecting performance metrics, video and sensor analysis
US20150375117A1 (en)2013-05-222015-12-31David S. ThompsonFantasy sports integration with video content
US20140366052A1 (en)2013-06-052014-12-11David J. IvesSystem for Social Media Tag Extraction
US20150015680A1 (en)2013-07-102015-01-15Htc CorporationMethod and electronic device for generating multiple point of view video
US20150022355A1 (en)2013-07-172015-01-22Honeywell International Inc.Surveillance systems and methods
US20150029089A1 (en)2013-07-252015-01-29Samsung Electronics Co., Ltd.Display apparatus and method for providing personalized service thereof
US20150085111A1 (en)2013-09-252015-03-26Symbol Technologies, Inc.Identification using video analytics together with inertial sensor data
US8730299B1 (en)2013-11-272014-05-20Dmitry KozkoSurround image mode for multi-lens mobile devices
US20150178915A1 (en)2013-12-192015-06-25Microsoft CorporationTagging Images With Emotional State Information
US20150186073A1 (en)2013-12-302015-07-02Lyve Minds, Inc.Integration of a device with a storage network
US20160358603A1 (en)2014-01-312016-12-08Hewlett-Packard Development Company, L.P.Voice input command
US20150220504A1 (en)2014-02-042015-08-06Adobe Systems IncorporatedVisual Annotations for Objects
US20150256808A1 (en)2014-03-042015-09-10Gopro, Inc.Generation of video from spherical content using edit maps
US20150254871A1 (en)2014-03-042015-09-10Gopro, Inc.Automatic generation of video from spherical content using location-based metadata
US20150256746A1 (en)2014-03-042015-09-10Gopro, Inc.Automatic generation of video from spherical content using audio/visual analysis
US20150271483A1 (en)2014-03-202015-09-24Gopro, Inc.Target-Less Auto-Alignment Of Image Sensors In A Multi-Camera System
US8988509B1 (en)2014-03-202015-03-24Gopro, Inc.Auto-alignment of image sensors in a multi-camera system
US20150287435A1 (en)2014-04-042015-10-08Red.Com, Inc.Video camera with capture modes
US20150318020A1 (en)2014-05-022015-11-05FreshTake Media, Inc.Interactive real-time video editor and recorder
US20150339324A1 (en)2014-05-202015-11-26Road Warriors International, Inc.System and Method for Imagery Warehousing and Collaborative Search Processing
US20160005435A1 (en)2014-07-032016-01-07Gopro, Inc.Automatic generation of video and directional audio from spherical content
US20160027475A1 (en)2014-07-232016-01-28Gopro, Inc.Video scene classification by activity
US20160029105A1 (en)2014-07-232016-01-28Gopro, Inc.Generating video summaries for a video using video summary templates
US20160026874A1 (en)2014-07-232016-01-28Gopro, Inc.Activity identification in video
US20160055885A1 (en)2014-07-232016-02-25Gopro, Inc.Voice-Based Video Tagging
US20160027470A1 (en)2014-07-232016-01-28Gopro, Inc.Scene and activity identification in video summary generation
US20160088287A1 (en)2014-09-222016-03-24Samsung Electronics Company, Ltd.Image stitching for three-dimensional video
US20160119551A1 (en)2014-10-222016-04-28Sentry360Optimized 360 Degree De-Warping with Virtual Cameras
US20160225405A1 (en)2015-01-292016-08-04Gopro, Inc.Variable playback speed template for video editing application
US20160225410A1 (en)2015-02-032016-08-04Garmin Switzerland GmbhAction camera content management system
US20160234345A1 (en)2015-02-052016-08-11Qwire Holdings LlcMedia player distribution and collaborative editing
US20160366330A1 (en)2015-06-112016-12-15Martin Paul BoliekApparatus for processing captured video data based on capture device orientation
US20160364963A1 (en)*2015-06-122016-12-15Google Inc.Method and System for Detecting an Audio Event for Smart Home Devices
US20170006214A1 (en)2015-06-302017-01-05International Business Machines CorporationCognitive recording and sharing
US9473758B1 (en)2015-12-062016-10-18Sliver VR Technologies, Inc.Methods and systems for game video recording and virtual reality replay

Non-Patent Citations (23)

* Cited by examiner, † Cited by third party
Title
Ernoult, Emeric, "How to Triple Your YouTube Video Views with Facebook", SocialMediaExaminer.com, Nov. 26, 2012, 16 pages.
FFmpeg, "AVPacket Struct Reference," Doxygen, Jul. 20, 2014, 24 Pages, [online] [retrieved on Jul. 13, 2015] Retrieved from the internet <URL:https://www.ffmpeg.org/doxygen/2.5/group_lavf_decoding.html>.
FFmpeg, "Demuxing," Doxygen, Dec. 5, 2014, 15 Pages, [online] [retrieved on Jul. 13, 2015] Retrieved from the internet <URL:https://www.ffmpeg.org/doxygen/2.3/group_lavf_encoding.html>.
FFmpeg, "Muxing," Doxygen, Jul. 20, 2014, 9 Pages, [online] [retrieved on Jul. 13, 2015] Retrieved from the internet <URL:https://www.ffmpeg.org/doxygen/2.3/structAVPacket.html>.
Han et al., Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding, International Conference on Learning Representations 2016, 14 pgs.
He et al., "Deep Residual Learning for Image Recognition," arXiv:1512.03385, 2015, 12 pgs.
Iandola et al., "SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size", arXiv:1602.07360v3 [cs.CV] Apr. 6, 2016 (9 pgs.).
Iandola et al., "SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size," arXiv:1602.07360, 2016, 9 pgs.
Ioffe et al., "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift," arXiv:1502.03167, 2015, 11 pgs.
Parkhi et al., "Deep Face Recognition," Proceedings of the British Machine Vision, 2015, 12 pgs.
PCT International Preliminary Report on Patentability for PCT/US2015/023680, dated Oct. 4, 2016, 10 pages.
PCT International Search Report for PCT/US15/18538 dated Jun. 16, 2015 (2 pages).
PCT International Search Report and Written Opinion for PCT/US15/12086 dated Mar. 17, 2016, 20 pages.
PCT International Search Report and Written Opinion for PCT/US15/18538, dated Jun. 16, 2015, 26 pages.
PCT International Search Report and Written Opinion for PCT/US2015/023680, dated Oct. 6, 2015, 13 pages.
PCT International Search Report for PCT/US15/23680 dated Aug. 3, 2015, 4 pages.
PCT International Search Report for PCT/US15/41624 dated Nov. 4, 2015, 5 pages.
PCT International Search Report for PCT/US17/16367 dated Apr. 14, 2017 (2 pages).
PCT International Written Opinion for PCT/US2015/041624, dated Dec. 17, 2015, 7 pages.
Ricker, "First Click: TomTom's Bandit camera beats GoPro with software" Mar. 9, 2016 URL: http://www.theverge.com/2016/3/9/11179298/tomtom-bandit-beats-gopro (6 pages).
Schroff et al., "FaceNet: A Unified Embedding for Face Recognition and Clustering," IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016, 10 pgs.
Tran et al., "Learning Spatiotemporal Features with 3D Convolutional Networks", arXiv:1412.0767 [cs.CV] Dec. 2, 2014 (9 pgs).
Yang et al., "Unsupervised Extraction of Video Highlights Via Robust Recurrent Auto-encoders" arXiv:1510.01442v1 [cs.CV] Oct. 6, 2015 (9 pgs).

Cited By (12)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US10629173B2 (en)*2016-03-302020-04-21Pioneer DJ CorporationMusical piece development analysis device, musical piece development analysis method and musical piece development analysis program
CN113055738A (en)*2019-12-262021-06-29北京字节跳动网络技术有限公司Video special effect processing method and device
WO2021129628A1 (en)*2019-12-262021-07-01北京字节跳动网络技术有限公司Video special effects processing method and apparatus
KR20220106848A (en)*2019-12-262022-07-29베이징 바이트댄스 네트워크 테크놀로지 컴퍼니, 리미티드 Video special effects processing methods and devices
US20220321802A1 (en)*2019-12-262022-10-06Beijing Bytedance Network Technology Co., Ltd.Video special effects processing method and apparatus
US11882244B2 (en)*2019-12-262024-01-23Beijing Bytedance Network Technology Co., Ltd.Video special effects processing method and apparatus
US20240106968A1 (en)*2019-12-262024-03-28Beijing Bytedance Network Technology Co., Ltd.Video special effects processing method and apparatus
KR102700232B1 (en)2019-12-262024-08-30베이징 바이트댄스 네트워크 테크놀로지 컴퍼니, 리미티드 Method and device for processing video special effects
US12155957B2 (en)*2019-12-262024-11-26Beijing Bytedance Network Technology Co., Ltd.Video special effects processing method and apparatus
WO2022005442A1 (en)2020-07-032022-01-06Назар Юрьевич ПОНОЧЕВНЫЙSystem (embodiments) for harmoniously combining video files and audio files and corresponding method
US11955142B1 (en)*2021-03-152024-04-09Gopro, Inc.Video editing using music characteristics
WO2023144279A1 (en)*2022-01-272023-08-03Soclip!Dynamic visual intensity rendering

Also Published As

Publication numberPublication date
US11443771B2 (en)2022-09-13
US20190080719A1 (en)2019-03-14
US10679670B2 (en)2020-06-09
US10991396B2 (en)2021-04-27
US20210241800A1 (en)2021-08-05
US20200286520A1 (en)2020-09-10

Similar Documents

PublicationPublication DateTitle
US11443771B2 (en)Systems and methods for modifying videos based on music
AU2021201916B2 (en)Rhythmic Synchronization Of Cross Fading For Musical Audio Section Replacement For Multimedia Playback
US10681408B2 (en)Systems and methods for creating composite videos
US9691429B2 (en)Systems and methods for creating music videos synchronized with an audio track
US20180295427A1 (en)Systems and methods for creating composite videos
US20100204811A1 (en)Realtime Editing and Performance of Digital Audio Tracks
US11689692B2 (en)Looping presentation of video content
US11790952B2 (en)Pose estimation for video editing
Chiarandini et al.A system for dynamic playlist generation driven by multimodal control signals and descriptors
CN115220625A (en)Audio playing method and device, electronic equipment and computer readable storage medium
KR100462826B1 (en)A portable multimedia playing device of synchronizing independently produced at least two multimedia data, a method for controlling the device, and a system of providing the multimedia data with the device
KR100383019B1 (en)Apparatus for authoring a music video
US11955142B1 (en)Video editing using music characteristics
WO2017081486A1 (en)Video system
HK1246965B (en)Rhythmic synchronization for cross fading of musical audio sections

Legal Events

DateCodeTitleDescription
STCFInformation on status: patent grant

Free format text:PATENTED CASE

CCCertificate of correction
MAFPMaintenance fee payment

Free format text:PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment:4

