Lip sync
From Wikipedia, the free encyclopedia
Lip-sync or lip-synch (short for lip synchronization) is a technical term for matching lip movements with voice. The term refers both to a technique often used during musical performances and in the production of film and television programs, and to the problem of keeping video and audio signals synchronized during post-production and transmission. It is also used to match the lip movements of animated characters (including computer facial animation).
Lip-synching in music
Though lip-synching can be used to make it appear as though actors have musical ability (e.g., The Partridge Family) or to misattribute vocals (e.g., Milli Vanilli, Ashlee Simpson), it is more often used by recording artists to create a particular effect, to enable them to perform live dance numbers, or to cover for illness or other deficiencies during live performance.
Because the film track and music track are recorded separately during the creation of a music video, artists usually lip-synch to their songs and often imitate playing musical instruments as well. Artists also sometimes move their lips at a different speed from the track, or even backwards, to create an unusual effect in the final clip.
Artists often lip-synch during strenuous dance numbers in both live and recorded performances. They may also lip-synch in situations in which their back-up bands and sound systems cannot be accommodated, such as the Macy's Thanksgiving Day Parade which features popular singers lip-synching while riding floats.
Some singers habitually lip-synch during live performance, both concert and televised. Others sing the lead part over a complete recording or over just pre-recorded music and backing vocals. Sometimes when this is done, the live vocals are less audible than the backing track. Some groups lip-synch supporting vocal parts or shared parts in order to maintain vocal harmony or to ensure balance of volume among several singers.
Some artists switch between live singing and lip-synching during performance, particularly during songs which require them to hit particularly high or low notes. Lip-synching these notes ensures that they will not be out of tune and that the artist will not strain his voice too much during an arduous concert. Once the difficult portion of the song has passed, the artist may continue to lip-synch or may resume singing live.
Some artists may choose to lip-synch during live performance because of stage fright or perceptions of inadequacy. Unlike studio recording, live performance provides only one chance to sing each song correctly. An artist may worry that his voice is not strong enough, that it will sound noticeably different from recorded versions, or that he will hit a wrong note.
Other artists have chosen to lip-synch quite obviously for comedic value. During a short, pre-recorded performance, such as a guest appearance on a TV show, some artists purposely include Easter eggs like swapping instruments between band members or playing their instruments in obviously erroneous ways.
The band Oasis has done this on several occasions. Often lead guitarist Noel Gallagher replaces his brother Liam at the microphone stand, impersonating him, emphasising his mannerisms, and playing the tambourine, while Liam stands in the background pretending to play the guitar.
During Polish group Kanał Audytywny's performance on Kuba Wojewódzki's talk show, the trumpeter held his trumpet backwards and band members wore helmets labeled PLAYBACK, the Polish term for a lip-synched performance.
On at least one occasion, John Lennon of The Beatles intentionally revealed that the group was lip-synching; during the performance, he scratched his face, licked his lips, mimed incorrect words, and began dancing while playing his instrument.
The practice of lip-syncing also occurs in musical theatre, for much the same purposes as for musicians. A production may include a mix of lip-synced and live musical numbers. In long-running shows, this may be done to protect the performer's voice from strain and damage, as well as to maintain a high calibre of production. A notable example of lip-syncing used as a special effect is several touring performances of The Phantom of the Opera, in which swing actors appear in the same costume as the lead actors to give the illusion of the characters moving around the stage with some mystery.
Non-professionals often use lip-synching as a form of musical pantomime in which the performer moves his lips to a musical recording done by someone else. This form of lip-synching is often performed by drag queens and, more recently, drag kings.
In the United States, this hobby reached its peak during the 1980s, when several game shows, such as Puttin' on the Hits and Lip Service, were created.
Lip-synching in films
In film production, lip-synching is often part of the post-production phase. Most films today contain scenes in which the dialogue has been re-recorded afterwards; lip-synching is also the technique used when animated characters speak, and it is essential when films are dubbed into other languages.
ADR
Automated dialogue replacement (ADR) is a film sound technique involving the re-recording of dialogue after photography. It is called post-synchronisation (post-sync) in the UK.
Animation
In animation, lip sync is the art of making a character appear to speak in time with a pre-recorded track of dialogue. The technique involves working out the timings of the speech (the breakdown) as well as actually animating the lips and mouth to match the dialogue track. The earliest examples of lip sync in animation were attempted by Max Fleischer in his 1926 short My Old Kentucky Home. The technique continues to this day, with animated films and television shows such as Shrek, Lilo & Stitch, and The Simpsons using lip-synching to make their artificial characters talk. Lip-synching is also used in comedies such as This Hour Has 22 Minutes and in political satire, changing the original wording totally or in part. It has been used in conjunction with translation of films from one language to another, for example Spirited Away. Lip-synching can be a very difficult issue in translating foreign works for a domestic release, as a straight translation of the lines often overruns or underruns the on-screen mouth movements.
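The breakdown step described above can be sketched in a few lines: a timed phoneme track is expanded into a per-frame list of mouth shapes (visemes). The phoneme-to-viseme table and the sample track here are illustrative assumptions, not taken from any real production pipeline.

```python
# Minimal sketch of an animation lip-sync breakdown: expand a timed
# phoneme track into one viseme (mouth shape) per frame.

# Simplified viseme classes: each phoneme maps to one mouth shape.
# This table is a toy example, not a standard phoneme set.
PHONEME_TO_VISEME = {
    "AA": "open",      # as in "father"
    "IY": "wide",      # as in "see"
    "UW": "round",     # as in "you"
    "M": "closed", "B": "closed", "P": "closed",
    "F": "teeth-lip", "V": "teeth-lip",
    "sil": "rest",     # silence
}

def breakdown_to_frames(track, fps=24):
    """Expand (phoneme, start_sec, end_sec) entries into a per-frame viseme list."""
    total_frames = round(max(end for _, _, end in track) * fps)
    frames = ["rest"] * total_frames
    for phoneme, start, end in track:
        viseme = PHONEME_TO_VISEME.get(phoneme, "rest")
        for f in range(round(start * fps), round(end * fps)):
            frames[f] = viseme
    return frames

# Hypothetical breakdown for the word "me" (M + IY), half a second long.
track = [("M", 0.0, 0.15), ("IY", 0.15, 0.5)]
frames = breakdown_to_frames(track)
print(frames[0], frames[10])  # prints: closed wide
```

An animator (or rig) would then pose the mouth from this per-frame list, often smoothing transitions between shapes.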
Language dubbing
Quality film dubbing requires that the dialogue first be translated in such a way that the words used can match the lip movements of the actor. However, this is often impossible to achieve if the translation is to stay true to the original dialogue, and achieving close lip sync in dubbing is a lengthy and expensive process.
As an unusually extreme reaction to poorly done dubbing, Saparmurat Niyazov, the former president of Turkmenistan, issued a ban on lip synching in his country in August 2005.
Lip-synching in video games
Early video games did not feature prominent use of voice, being mainly text-based. At most, games featured some generic jaw or mouth movement to convey a communication process in addition to the text. As games have become more advanced, however, lip sync and voice acting have become a major focus of many games.
Role-playing games
Lip sync is a minor focus in role-playing games. Because of the sheer amount of information conveyed in these games, the majority of communication is done through scrolling text. Most RPGs rely solely on text, while some display inanimate portraits to give a better sense of who is speaking. Some games make limited use of voice acting, such as Grandia II, but because of simple character models there is no mouth movement to simulate speech. RPGs are still largely text-based, with the rare use of lip sync and voice files reserved for full-motion-video cutscenes. Some newer RPGs, however, use full voice-overs. These games are typically for computers or next-generation consoles such as the Xbox 360, and include Star Wars: Knights of the Old Republic and The Elder Scrolls IV: Oblivion. In these fully voiced games, lip sync is crucial.
Strategy games
Unlike RPGs, strategy games make extensive use of sound files to create an immersive battle environment. Most games simply played a recorded audio track on cue with some games providing inanimate portraits to accompany the respective voice. StarCraft used full motion video character portraits with several generic speaking animations that did not synchronise with the lines spoken in the game. The game did, however, make extensive use of recorded speech to convey the game's plot, with the speaking animations providing a good idea of the flow of the conversation. Warcraft III used fully rendered 3D models to animate speech with generic mouth movements, both as character portraits as well as the in-game units. Like the FMV portraits, the 3D models did not synchronise with actual spoken text, while in-game models tended to simulate speech by moving their heads and arms rather than using actual lip synchronisation. Similarly, the game Codename Panzers uses camera angles and hand movements to simulate speech, as the characters have no actual mouth movement.
First-person shooters
Of all the gaming genres, first-person shooters have placed the most emphasis on lip sync.[citation needed] Because increasingly detailed character models require animation, game developers devote many resources to creating realistic lip synchronisation for the many lines of speech used in most FPS games. Early 3D models used basic up-and-down jaw movements to simulate speech. As technology progressed, mouth movements began to closely resemble real human speech movements. Medal of Honor: Frontline dedicated a development team to lip sync alone, producing what was then some of the most accurate lip synchronisation in games. Since then, games such as Medal of Honor: Pacific Assault and Half-Life 2 have used code that dynamically generates mouth movements from the spoken audio, resulting in strikingly life-like characters. Gamers who create their own videos using character models with no lip movements, such as the helmeted Master Chief from Halo, improvise lip movements by moving the characters' arms and bodies and bobbing the head (see Red vs. Blue).
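The basic up-and-down jaw movement used by early 3D models can be sketched as a simple amplitude follower: the louder the audio over a short window, the wider the jaw opens. This is a toy illustration with a synthetic signal, not any particular engine's implementation.

```python
# Minimal sketch of amplitude-driven jaw animation: map the RMS loudness
# of each short audio window to a jaw-opening value in [0, max_open].
import math

def jaw_openings(samples, window=100, max_open=1.0):
    """Return one jaw-opening value per window of audio samples."""
    openings = []
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        rms = math.sqrt(sum(s * s for s in chunk) / len(chunk))
        openings.append(min(max_open, rms))  # clamp to the model's limit
    return openings

# Synthetic audio: silence, then a loud burst, then silence again.
samples = [0.0] * 200 + [0.8 * math.sin(0.3 * n) for n in range(200)] + [0.0] * 200
jaw = jaw_openings(samples)
print(jaw)  # near 0 for silent windows, larger values during the burst
```

Modern phoneme-based systems go further, picking distinct mouth shapes per sound rather than a single open/close axis.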
Transmission synchronization
An example of a lip synchronization problem is the case in which television video and audio signals are transported via different facilities (e.g., a geosynchronous satellite link and a landline) that have significantly different delay times. In such cases it is necessary to delay the earlier of the two signals electronically to compensate for the difference in propagation times. See also audio video sync and audio synchronizer.
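The fix described above, delaying the earlier signal, amounts to a fixed-length buffer on the faster path. A minimal sketch, assuming audio arrives earlier than video and using illustrative delay figures:

```python
# Minimal sketch of the transmission fix: buffer the earlier-arriving
# stream (here, audio samples) so it leaves the delay line exactly
# delay_samples later, matching the slower video path.
from collections import deque

class AudioDelay:
    """Delay an audio sample stream by a fixed number of samples."""
    def __init__(self, delay_samples):
        # Pre-fill with silence so output lags input by exactly delay_samples.
        self.buffer = deque([0] * delay_samples)

    def process(self, sample):
        self.buffer.append(sample)
        return self.buffer.popleft()

# Suppose the video path (satellite hop) lags the audio path by 250 ms
# at a 48 kHz sample rate: delay the audio by 0.250 * 48000 samples.
delay = AudioDelay(delay_samples=12000)
out = [delay.process(s) for s in range(5)]
print(out)  # prints: [0, 0, 0, 0, 0] -- silence until the buffer fills
```

Real audio synchronizers work the same way in principle, though the delay is often measured or adjusted dynamically rather than fixed.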
Lip sync issues have become a serious problem for the television industry worldwide. Lip sync errors are not only annoying, but can lead to subconscious viewer stress, which in turn leads to dislike of the television program being watched. See the report "Effects of Audio-Video Asynchrony on Viewer's Memory, Evaluation of Content and Detection Ability" by Reeves and Voelker (a non-copyrighted PDF is available at http://www.lipfix.com/file/doc/reeves_and_voelker_paper.pdf). Television industry standards organizations have become involved in setting standards for lip sync errors; see for example ATSC Document IS-191 (http://www.atsc.org/standards/is_191.pdf).
Lip-synching in the headlines
- In 1988 on Top of the Pops, the band All About Eve were supposed to perform their single "Martha's Harbour". However, due to a technical problem, the television audience could hear the song but the band could not. Lead singer Julianne Regan remained silent on a stool on stage while her backing guitarist did not play. An unseen stagehand apparently alerted them that something was wrong in time to mime along to the second verse.
- In 2004, singer Ashlee Simpson appeared on Saturday Night Live in a promotional visit. She was scheduled to sing two songs from her album Autobiography. However, when beginning to sing the album's title track, another song began playing in the background, and it was revealed that she was either lip-synching or using a vocal backing track.
See also
- Playback singing
- Lypsinka
- Lipps Inc
- Faceposer: Half-Life 2 (a first-person shooter) tool to edit facial animation and lip-synching in Mods
- Look-alike contest