Digital look-alike
Digital look-alike is a term for a computer animation or still image so vivid and realistic that it fools the viewer into believing it shows a human imaged with a movie or still camera, rather than a visual simulation imaged with a simulated camera.
Digital look-alikes have been possible since the early 2000s. Their audience debut is generally considered to be in the 2003 film The Matrix Reloaded, in the "burly brawl" sequence where up to 100 Agent Smiths fight Neo, and in The Matrix Revolutions, where at the start of the final showdown Agent Smith's cheekbone gets punched in by Neo, leaving the digital look-alike unnaturally unhurt. Digital look-alikes are a subset of virtual cinematography.
History leading up to digital look-alikes[edit]
Main articles: History of computer animation and Timeline of computer animation in film and television
- In 1971 Henri Gouraud made the first CG geometry capture and representation of a human face. The model was his wife, Sylvie Gouraud. The 3D model was a simple wire-frame model, and he applied the Gouraud shader he is best known for to produce the first known representation of human likeness on a computer.[1]
- The 1972 short film A Computer Animated Hand by Edwin Catmull and Fred Parke was the first time that computer-generated imagery was used in film to simulate human appearance. The film featured a computer-simulated hand and face.
- The 1976 film Futureworld reused parts of A Computer Animated Hand on the big screen.
- The music video for the song Musique Non-Stop by the German band Kraftwerk, produced from 1983 and aired in 1986, features non-realistic-looking but clearly recognizable computer simulations of the band members.
- The 1994 film The Crow was the first film production to digitally composite a computer-simulated representation of a face onto scenes filmed with a body double. Necessity was the muse, as the actor Brandon Lee, portraying the protagonist, had been accidentally killed on set.
Breakthrough of light stage #1: Reflectance capture[edit]
Main article: light stage
At SIGGRAPH 2000, Paul Debevec et al. presented the last missing piece required to make digital look-alikes: adequate and feasible capture and simulation of the bidirectional scattering distribution function (BSDF) over the human body, and especially the face.[2]
Previous attempts to make digital look-alikes had problems getting the skin to look natural until 2000, when the portion of light that enters and exits the skin was taken into account; the models glow from within ever so slightly.[2]
The method they used to isolate the light that travels under the skin was based on existing scientific knowledge: light reflecting off the air-to-oil interface retains its polarization, while light that travels under the skin loses its polarization.[2]
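As an illustration of this separation, here is a minimal sketch (the function name and image pair are hypothetical, not from the original paper's code): photographing the face once through a polarizer parallel to the light source's polarizer and once through a crossed polarizer lets the two components be recovered by subtraction, since depolarized subsurface light splits evenly between the two polarization states while surface reflection survives only in the parallel image.

```python
import numpy as np

# Hypothetical sketch of polarization-difference imaging. Assumes two
# photographs of the same face: one taken through a polarizer parallel
# to the light source's polarizer ("parallel") and one through a crossed
# polarizer ("cross"). Surface reflection off the air-to-oil interface
# keeps its polarization, so it appears in the parallel image but is
# blocked in the cross-polarized one; depolarized subsurface light
# contributes half its energy to each image.

def separate_reflection(parallel: np.ndarray, cross: np.ndarray):
    """Split an image pair into subsurface and surface components."""
    # The cross-polarized image contains only half of the depolarized
    # subsurface light, so double it to recover the full component.
    subsurface = 2.0 * cross
    # Whatever the parallel image has beyond the cross image is the
    # polarization-preserving surface (specular) reflection.
    specular = np.clip(parallel - cross, 0.0, None)
    return subsurface, specular
```

With a pixel whose subsurface component is 0.4 and surface component is 0.3, the parallel image reads 0.5 and the cross image 0.2, and the two components are recovered exactly.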
The main scientific breakthrough required only the following modest equipment, which made up the first light stage (or light cage) built by Debevec et al.:
- A movable digital camera
- A movable light source (full rotation with adjustable radius and height in the first light stage)
- Two polarizers, set at a few angles, in front of the light source and the camera
- A computer running relatively simple programs doing relatively simple tasks[2]
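The "relatively simple programs" rest on the fact that light transport is linear: once the face has been photographed under each individual light direction, its appearance under any novel illumination is a weighted sum of those basis photographs. A minimal sketch of this reflectance-field relighting, with illustrative array shapes and names:

```python
import numpy as np

# Minimal sketch of image-based relighting with a captured reflectance
# field. Assumes the light stage photographed the subject once per light
# direction, giving a stack of basis images. Because light transport is
# linear, the subject under a new lighting environment is a weighted sum
# of the basis images, weighted by the environment's intensity from each
# direction. Shapes and names here are illustrative assumptions.

def relight(basis_images: np.ndarray, light_weights: np.ndarray) -> np.ndarray:
    """basis_images: (n_lights, H, W, 3); light_weights: (n_lights,).

    Returns the relit (H, W, 3) image.
    """
    # Contract the light axis: sum_i w_i * image_i.
    return np.tensordot(light_weights, basis_images, axes=1)
```

For example, with one basis image lit from the left and one from the right, halving the left light and doubling the right light is a single weighted sum over the stack.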
Once the reflectance field of the human face had been captured and simulated, it became possible to make digital look-alikes with advanced 3D computer graphics programs.
Development of successive light stages[edit]
Since then, Debevec and his team have constructed six further versions of the light stage at the University of Southern California (USC) Institute for Creative Technologies (ICT).[3] The seventh USC light stage, by Ghosh et al., is named "Light Stage X" rather than "Light Stage 7".[4] The latest USC-built light stage is the mobile light stage.[5]
Process[edit]
The whole process of making digital look-alikes is very complex due to the soft-body dynamics of human appearance, and it can usually be divided into a capture part and a synthesis part.
Photorealistically modeling, animating, cross-mapping, and rendering characters so lifelike and realistic that they can be passed off as pictures of humans is a very complex task; a believable result requires simulating both the light reflected from the skin (the BRDF) and the light scattered within the skin (a special case of the BTDF), which together make up the BSDF. Digital look-alikes may be look-alikes of no one in particular.
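As a toy illustration of the surface-plus-subsurface split, the sketch below shades a single point as the sum of a diffuse term (standing in for light scattered within the skin) and a Blinn-Phong specular term (standing in for surface reflection off the oil layer). The model and all parameter values are illustrative stand-ins, far simpler than what production skin renderers use.

```python
import numpy as np

# Toy single-point shading sketch of the BRDF + within-skin split:
# total reflected light = diffuse term (stand-in for subsurface light)
# + Blinn-Phong specular term (stand-in for surface reflection).
# All coefficients are illustrative assumptions, not measured values.

def shade(n, l, v, diffuse=0.8, specular=0.2, shininess=32.0):
    """n: surface normal, l: direction to light, v: direction to viewer."""
    n, l, v = (x / np.linalg.norm(x) for x in (n, l, v))
    h = (l + v) / np.linalg.norm(l + v)                    # half vector
    diff = diffuse * max(np.dot(n, l), 0.0)                # subsurface stand-in
    spec = specular * max(np.dot(n, h), 0.0) ** shininess  # surface term
    return diff + spec
```

With light and viewer both head-on to the surface, both terms reach their maximum and the point reads as 0.8 + 0.2 = 1.0.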
The following need to be captured in order to make a digital look-alike:
- Geometry, which may be acquired with a 3D scanner such as an Arius3d or Cyberware scanner, or from photographs and movies with a stereo camera or multiple cameras utilising machine vision algorithms
- Textures, from an RGB XYZ scanner such as the Arius3d, or from photographs
- The reflectance field, captured with a light cage and modeled with BSDFs over the three-dimensional surface
- Motion capture, which may be shot with a variety of methods, mostly multi-camera computer stereo vision; future developments in machine capability may render human actors and human graphics artists unnecessary to the process
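For a rectified camera pair, the multi-camera stereo vision mentioned above reduces to triangulation by similar triangles: a point's depth follows from its horizontal pixel offset (disparity) between the two views. A hypothetical sketch, assuming a known focal length and camera baseline:

```python
# Toy sketch of the stereo triangulation behind multi-camera geometry
# capture. Assumes a rectified camera pair with known focal length f_px
# (in pixels) and baseline_m (in metres). A scene point seen at
# horizontal pixel positions x_left and x_right has disparity
# d = x_left - x_right, and by similar triangles its depth is
# Z = f_px * baseline_m / d.

def depth_from_disparity(f_px: float, baseline_m: float,
                         x_left: float, x_right: float) -> float:
    disparity = x_left - x_right
    if disparity <= 0:
        # Zero disparity would mean a point at infinity; negative
        # disparity means the match is inconsistent with rectification.
        raise ValueError("point must have positive disparity")
    return f_px * baseline_m / disparity
```

For example, with a 1000-pixel focal length and a 10 cm baseline, a disparity of 20 pixels places the point 5 m from the cameras. Repeating this per matched pixel yields the dense depth maps from which face geometry can be reconstructed.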
Both physics/physiology-based modeling (e.g. skeletal animation) and image-based modeling and rendering are often employed in the synthesis part. Hybrid models employing both approaches have shown the best results in realism and ease of use. Displacement mapping plays an important part in getting a realistic result with fine skin detail, such as pores and wrinkles, as small as 100 µm.
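A minimal sketch of the displacement-mapping step (names and shapes are illustrative): each mesh vertex is pushed along its surface normal by a height sampled from a scalar displacement map, so pores and wrinkles become actual geometry rather than a shading trick.

```python
import numpy as np

# Hypothetical sketch of displacement mapping, the technique the text
# credits for fine skin detail. Each vertex moves along its unit normal
# by the height sampled for it from the displacement map (here assumed
# to be pre-sampled per vertex, in the same units as the mesh).

def displace(vertices: np.ndarray, normals: np.ndarray,
             heights: np.ndarray, scale: float = 1.0) -> np.ndarray:
    """vertices, normals: (N, 3); heights: (N,). Returns displaced (N, 3)."""
    return vertices + scale * heights[:, None] * normals
```

A 100 µm pore depth, for instance, corresponds to a height of 1e-4 in a mesh modeled in metres; at render time this is typically done per micro-polygon or in a GPU shader rather than on stored vertices.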
Applications[edit]
- Virtual cinematography
- Computer and video games
- Disinformation attacks
- Therapy - "Psychologists and counselors have also begun using avatars to deliver therapy to clients who have phobias, a history of trauma, addictions, Asperger’s syndrome or social anxiety."[6] The strong memory imprint and brain-activation effects caused by watching a digital look-alike avatar of oneself are dubbed the doppelgänger effect.[6]
Examples of digital look-alikes[edit]
- Leading up to 2003, ESC Entertainment, with George Borshukov in the lead, made the digital look-alikes of Keanu Reeves (Neo), Hugo Weaving (Agent Smith), Laurence Fishburne (Morpheus) and Randall Duk Kim (The Keymaker) for the 2003 movies The Matrix Reloaded and The Matrix Revolutions.
- In 2003, The Animatrix: Final Flight of the Osiris, made by Square Pictures, featured state-of-the-art would-be digital look-alikes that did not quite fool the viewer.
- One can observe from the clip Animatrix: Final Flight of the Osiris that the following things are difficult to do digitally (assuming the animators were aiming for photorealism):
- Surfaces meeting in contact, e.g. lips meeting lips, eyes meeting eyelids, and eyelids meeting eyelids; collisions of essentially any non-rigid objects are difficult to render where they meet.[citation needed]
- Hair, which is classically known to be difficult to do digitally: some hairstyles more so (loose, open hair), some less (a rigid, non-moving hairdo).[citation needed]
- Running motion: even an untrained eye notices that the algorithmically controlled running looks unrealistic, in contrast with the running at the end of a 2009 TED talk video, which looks perfectly natural.
- Skin microstructure, where the scattering of light off and out of the skin is complex[3]
- In 2003, a digital look-alike of Tobey Maguire was made by Sony Pictures Imageworks for the movies Spider-Man 2 and Spider-Man 3.[7]
- In 2009, Debevec et al. presented a new digital look-alike, made by Image Metrics, this time of actress Emily O'Brien, whose reflectance was captured with USC Light Stage 5.[3] In the TED talk video at 00:04:59 there are two clips, one of the real Emily shot with a real camera and one of a digital look-alike of Emily shot with a simulated camera; it is difficult to tell which is which. Bruce Lawmen was scanned in a still position using USC Light Stage 6 and was also recorded running there on a treadmill. Many digital look-alikes of Bruce are seen running fluently and naturally in the ending sequence of the TED talk video.[3] The motion looks fairly convincing in contrast to the clunky run in The Animatrix: Final Flight of the Osiris, which was state of the art in 2003, if photorealism was the animators' intention.
- In 2009, a digital look-alike of a younger Arnold Schwarzenegger was made for the movie Terminator Salvation, though the end result was critiqued as unconvincing. The facial geometry was acquired from a 1984 mold of Schwarzenegger.
- In 2010, Walt Disney Pictures released the sci-fi sequel Tron: Legacy, with a digitally de-aged digital look-alike of actor Jeff Bridges playing the antagonist CLU.
- At SIGGRAPH 2013, Activision and USC presented "Digital Ira", a real-time digital look-alike of the face of Ari Shapiro, an ICT USC research scientist,[8] utilizing USC Light Stage X by Ghosh et al. for both reflectance-field and motion capture.[4] The end result, both precomputed and rendered in real time on a then-modern game GPU, looks fairly realistic.
- In 2014 the geometry, texture and reflectance of President Barack Obama was captured using the mobile light stage in a cooperation between USC ICT, the White House and the Smithsonian Institution.[5]
- For the 2015 film Furious 7, a digital look-alike of actor Paul Walker, who had died in an accident during filming, was made by Weta Digital to enable the completion of the film.[9]
- In 2016, techniques allowing near real-time counterfeiting of facial expressions in existing 2D video were believably demonstrated.
- In 2016, a digital look-alike of Peter Cushing was made for the film Rogue One, in which its appearance was made to appear the same age as the actor was during the filming of the original 1977 Star Wars film.
References[edit]
- ↑ "Images de synthèse : palme de la longévité pour l'ombrage de Gouraud".
- ↑ 2.0 2.1 2.2 2.3 Debevec, Paul (2000). "Acquiring the reflectance field of a human face". ACM. doi:10.1145/344779.344855. Retrieved 2013-07-21.
- ↑ 3.0 3.1 3.2 3.3 Debevec, Paul (2009). "Paul Debevec animates a photo-real digital face TEDxUSC show&tell talk 2009" (video). Retrieved 2013-07-21.
- ↑ 4.0 4.1 Debevec, Paul. "Digital Ira SIGGRAPH 2013 Real-Time Live". Retrieved 2013-07-31.
- ↑ 5.0 5.1 "Scanning and printing a 3D portrait of President Barack Obama". University of Southern California. 2013. Retrieved 2015-11-04.
- ↑ 6.0 6.1 Murphy, Samantha (2011). "Scientific American: Your Avatar, Your Guide" (.pdf). Scientific American / Stanford University. Retrieved 2013-08-10.
- ↑ Pighin, Frédéric. "Siggraph 2005 Digital Face Cloning Course Notes" (PDF). Retrieved 2013-07-21.
- ↑ ReForm - Hollywood's Creating Digital Clones (youtube). The Creators Project. 2015-05-19.
- ↑ Giardina, Carolyn (2015-03-25). "'Furious 7' and How Peter Jackson's Weta Created Digital Paul Walker". The Hollywood Reporter. Retrieved 2015-11-11.
- ↑ Thies, Justus (2016). "Face2Face: Real-time Face Capture and Reenactment of RGB Videos". Proc. Computer Vision and Pattern Recognition (CVPR), IEEE. Retrieved 2016-11-19.
- Category:Simulation
- Category:Computer graphics
- Category:Communication of falsehoods
- Category:Information warfare
- Category:Propaganda techniques
- Category:Information operations and warfare
- Category:Forgery controversies
- Category:Internet manipulation and propaganda