Digital Humans on the Big Screen
By Don Monroe
Communications of the ACM, August 2020, Vol. 63 No. 8, Pages 12-14, DOI: 10.1145/3403972
Artificial images have been around almost as long as movies. As computing power has grown and digital photography has become commonplace, special effects have increasingly been created digitally, and have become much more realistic as a result.
ACM's Turing Award for 2019 to Patrick M. Hanrahan and Edwin E. Catmull reflected in part their contributions to computer-generated imagery (CGI), notably at the pioneering animation company Pixar.
CGI is best known in science fiction or other fantastic settings, where audiences presumably have already suspended their disbelief. Similarly, exotic creatures can be compelling when they display even primitive human facial expressions. Increasingly, however, CGI is used to save time and money on extras or sets, even in mundane scenes for dramatic movies.
To represent principal characters, however, filmmakers must contend with our fine-tuned sensitivity to facial expressions. Falling short can leave viewers in the "uncanny valley," distracted or even repulsed by a zombie-like representation. "Trying to do realistic humans is still the most difficult aspect of visual effects," said Craig Barron, creative director at visual development and experience company Magnopus. Barron shared the 2008 Academy Award for Best Visual Effects for The Curious Case of Benjamin Button, in which the title character ages backward from an old man to an infant.
In the last decade, many films have included short flashbacks with younger versions of their characters. Within the last year, however, some films have used new techniques to create feature-length performances by convincingly "de-aged" actors. Artificial intelligence also increasingly will augment the labor-intensive effects-generation process, allowing filmmakers to tell new types of stories.