Li joined Industrial Light & Magic / Lucasfilm in 2012 as a research lead to develop next-generation real-time performance capture technologies for virtual production and visual effects. He has been on the faculty of the Computer Science department at the University of Southern California since 2013, where he is currently an associate professor. In 2014, he spent a summer as a visiting professor at Weta Digital, advancing the facial tracking and hair digitization technologies used in the visual effects of Furious 7. In 2015, he founded Pinscreen, Inc., an artificial intelligence startup that specializes in the creation of photorealistic virtual avatars using advanced machine learning algorithms. In 2016, he was appointed director of the Vision and Graphics Lab at the USC Institute for Creative Technologies.
Research
He is best known for his work on dynamic geometry processing and data-driven techniques that make 3D human digitization and facial animation accessible to the masses. During his PhD, Li co-created the first real-time, markerless system for performance-driven facial animation based on depth sensors, which won the best paper award at the ACM SIGGRAPH / Eurographics Symposium on Computer Animation in 2009. The team later commercialized a variant of this technology as the facial animation software Faceshift. His technique for deformable shape registration is used by the company C-Rad AB and is widely deployed in hospitals for tracking tumors in real time during radiation therapy. In 2013, he introduced a home scanning system that uses a Kinect to capture people as game characters or realistic miniature figures. This technology was licensed by Artec and released as the free software Shapify.me. In 2014, he was brought on as a visiting professor at Weta Digital to build the high-fidelity facial performance capture pipeline for reenacting the deceased actor Paul Walker in the movie Furious 7.
His recent research focuses on combining techniques from deep learning and computer graphics to facilitate the creation of 3D avatars and to enable true immersive face-to-face communication and telepresence in virtual reality. In collaboration with Oculus / Facebook, he developed the first facial performance sensing head-mounted display in 2015, which allows users to transfer their facial expressions onto their digital avatars while immersed in a virtual environment. In the same year, he founded the company Pinscreen, Inc. in Los Angeles, which introduced a technology that can generate a realistic 3D avatar of a person, including the hair, from a single photograph. Its innovations include deep neural networks that can infer photorealistic faces and expressions, which were showcased at the Annual Meeting of the New Champions 2019 of the World Economic Forum in Dalian.
Due to the ease of generating and manipulating digital faces, Li has been raising public awareness about the threat of manipulated videos such as deepfakes. In 2019, Li and the media forensics expert Hany Farid of the University of California, Berkeley, released a research paper outlining a new method for spotting deepfakes by analyzing the facial expression and movement patterns of a specific person. As of September 2019, citing the rapid progress in artificial intelligence and computer graphics, Li predicted that deepfakes and genuine videos would become indistinguishable in as little as six to twelve months. In January 2020, Li spoke at the World Economic Forum Annual Meeting 2020 in Davos about deepfakes and the dangers they could pose to democracy and to vulnerable groups. Li and his team at Pinscreen, Inc. also demonstrated a real-time deepfake technology at the annual meeting, in which the faces of celebrities were superimposed onto participants' faces.
Awards
Office of Naval Research Young Investigator Award.
Andrew and Erna Viterbi Early Career Chair.
Okawa Foundation Research Grant.
Google Faculty Research Award.
World's top 35 innovators under 35 by MIT Technology Review.
Best Paper Award at the ACM SIGGRAPH / Eurographics Symposium on Computer Animation 2009.
Miscellaneous
For his technological contributions to visual effects, Li has been credited in major motion pictures, including Blade Runner 2049, Valerian and the City of a Thousand Planets, Furious 7, and Noah. Li has also appeared as himself in various documentaries on artificial intelligence and deepfakes, including BuzzFeed's Follow This in 2018, CBC's The Fifth Estate in 2018, and iHuman in 2019.