
Meet Lia, Soul Machines’ scarily lifelike digital human that reacts to your emotions


As AI and avatars improve over time, they’ll be better at understanding your emotions than your partner is.



Have you ever dreamt of gazing into the eyes of a chatbot or avatar while explaining your frustration with a company’s product or service? Yes, me too, and now you’re in luck thanks to Soul Machines, an ambitious New Zealand startup that has been creating so-called digital humans, like Will, who has now taught over 250,000 children about renewable energy, and Eva, who now works multiple jobs as a customer service advisor, both at Daimler and at NatWest Bank in the UK, where she advises customers about mortgages.




For its next act, though, the company, which is behind some of the most advanced digital humans anywhere, including Baby X, one of the most realistic digital babies on the planet, complete with its own neural network brain, has created an avatar that can not only portray human emotions but also read human facial expressions. With it, Soul Machines closes in on its goal of taking chatbot services to the next level by “helping humanise the interaction between man and machine” – basically, by making them more realistic and human-like. And, as those projects show, they’re succeeding.


A quick preview of Lia

The digital human in this case, called Lia, uses the webcams in laptops and other devices to see the users she’s interacting with and get a better sense of their mood and emotions. Soul Machines calls this “Emotional Intelligence,” or EI, which allows avatars like Lia to connect with people on a more subtle emotional level.




And just like AI, EI can learn through experience. The more Lia interacts with real people, the better she gets at reading their emotions. If a user’s tone or facial expression changes, Lia can pick up on that and adjust her answers to better fit the user’s emotional state.
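To make that loop concrete, here is a minimal, purely illustrative Python sketch of the idea: a detected emotion drives the style of the avatar’s reply. The emotion labels, response styles, and function names are assumptions for illustration only, not Soul Machines’ actual API.

```python
# Hypothetical sketch: adapting an avatar's reply to a detected emotion.
# The labels and styles below are illustrative assumptions, not a real API.

RESPONSE_STYLES = {
    "frustrated": "apologetic",    # acknowledge the problem first
    "confused": "step-by-step",    # slow down and simplify
    "happy": "concise",            # keep it brief and upbeat
}

def adjust_reply(base_reply: str, detected_emotion: str) -> str:
    """Pick a tone for the reply based on the user's detected emotion."""
    style = RESPONSE_STYLES.get(detected_emotion, "neutral")
    if style == "apologetic":
        return "I'm sorry this has been frustrating. " + base_reply
    if style == "step-by-step":
        return "Let's take this one step at a time. " + base_reply
    return base_reply

print(adjust_reply("Your claim is being processed.", "frustrated"))
```

In a real system like the one described here, the `detected_emotion` input would come from a facial-expression and voice-tone classifier running on the webcam feed, and the adjustment would go far beyond a prefix, but the feedback structure is the same.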

The brain behind Lia is Mark Sagar, a professor at the University of Auckland and CEO of Soul Machines. Sagar is no novice when it comes to realistic facial animation, and has even won Academy Awards for his work on facial motion capture techniques in Avatar and King Kong.

Sagar’s expertise makes Lia incredibly lifelike, almost to the point where it gets kind of creepy. But fear not, Lia is actually pretty nice. She doesn’t want to start a robot rebellion or kill all humans. Instead, she has dedicated her “life” to helping people with disabilities.




Lia, who is voiced by none other than the amazingly talented Cate Blanchett, was developed for the Australian government to improve services for people with disabilities. Lia helps users access the National Disability Insurance Scheme (NDIS), find the information they need, and have a better experience of the system overall.

People with disabilities often have a hard time dealing with the bureaucracy of the NDIS because of its complexity and slowness. Lia was created to fix all that and make a more human service available anytime and anywhere, and the possibilities of a project of this nature are virtually endless, so you can expect a lot more from Lia and her digital colleagues in the future.
