Sony’s newest TV has its own AI brain to improve your viewing pleasure

WHY THIS MATTERS IN BRIEF

AI is increasingly being embedded into devices to give them new capabilities, and Sony’s latest TV extends that trend.

 


A lot is happening in the TV world: Sony released the first 16K TV, LG revealed the world’s first rollable TV, and Xiaomi unveiled the world’s first transparent TV. Now Sony has done it again, unveiling a new lineup of Bravia TVs that “mimic the human brain” to replicate how we see and hear. The devices use a new processing method that Sony calls “Cognitive Intelligence.”

 


 

The company says cognitive intelligence goes beyond standard Artificial Intelligence (AI) to create an immersive visual and sound experience: “While conventional AI can only detect and analyze picture elements like colour, contrast, and detail individually, the new Cognitive Processor XR can cross-analyze an array of elements at once, just as our brains do. By doing so, each element is adjusted to its best final outcome, in conjunction with each other, so everything is synchronized and lifelike — something that conventional AI cannot achieve.”

 

Courtesy: Sony

 

The processor divides the TV screen into zones and detects the focal point the viewer is most likely watching, then enhances it – much as Facebook’s latest Virtual Reality rendering technology, known as Foveated Rendering, renders only around 10 percent of the image at full quality while, as far as the viewer is concerned, the entire VR world still looks high definition. In short, not only does this technique reduce the processing power needed to generate great high-definition images, but, like some of the latest AI-based codecs, it also dramatically reduces the bandwidth needed to stream even the highest resolution content. Win-win.
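To make the zone idea concrete, here is a minimal sketch of dividing an image into a grid and picking the “busiest” zone as the likely focal point. This is a toy stand-in using local contrast as a saliency proxy – the function name, grid size, and scoring are my assumptions; Sony’s actual analysis is proprietary and far more sophisticated.

```python
import numpy as np

def find_focal_zone(image, grid=(4, 4)):
    """Split a grayscale image into a grid of zones and return the
    (row, col) of the zone with the highest variance, a crude proxy
    for where the detail (and likely the viewer's attention) is."""
    h, w = image.shape
    zh, zw = h // grid[0], w // grid[1]
    best, best_score = (0, 0), -1.0
    for r in range(grid[0]):
        for c in range(grid[1]):
            zone = image[r * zh:(r + 1) * zh, c * zw:(c + 1) * zw]
            score = zone.var()  # high variance ~ high contrast/detail
            if score > best_score:
                best, best_score = (r, c), score
    return best

# A flat image with one detailed patch: the detailed zone should win.
img = np.zeros((64, 64))
img[16:32, 32:48] = np.random.default_rng(0).random((16, 16))
print(find_focal_zone(img))  # -> (1, 2)
```

A real processor would run something like this per frame and then spend its enhancement budget on the winning region.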

 


 

Sony’s tech also analyzes sound positions in the signal to match the audio to the images on that part of the screen. In a video demo, Sony said the growing size of TVs means viewers now focus on parts of the screen rather than the entire image – just as we do when viewing the real world.
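The simplest form of matching audio position to screen position is stereo panning. The sketch below uses textbook constant-power panning to map a horizontal screen coordinate to left/right channel gains – purely illustrative, and in no way Sony’s actual object-based audio processing.

```python
import math

def pan_gains(x_norm):
    """Constant-power stereo panning: map a horizontal screen position
    x_norm in [0, 1] (0 = left edge, 1 = right edge) to (left, right)
    channel gains so total power L^2 + R^2 stays constant at 1."""
    angle = x_norm * math.pi / 2  # sweep from 0 to 90 degrees
    return math.cos(angle), math.sin(angle)

left, right = pan_gains(0.5)  # a source in the centre of the screen
# centred source: both channels equal, and L^2 + R^2 == 1
```

Scaling the audio for an on-screen object by these gains makes its sound appear to come from where it sits in the picture.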

 

 

“The human eye uses different resolutions when we are looking at the whole picture and when we are focusing on something specific,” said Yasuo Inoue, a Sony signal processing expert.

“The XR Processor analyzes the focal point and refers to that point as it processes the entire image to generate an image close to what a human sees.”
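The two-resolution idea Inoue describes – full detail at the focal point, less everywhere else – can be sketched as a simple foveation pass: keep a region around the focal point at native resolution and downsample the rest. The function and parameters below are my own illustration, not Sony’s implementation.

```python
import numpy as np

def foveate(image, center, radius, factor=4):
    """Keep a circular region around `center` at full resolution and
    replace everything else with a coarse, downsampled version,
    mimicking how the eye spends detail only where it is focused."""
    h, w = image.shape
    # Low-res version: downsample, then nearest-neighbour upsample back.
    low = image[::factor, ::factor]
    low = np.repeat(np.repeat(low, factor, axis=0), factor, axis=1)[:h, :w]
    yy, xx = np.mgrid[0:h, 0:w]
    mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2
    return np.where(mask, image, low)

rng = np.random.default_rng(1)
img = rng.random((64, 64))
out = foveate(img, center=(32, 32), radius=8)
# Pixels inside the focal circle are untouched; the periphery is blocky.
```

In a TV the goal is the reverse of saving work – the processor uses the focal point to decide where extra enhancement will actually be noticed.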

 


 

It’s impossible to tell how well the AI works without seeing the TVs in person, and if you want to test it out yourself you’ll likely need deep pockets. Pricing and availability for the new lineup will be announced in the spring.
