This new Chinese camera protects you from unauthorised facial recognition

WHY THIS MATTERS IN BRIEF

Increasingly, you are being identified by your biometrics everywhere you go, both online and offline, so this camera puts a new twist on personal privacy.


Love the Exponential Future? Join our XPotential Community, future proof yourself with courses from XPotential University, read about exponential tech and trends, connect, watch a keynote, or browse my blog.

Facial recognition is a technology that identifies or verifies a person’s identity from their face. It can be used for many purposes, such as unlocking smartphones, verifying identities at airports, or finding missing persons. However, facial recognition also poses a serious threat to personal privacy, because it can be used to track, monitor, or profile people without their consent or knowledge. For example, some governments or companies may use facial recognition to spy on citizens, customers, or competitors, or to collect and sell their data.

So, to protect our facial privacy, some researchers have proposed different methods to prevent facial recognition from working. These methods, collectively called Anti-Facial Recognition (AFR), aim to hide, distort, or replace the faces in images or videos. For instance, some AFR methods use masks, makeup, glasses, or hats to cover or alter facial features. Other AFR methods use software to blur, pixelate, or swap the faces in digital media.
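As a minimal sketch of the software-based approach, the snippet below pixelates a face region by averaging small tiles, assuming the face bounding box has already been found by a detector (the function name and box format are illustrative, not from any specific AFR tool):

```python
import numpy as np

def pixelate_region(image: np.ndarray, box: tuple, block: int = 16) -> np.ndarray:
    """Pixelate a rectangular region (x, y, w, h) of an H x W x 3 image
    by replacing each block x block tile with its average color - a
    simple software-based AFR step."""
    x, y, w, h = box
    out = image.copy()
    face = out[y:y + h, x:x + w]  # view into the copy, edits land in `out`
    for by in range(0, h, block):
        for bx in range(0, w, block):
            tile = face[by:by + block, bx:bx + block]
            tile[...] = tile.mean(axis=(0, 1), keepdims=True).astype(image.dtype)
    return out
```

Tile averaging destroys the fine texture that recognition models key on, while leaving the rest of the frame untouched for sharing or other uses.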

The Future of Privacy 2030, by Keynote Speaker Matthew Griffin

However, most existing AFR methods have limitations. Masks or makeup may be uncomfortable or inconvenient, they may fail under different lighting conditions or viewing angles, and governments have been developing facial recognition systems that work even when you’re wearing a mask. Software-based AFR methods may require user intervention or cooperation and may not be compatible with all devices or platforms. Moreover, they can degrade the quality or usability of the images or videos for other purposes, such as activity recognition or social sharing.

To overcome these challenges, a team of researchers from USSLAB at Zhejiang University has developed a new AFR technique called CamPro, which stands for Camera Privacy Protection. CamPro is a camera-sensor-based AFR technique that modifies the images at the source – the camera sensor itself – rather than after they are captured. This makes CamPro more effective and robust against malicious attempts to bypass it. The researchers call this approach “privacy-preserving by birth.”

CamPro exploits the tunable parameters of the Image Signal Processor (ISP), the hardware component that converts raw data from the camera sensor into a standard image format. The ISP usually exposes adjustable settings that affect an image’s color, brightness, contrast, sharpness, and noise. By carefully tuning these settings, CamPro can produce images that facial recognition systems cannot read, while retaining enough information for other applications, such as person detection or pose estimation.
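To make the idea concrete, here is a toy model of that ISP stage, assuming just two tunable parameters: a 3x3 color-correction matrix and a gamma curve. These stand in for the kind of settings the researchers tune; the actual parameters CamPro optimises, and how it searches for them, are not reproduced here:

```python
import numpy as np

def toy_isp(raw: np.ndarray, ccm: np.ndarray, gamma: float) -> np.ndarray:
    """Toy ISP stage: color-correction matrix followed by a gamma tone curve.

    `raw` is an H x W x 3 array of linear sensor values in [0, 1]. The
    parameters here are illustrative stand-ins for the tunable ISP settings
    described in the article, not the paper's actual optimisation targets.
    """
    corrected = np.clip(raw @ ccm.T, 0.0, 1.0)    # per-pixel 3x3 color transform
    return np.clip(corrected ** gamma, 0.0, 1.0)  # non-linear tone mapping

rng = np.random.default_rng(1)
raw = rng.random((4, 4, 3))

# A conventional setting renders the scene faithfully...
normal = toy_isp(raw, np.eye(3), gamma=1 / 2.2)

# ...while an adversarial setting (channel mixing plus an extreme gamma)
# crushes the mid-tone texture and color contrast that face models rely on,
# even though coarse silhouettes can still survive for person detection.
private = toy_isp(raw, np.full((3, 3), 1 / 3), gamma=8.0)
```

The point of the sketch is only that ISP settings alone, applied before any image leaves the sensor pipeline, can radically change what downstream recognition models see.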

The researchers tested CamPro on several widely available cameras, including smartphones, webcams, and surveillance cameras, and showed that it can defeat a range of state-of-the-art facial recognition models. They also demonstrated that CamPro preserves the functionality of other computer vision applications, such as activity recognition and object detection, with minimal degradation. Furthermore, CamPro requires no additional hardware or software and can be easily integrated into existing camera modules.

However, CamPro also has limitations. It may not work with cameras whose ISPs are fixed or non-tunable, or with low-quality sensors or ISPs. It may also introduce visual artifacts or distortions that affect how the images look to users. Moreover, it cannot protect users against identification from other biometric signals, such as voice or gait.

The researchers plan to further improve CamPro by optimizing its performance, compatibility, and usability, and they hope to collaborate with camera manufacturers and developers to bring CamPro to more devices and platforms. They believe CamPro offers a novel and practical solution for facial privacy protection and could inspire further research and innovation in the field of AFR. They also hope it will raise awareness of, and demand for, facial privacy among the public and the industry, and contribute to the development of ethical and responsible facial recognition technology.
