
Designers take on facial recognition with adversarial fashion

Italian fashion designer Cap_able is the latest to claim that its use of adversarial images renders wearers of its clothes invisible to facial recognition systems, as reported by numerous outlets.

It is just the latest entry in a well-established trend: clothing items and accessories worn by precious few people around the world are introduced and marvelled over before graduating to aggregated internet lists.

“In a world where data is the new oil, Cap_able addresses the issue of privacy, opening the discussion on the importance of protecting against the misuse of biometric recognition cameras: a problem if neglected, could freeze the rights of the individual including freedom of expression, association and free movement in public spaces,” Cap_able Co-founder Rachelle Didero told Dezeen.

But do they work?

In the case of Cap_able, a representative of NtechLab reached out to Biometric Update to share videos showing that NtechLab’s algorithms can easily identify the people in the designer’s demonstration videos.

The designers tested their garments with the online object detection tool YOLO.
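
Reproducing a check like the one the designers describe is straightforward. The following is a minimal sketch, assuming the third-party ultralytics Python package and a hypothetical test photo named wearer.jpg (neither appears in the article, and the article does not say which YOLO version Cap_able used); it runs a pretrained YOLO model and reports whether a “person” is detected.

```python
# Minimal sketch of a YOLO-based check similar to the one described above.
# Assumes the third-party "ultralytics" package and a hypothetical test
# photo ("wearer.jpg"); the YOLO version and weights are assumptions,
# not details from the article.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")      # small pretrained COCO model
results = model("wearer.jpg")   # run detection on the test image

# Class labels YOLO assigned to whatever it detected in the frame.
detected = [model.names[int(cls)] for cls in results[0].boxes.cls]
print(detected)

# If the adversarial garment works against this particular detector,
# "person" should be absent or replaced by some other class.
print("person detected:", "person" in detected)
```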

“Face recognition software developed by NtechLab has successfully detected all the faces in the video provided by Cap_able, so we have contacted the Italian startup to assist its team in further tests,” writes NtechLab Communications Director Alexander Tomas. “All facial recognition algorithms work differently, so it will be difficult to come up with clothes that can evade several algorithms at once. We are always open to cooperation with companies that are ready to offer creative solutions to trick facial recognition technology.”

A pair of videos shared by NtechLab show face detection and facial recognition working on people wearing clothes from Cap_able.

Tomas’ point about algorithms working differently raises questions about how broadly adversarial images can be applied. Because Cap_able backed its claim of protection from biometric surveillance with a different kind of algorithm altogether, a general-purpose object detector rather than a face detection or recognition system, it remains an open question whether its designs work against any of the face detection and biometric systems actually deployed to security cameras in production.
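
To make that contrast concrete, the sketch below (again hypothetical, assuming the opencv-python package and the same imaginary test photo as above) runs a dedicated face detector instead of a general-purpose object detector. A pattern tuned to suppress YOLO’s “person” class would not necessarily affect a face detector at all, and commercial engines such as NtechLab’s use different models again.

```python
# Minimal sketch, assuming opencv-python and the same hypothetical test
# photo as above. A dedicated face detector looks only at the face region,
# so a pattern printed on clothing may not influence it at all.
import cv2

image = cv2.imread("wearer.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Classical Haar-cascade face detector bundled with OpenCV; commercial
# face recognition systems use different (deep-learning) models again.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

print(f"faces found: {len(faces)}")
```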

For professionals at face biometrics developers who read Biometric Update: does your algorithm identify people wearing adversarial designs from Cap_able? Please let us know on social media or in the comments below.

