SEC Allows Shareholder Votes on Amazon Facial “Rekognition”

Amazon shareholders will get the opportunity to vote on two non-binding shareholders’ resolutions concerning the Amazon Rekognition facial recognition system.

Amazon sought to exclude the resolutions from its upcoming annual meeting (usually held in May) but was informally told by the SEC that it could not do so. Amazon appealed, but on 3 April 2019 the SEC responded, “we find no basis to reconsider our position.” The votes will go ahead.

The two resolutions are couched in terms of a business threat to Amazon from sales of Rekognition, but are primarily concerned with civil liberties issues. The first resolution calls for a halt to sales of the product to government (that is, law enforcement) unless the board “concludes the technology does not pose actual or potential civil and human rights risk.” The second calls for an independent study into whether the technology may “endanger, threaten, or violate” privacy or civil rights.

Rekognition has been a controversial product for much of its life since being launched in 2016. In 2018, the ACLU — supported by numerous civil liberties groups — wrote to Amazon CEO Jeff Bezos demanding that the product not be sold to government agencies: “This product poses a grave threat to communities, including people of color and immigrants, and to the trust and respect Amazon has worked to build. Amazon must act swiftly to stand up for civil rights and civil liberties, including those of its own customers, and take Rekognition off the table for governments.”

On its website, Amazon describes Rekognition in positive terms. “Amazon Rekognition also provides highly accurate facial analysis and facial recognition. You can detect, analyze, and compare faces for a wide variety of use cases, including user verification, cataloging, people counting, and public safety.”
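
For readers unfamiliar with the service, Rekognition is exposed through the standard AWS SDKs. The minimal sketch below, written against Python’s boto3 library with hypothetical bucket and object names, shows the general shape of the face-comparison calls Amazon describes above; it illustrates the API surface only, not any particular law enforcement deployment.

    # Minimal sketch of a Rekognition face-comparison request via boto3.
    # The bucket and object names are hypothetical placeholders.
    import boto3

    client = boto3.client("rekognition")

    response = client.compare_faces(
        SourceImage={"S3Object": {"Bucket": "example-bucket", "Name": "probe.jpg"}},
        TargetImage={"S3Object": {"Bucket": "example-bucket", "Name": "gallery.jpg"}},
        SimilarityThreshold=90,
    )

    for match in response.get("FaceMatches", []):
        print(f"Face matched with similarity {match['Similarity']:.1f}%")

The concerns raised by critics turn not on calls like this, but on how reliable the resulting similarity and classification scores are across demographic groups, and on how agencies act on them.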

Critics question the claim of “highly accurate facial analysis and facial recognition.” An academic study conducted in 2018 examined three commercial gender classification algorithms — including Rekognition — and found a marked difference in accuracy based on skin color. “We found that all classifiers performed best for lighter individuals and males overall. The classifiers performed worst for darker females.” This is the basis of the civil rights concerns — that facial recognition is currently biased against women of color. More specifically, the research found a far higher error rate when classifying the gender of darker-skinned women than of lighter-skinned men (31% versus 0%).

The New York Times followed up on this study with an article noting that two of the three systems tested responded by quickly releasing more accurate technology. But not Amazon. When the researchers retested Rekognition several months later, there was no improvement. Amazon’s response came in the form of blog posts.

In a blog post titled ‘Thoughts on Recent Research Paper and Associated Article on Amazon Rekognition’ (26 January 2019), Dr Matt Wood commented, “This research paper and article are misleading and draw false conclusions.” He added, “To date (over two years after releasing the service), we have had no reported law enforcement misuses of Amazon Rekognition.”

But the debate continues. An open letter on Medium, currently signed by 73 ‘concerned researchers’, supports the academic research and criticizes some of Dr Wood’s statements. The concerned researchers conclude with, “We call on Amazon to stop selling Rekognition to law enforcement as legislation and safeguards to prevent misuse are not in place.”

Concern over the growing use of facial recognition technologies is not limited to Rekognition. The Washington Privacy Act, introduced on January 18, 2019, requires organizations that deploy facial recognition services to obtain consumer consent, and prohibits state and local government agencies from using facial recognition for the surveillance of specific individuals in public places.

New York City introduced a law to amend the city’s administrative code by defining certain rules for the collection of biometric data, including facial biometrics. It requires all companies using such identifiers to disclose, both physically at an establishment’s entrance and online on a website, how long such data is retained and whether it is shared with any third party (excluding government agencies).

Now Amazon shareholders will be able to express their own opinion on sales of Rekognition. The result of the votes will not be binding on the Board, but could certainly cause embarrassment if the shareholders vote down the Board’s current position.

“This is just another example of the leading internet social and commerce properties being held to a level of ‘citizenship’ that did not exist in the past,” David Ginsburg, VP of marketing at Cavirin, told SecurityWeek. “If you look at the EU’s moves with anti-trust and AI, the UK with social network misuse, and even in the US with Google and Microsoft being held to task regarding government contracts, not to mention moves within Congress for national privacy regulation, the wild-west is no longer.”

Related: As Facial Recognition Use Grows, So Do Privacy Fears 

Related: Windows Hello Face Recognition Tricked by Photo 

Related: Huge US Facial Recognition Database Flawed: Audit 

Related: The Impending Facial Recognition Singularity 
