
Amazon’s Face Recognition Falsely Matched 28 Members of Congress With Mugshots


#1

Jacob Snow

Amazon’s face surveillance technology is the target of growing opposition nationwide, and today, there are 28 more causes for concern. In a test the ACLU recently conducted of the facial recognition tool, called “Rekognition,” the software incorrectly matched 28 members of Congress, identifying them as other people who have been arrested for a crime.

The members of Congress who were falsely matched with the mugshot database we used in the test include Republicans and Democrats, men and women, and legislators of all ages, from all across the country.


#2

“[P]rofound negative unintended (?) consequences”


#3

Washington County: home of Intel, Nike, and plenty of White Supremacists living on "the down low". It may vote Democratic, but the $$$ is all on the Republican side.
Phil Knight likes his black and brown sports stars on the TEEVEE, not so much on the sidewalks and streets of Washington County, apparently.
Hey Phil, "money doesn’t talk, it screams". Can you hear it screaming?


#4

I am sure it is a giant coincidence that the CIA invested considerably in Amazon shortly before they started developing drone and surveillance technologies.

Incidentally, the same is true for Google, which for some unrelated reason no longer has “don’t be evil” as a company motto.


#5

What do you mean falsely? If they were mugshots, then they obviously relate to members of Congress. I wouldn’t doubt that the majority of Congressional members have mugshots at one police station or another!


#6

Hell, most in Congress are criminals.


#7

What exactly is the point? We all look similar to someone else somewhere on the planet. No implication is made of any criminal connection. Were the faces of the Members of Congress indeed visually similar to their ‘criminal’ matches? To the extent that a panel of human viewers could well conclude that the two pictures were likely of the same person? If so, the technology did a good job (in a blind test, humans might have come to the same conclusion). If the matches are ‘obviously wrong’, it’s only a matter of time before the algorithms are better trained.
Don’t blame the technology; the issue is the potentially uncontrolled use: ‘guilty by association’.

(Stats note: the percentage of mugshots showing people of color also affects the probability of a ‘disproportionate’ match.)
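The stats note above can be made concrete with a toy calculation. This is a sketch only: the per-comparison false-match rate and the database size are hypothetical numbers chosen for illustration, not Rekognition’s actual figures or the ACLU’s test setup. The point is that even a matcher with an identical error rate for every group will generate more false matches against whichever group is over-represented in the mugshot database.

```python
# Assumed per-comparison false-match rate, identical for every group
# (hypothetical value for illustration).
FMR = 1e-4

# Assumed size of the mugshot database (the ACLU's test reportedly
# used 25,000 publicly available arrest photos).
DB_SIZE = 25_000

def expected_false_matches(group_share: float) -> float:
    """Expected number of false matches against database entries from a
    given group, for one probe photo searched against the whole database.

    group_share: the fraction of the database made up of that group.
    """
    return FMR * DB_SIZE * group_share

# If one group makes up 40% of mugshots but only 13% of the population,
# false matches implicate that group at roughly 3x its population share,
# even though the matcher treats every face identically.
print(expected_false_matches(0.40))  # ≈ 1.0 expected false matches
print(expected_false_matches(0.13))  # ≈ 0.325 expected false matches
```

The disproportion here comes entirely from the database’s composition; a per-group difference in the matcher’s error rate (which independent testing has also found in some systems) would compound it further.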


#8

Lock them up, lock them up!!


#9

What’s the point? The point is they will soon be using this shit technology to arrest people, right or wrong, and then you will have to prove it’s incorrect. It might not be a problem for you, but it’s a big problem for the poor.
No, I don’t blame technology; I blame the greedy MF’ers who invent this shit and want to get rich hurting their fellow man.


#10

Serious matter.

But, at least they know their congresspersons.