And the targets were largely people of colour.
The long, detailed story by the NYDN is well worth a read, so I will just provide a few details, which underline the fact that the people who run Google do not have any kind of moral compass – that is, if they ever had one.
Facial recognition technologies have been found to be less effective in identifying people who have darker skin.
People who worked for the project told the NYDN that they had been sent to target homeless people in Atlanta, unsuspecting students on campuses all over the US, and those attending the BET Awards in Los Angeles, among others.
GOOGLE MOTTOS: A HISTORY
1999: Don't Be Evil
2003: Try Your Hardest To Not Be Evil
2008: Make A Reasonable Effort To Avoid Being Evil
2013: What Is Evil, Really, When You Get Down To It, I Mean Really
2018: *just a series of high-pitched giggles*
— MGK Hockey 1234 (@mightygodking) 28 March 2018
The people who worked for the project are known as Google TVCs — temps, vendors or contractors — and were told to go after people of colour, hide the fact that their faces were being recorded and tell the occasional fib to improve their ability to collect more data.
Some of the TVCs were told to pretend it was a selfie game, in order to avoid telling the subjects that their images were being stored.
Google calls this "field research". A company that makes billions in profits by slurping data through both legal and illegal means has no qualms about paying people US$5 for the images. Its founders and all those in positions of power at the company appear to have lost the last vestiges of shame.
This is the same company that is seeking lenient treatment from the Federal Government over the online advertising practices that have crippled local media. And we have prominent local publications, like the Australian Financial Review, allowing Google to spin its way through.
It looks like a lack of shame is not limited to Google; journalists appear to be willing to sell their souls for a foreign junket.
The NYDN contacted Google for a response, which is included in the story. iTWire has contacted the company with a number of queries about what is a truly shameful practice.
The queries iTWire sent to Google were:
How does this tie in with the oft-stated Google mantra of treating people fairly?
Is the company so short of money that it cannot pay people a decent amount?
There are plenty of indications in the story that the workers who were hired by Google told lies in order to collect this data. What does the company have to say about this?
Have the people who run Google never heard of the human attribute called shame?
A Google spokesperson responded but avoided answering any of these queries. Instead, the spokesperson said: "We regularly conduct volunteer research studies. For recent studies involving the collection of face samples for machine learning training, there are two goals.
"First, we want to build fairness into Pixel 4’s face unlock feature. It’s critical we have a diverse sample, which is an important part of building an inclusive product. And second, security. Face unlock will be a powerful new security measure, and we want to make sure it protects as wide a range of people as possible."