Five consumer privacy groups have asked the European Data Protection Board to investigate how Facebook, Google, and Microsoft design their software to see whether it complies with the General Data Protection Regulation (GDPR).
Essentially, the tech giants are accused of crafting their user interfaces so that netizens are fooled into clicking away their privacy, and handing over their personal information.
In a letter sent today to chairwoman Andrea Jelinek, the BEUC (Bureau Européen des Unions de Consommateurs), the Norwegian Consumer Council (Forbrukerrådet), Consumers International, Privacy International and ANEC (just too damn long to spell out) contend that the three tech giants “employed numerous tricks and tactics to nudge or push consumers toward giving consent to sharing as much data for as many purposes as possible.”
The letter coincides with the publication of a Forbrukerrådet report, “Deceived By Design,” which claims “tech companies use dark patterns to discourage us from exercising our rights to privacy.”
Dark patterns, in this context, are app and website interface design choices that try to steer users into doing things they might not otherwise choose to do, to the benefit of the software maker.
The report faults Google, Facebook and, to a lesser degree, Microsoft for employing privacy-unfriendly default settings. It also says they use misleading language, give users an illusion of control, conceal pro-privacy choices, offer take-it-or-leave-it choices, and use design patterns that make choosing privacy more laborious.
It argues that dark patterns deprive users of control, a central requirement under GDPR.
As an example of linguistic deception, the report cites Facebook text that seeks permission to use facial recognition on images:
If you keep face recognition turned off, we won’t be able to use this technology if a stranger uses your photo to impersonate you. If someone uses a screen reader, they won’t be told when you’re in a photo unless you’re tagged.
The way this is worded, the report says, pushes Facebook users to accept facial recognition by suggesting there’s a risk of impersonation if they refuse. It also guilt-trips them by implying that declining would deprive people who rely on screen readers of knowing when they appear in photos, a tactic known as “confirmshaming.”