GDPR

GDPR Versus (Traditional) UX

Often, corporate entities hail user experience (“UX”) as an essential product feature. In fast-evolving tech markets, many believe, it is the Web tool with the smoothest ride—the most frictionless UX—that absorbs and retains the most users. As a result, many platforms place a heavy premium on minimizing the steps between what the user wants and what the user gets. The fewer pages, options, or hoops to jump through in between, the better.

The General Data Protection Regulation (“GDPR”) purposefully disrupts this strategy.

Easily the most significant data privacy regulation of the last 20 years, the GDPR, whose compliance deadline is May 25, revolutionizes the way organizations must handle consumer information. Pivotally, the European Union (“EU”) law requires any entity that collects, monitors, or targets EU residents’ data to provide the subjects of that data with broad access to and control of their information. The GDPR further requires covered entities to report data security breaches to local regulators; no longer is doing so merely a “best practice.” Perhaps the GDPR’s most monumental edict, however, lies in its muscle: entities that violate the GDPR’s strict provisions are liable for fines of up to €20 million or 4% of annual global turnover—whichever is greater. (For a company with, say, €10 billion in annual turnover, that 4% translates to a potential €400 million fine.)

The GDPR’s purpose is no secret. It is intended to disrupt monolithic data companies such as Google and Facebook, forcing them to boost their privacy and security practices to a level that EU regulators believe adequately protects the consumers who provide the endless data such companies peddle.

So: with UX on one side and increasingly complex data consent, access, and control requirements on the other, what will mega-data companies do?

On April 18, Facebook invited a host of journalists to its new Building 23 at the social media giant’s Menlo Park HQ. There, Facebook revealed its GDPR compliance plan to the reporters. And the reporters were, reportedly, underwhelmed. Their chief criticisms:

- Facebook’s user consent prompt is placed beneath an “X” in a “big blue button.” This “X” prominently invites users to skip the prompt, and with it the legal, personal consent over their information that the GDPR requires.

- Pages describing Facebook’s control of sensitive information—information at the crux of financial value, personal privacy, and privileged knowledge, such as sexual orientation and religious and political views—feature an “Accept And Continue” button in “pretty blue” and an “ugly gray” “Manage Data Settings” button. The former, which defaults to Facebook’s preferences, is selectable whether or not the user scrolls through the rules. This crucial page is “obviously designed to get users to breeze through it by offering no resistance to continue, but friction if you want to make changes.”

- In the U.S., user interactions with political groups and events pages trigger each user’s placement in “overarching personality categories” that Facebook sells to advertisers. The only way to opt out is to “remove any info you’ve shared in these categories.”

- Global facial recognition is enabled by default.

- To reject Facebook’s Terms of Service, users must locate a “see your options” hyperlink that is “tiny” and “isn’t even a button.” (The “I Accept” button, however, is “big.”) This “see your options” link leads to a “scary permanent delete” button and “another tiny ‘I’m ready to delete my account’ hyperlink.” A user who selects this option but wants to download their data first faces a process that can take hours. And the downloaded data’s portability has significant holes.

- Facebook may not collect sexual or political data from users between 13 and 15 years old, or serve them ads, unless the child obtains parental consent. The child can obtain this consent by providing Facebook with an email address; the owner of that address then grants consent via email. No further controls aim to verify that the email’s owner is actually the child’s parent or guardian.

In sum: Instead of scaling back on UX to ensure that users—i.e., the providers and, per the GDPR, the proprietors of that data—understand what data Facebook elicits from them, how Facebook uses that data, and how they can control both of these processes, Facebook squeezed the GDPR’s requirements into its longstanding UX-first template.

GDPR or not, Facebook still “pushes” users “to speed through giving consent…with a design that encourages rapidly hitting the ‘Agree’ button.” Its platform “makes accepting the updates much easier than reviewing or changing them.”

Facebook and companies of its data-caliber made their bones on smooth UX. This methodology founded the bonds between users and these companies’ platforms, underwriting their success. But in an effort to continuously smooth users’ ride, UX-optimizers glossed over some weighty details. By enabling—read: training—users to hit “Agree” without reading the terms and conditions governing the services at play, data propagators obfuscated the true cost-benefit analysis underlying their products. They deprived users of a reasonable opportunity to make an informed decision about whether they could responsibly press “Post.”

The Cambridge Analytica/Facebook controversy is the latest indicator of this dissonant status quo. On April 10 and 11, Facebook CEO Mark Zuckerberg apologized to the Internet-surfing world for his company’s untrustworthy custodianship of user data. On his watch, political marketers scraped user data, aggregated it, and built a media machine of epic proportions and historic effectiveness.

In the wake of this scandal, Internet searches for “delete Facebook” reached a five-year high. This compounded a troubling trend for Facebook at the close of 2017, when the company lost daily users in the U.S. and Canada for the first time ever. And after the U.S. Federal Trade Commission confirmed its investigation of the company, Facebook’s stock dropped precipitously, shedding over $100 billion in value to match its lowest point since mid-2017.

The Cambridge Analytica revelations spotlighted yet again the reality of social media and many other online platforms: if you use them, your data may be forfeit. From Snowden to Yahoo to Uber to Target, this is not a new lesson for consumers who find themselves increasingly aware of the shady marketability of their data.

Aleksandr Kogan, the psychology professor hired by Cambridge Analytica to scrape millions of Facebook users’ profiles, agrees. He noted recently that users’ awareness that their data is improperly traded was a “core idea” underlying Cambridge Analytica’s practices—with a twist.

“Everybody knows,” Kogan said he and Cambridge Analytica believed, “and nobody cares.”

Now, post-fracas, Kogan believes the latter part of this theory was “wrong.” People not only know how their data is manipulated, but they care, too.

This uptick in user cognizance provides a pivotal impetus for Facebook, Google, and other blue-chip data stores to leave superficial UX, made of bubble letters and candy-colored buttons, behind. To invest in true UX via true transparency. To place a premium on educating their users on the inner workings of the relationship between human and platform. To smooth UX not by shrouding choice, but by building trust.

That is, after all, the new preferred experience.

Otherwise, regardless of Mr. Zuckerberg’s congressional apologies, Prof. Kogan’s revisions, and whether the GDPR’s impending fines are as damning as planned, users now know what happens to their data. Who is misusing it. And, UX or not, what to do about it.

Update: Cambridge Analytica announced on May 2 that it will file for bankruptcy. Its Facebook controversy has "driven away virtually all of the company’s customers."