
California Consumer Privacy Act of 2018

On June 28, 2018, California (“CA”) Governor Jerry Brown signed the California Consumer Privacy Act of 2018 (“Act”) into law. The Act zeroes in on the personal information (“PI”) of CA residents, at once: (i) formalizing consumers’ rights regarding their own PI, and (ii) mandating what certain businesses may and may not do (sans permission) if they collect, disclose, or sell such info. Like the recently effective GDPR—and the Internet itself—the Act reaches far beyond its ostensible borders. Its implications should therefore be tracked by any covered entity dealing in PI, as the Act defines it.

This post summarizes certain key aspects of the Act: namely, its rights, requirements, and the entities beholden to both.

Effective Date

The Act will be effective January 1, 2020. §1798.198(a). Until then, the CA legislature will likely rethink, refine, and amend it. §1798.185(a). While getting a head-start on Act-literacy is wise, keeping an eye on its evolution is key.

Covered Businesses

The Act’s requirements fall primarily upon “businesses,” which are defined as:

For-profit legal entities that:

-          collect consumers’ PI (or have PI collected on their behalf);

-          alone or jointly determine the purposes and means of PI processing;

-          do business in CA; and

-          meet at least one of the following thresholds:

o   have annual gross revenues over $25M;

o   alone or in combination, annually buy, sell, or share for commercial purposes the PI of 50K or more consumers, households, or devices; or

o   derive 50% or more of their annual revenues from selling PI. §1798.140(c)(1).

The Act's covered businesses also include entities that (i) control or are controlled by, and (ii) share common branding (e.g. name, trademark) with, the above businesses. §1798.140(c)(2).

Covered Data

The key phrase here is “personal information.” The Act defines PI as “information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.” §1798.140(o)(1).

The Act provides a non-exhaustive list of PI examples, including: names, SSNs, biometrics, personal property records, records of products considered or purchased, browsing history, geolocation data, visual, thermal, or olfactory data (Note: if you know what "olfactory data" entails, let us know!), education info, and any inferences drawn from these and the other listed data points. Id. Exception: publicly available info, as defined by the Act, is not PI. §1798.140(o)(2).

Consumers and Rights

The Act grants “consumers”—defined as natural persons who are CA residents, §1798.140(g)—distinct rights pertaining to the handling of their PI, including the following.

(1) The right to know what PI each business collects.

Thanks to this right, when requested by a consumer, a business must disclose to that consumer promptly (i.e. generally within 45 days of receipt of the request) and free of charge: (i) the categories of PI collected; (ii) the specific pieces of PI collected; (iii) the categories of sources from which the business collected that consumer's PI; and (iv) the categories of third parties with which the business shares that PI. §§1798.100, 1798.110.

Also, at or before the point of collection, a business must inform consumers of: (i) the categories of PI collected, and (ii) how the PI will be used. Additional collection or use is prohibited sans this notification. §1798.110.

Exceptions: Businesses are not required to disclose to consumers any unsold or un-retained PI collected for one-time transactions. Nor is a business required to re-identify or otherwise link data that it does not maintain as PI in the ordinary course of business in order to disclose that PI. §§1798.100, 1798.110.

(2) The right to request the deletion of their PI.

When requested by a consumer, a business must delete that consumer's PI and direct its service providers (defined at  §1798.140(v)) to do the same. §1798.105(c). Exceptions include where the PI is necessary to: (i) perform a contract with the consumer, (ii) detect security incidents, (iii) debug, (iv) exercise a lawful right, (v) comply with certain Penal Code or other legal requirements, (vi) conduct public interest research, or (vii) otherwise use the PI “internally, in a lawful manner that is compatible with the context in which the consumer provided the information.” §1798.105(d).

(3) The right to know whether their PI is sold or disclosed, and to whom.

Upon request by a consumer, businesses that sell PI or otherwise disclose it for a business purpose must disclose to that consumer, essentially, the categories of (i) PI they collected, disclosed for a business purpose, and sold; and (ii) each third party to whom they sold the PI. §1798.115. If a business hasn’t sold the requesting consumer’s PI, such business must disclose that fact. Id.

(4) The right to prohibit—i.e. “opt out” of—the sale of their PI.

A consumer may at any time direct a business not to sell that consumer’s PI. §1798.120. Businesses that sell consumer PI must notify consumers that their PI may be sold, and of this opt-out right. Id. Without this notification, a business is prohibited from selling the affected PI. Id. Also, should a business receive a consumer’s opt-out, such business is prohibited from selling that consumer's PI. Id. That is, unless the consumer subsequently opts back in via an express authorization. Id. Stricter rules (e.g. a requirement that consumers “opt in” to allow their PI’s sale in the first place) apply for certain teenagers’ PI. §1798.120(d).

To comply with this requirement, businesses must provide a clear and conspicuous link on their homepage, titled “Do Not Sell My Personal Information,” that takes consumers to an opt-out page. §1798.135. The same link must also appear in the business’s privacy policy, along with a description of consumers’ rights to prohibit the sale of their PI. Id. Exception: where a business maintains a separate, additional website for its CA consumers, it is permissible for these links to appear only on this CA-centric site, as long as the business “takes reasonable steps to ensure that California consumers are directed to the homepage for California consumers and not the homepage made available to the public generally.” Id.

Moreover, a business may not require a consumer to create an account in order to direct the business not to sell the consumer’s PI. Id.

Bonus prohibition: third parties may not sell PI that they bought from a business unless the relevant consumer: (i) has received explicit notice that their PI may be sold; and (ii) has had a chance to opt out. §1798.115(d).
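To make the opt-out mechanics concrete, here is a minimal sketch in Python of the decision a business faces before selling a consumer’s PI under §1798.120. It is an illustration of the rules summarized above, not legal advice or a compliance implementation; the field names are ours, and the minor-consent thresholds reflect our reading of §1798.120(d).

```python
from dataclasses import dataclass

@dataclass
class Consumer:
    age: int
    received_sale_notice: bool  # notified that PI may be sold, and of the opt-out right
    opted_out: bool             # directed the business not to sell their PI
    opted_back_in: bool         # later expressly re-authorized the sale
    affirmative_opt_in: bool    # opt-in consent on file (parent/guardian if under 13)

def may_sell_pi(c: Consumer) -> bool:
    """Rough decision logic, per our summary of sec. 1798.120 above."""
    # Without the required notice, the business may not sell the PI at all.
    if not c.received_sale_notice:
        return False
    # Minors must opt IN before any sale (the "stricter rules" noted above).
    if c.age < 16:
        return c.affirmative_opt_in
    # An opt-out bars the sale unless the consumer expressly opted back in.
    if c.opted_out and not c.opted_back_in:
        return False
    return True
```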

(5) The right to equal service and price, even if they exercise their privacy rights under the Act.

A business may not discriminate against consumers because they have exercised any of the above rights, including by denying services to such consumers or charging them different rates. §1798.125. (While the Act provides an exception to this rule, allowing businesses to offer different rates or quality of goods or services to customers “if that price or difference is directly related to the value provided to the consumer by the consumer’s data,” id., the opacity of this exception requires further assessment.)

Finally, businesses may offer financial incentives to consumers in exchange for the collection, sale, or deletion of their PI—as long as: (i) the businesses notify consumers of these incentives; (ii) the relevant consumers opt into this arrangement, which consent is revocable anytime; and (iii) the incentive practices are not unjust, unreasonable, coercive, or usurious. Id.

Penalties and Procedures

If a business fails to implement and maintain reasonable security practices appropriate to the nature of the PI, and this failure results in a consumer’s nonencrypted or nonredacted PI being accessed and exfiltrated, stolen, or disclosed in an unauthorized manner, such consumer(s) (individually or as a class) may commence a civil action for: (i) the greater of (a) up to $750 in damages per consumer, per incident, and (b) actual damages; (ii) injunctive or declaratory relief; and (iii) any other relief per the court’s discretion. §1798.150. The Act further provides the factors for the court’s consideration in assessing statutory damages. Id.
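For a rough sense of the exposure, here is a back-of-the-envelope sketch in Python of the greater-of formula above. It is illustrative only; the function and parameter names are our own, and the statute caps (rather than fixes) statutory damages at $750 per consumer per incident.

```python
def breach_recovery_ceiling(num_consumers: int, num_incidents: int,
                            actual_damages: float) -> float:
    """Greater of (a) statutory damages, up to $750 per consumer per incident,
    and (b) actual damages, per the sec. 1798.150 formula summarized above."""
    statutory_ceiling = 750.0 * num_consumers * num_incidents
    return max(statutory_ceiling, actual_damages)

# A class of 10,000 consumers, one incident, negligible provable losses:
# breach_recovery_ceiling(10_000, 1, actual_damages=0.0) returns 7,500,000.0
```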

Consumers have a cause of action for general Act violations—and the statutory damages that may follow—as well, subject to the Act's dispute resolution procedures. Id. A consumer must notify the business 30 days before initiating their action, identifying the allegedly violated provisions of the Act. Id. If the business cures within this period, providing an “express written statement” to this effect (which statement is enforceable), no action may be brought concerning that cured matter. Id. Exception: no notice is required for an action for actual pecuniary damages. Id.

If 30 days pass without cure, a business is in violation of the Act. §1798.155.

A consumer must also notify the Attorney General (“AG”) within 30 days of filing an action for statutory damages under the Act. §1798.150. Within 30 days following receipt of this notice, the AG must either: (i) notify the consumer of the AG’s intent to prosecute, in which case the consumer may not proceed with their action (however, if the AG doesn’t prosecute within 6 months, the consumer may proceed with their action); or (ii) notify the consumer that they may not proceed with their action. Id. If the AG does nothing within these 30 days, the consumer may proceed. Id.
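The notice choreography above reduces to a small decision flow. The sketch below (in Python; the enum and its names are ours) is just a reading aid for the 30-day and 6-month branches, not a statement of procedure.

```python
from enum import Enum, auto

class AGResponse(Enum):
    INTENT_TO_PROSECUTE = auto()  # AG notifies the consumer it will prosecute
    BAR_ACTION = auto()           # AG notifies the consumer not to proceed
    SILENCE = auto()              # AG does nothing within the 30 days

def consumer_may_proceed(response: AGResponse,
                         ag_prosecuted_within_6_months: bool = False) -> bool:
    """May the consumer's statutory-damages action go forward?"""
    if response is AGResponse.BAR_ACTION:
        return False
    if response is AGResponse.INTENT_TO_PROSECUTE:
        # Consumer is sidelined unless the AG fails to prosecute in 6 months.
        return not ag_prosecuted_within_6_months
    return True  # AG silence: the consumer may proceed
```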

Any person, business, or service provider who intentionally violates the Act may be liable for a civil penalty of up to seven thousand five hundred dollars ($7,500) for each violation. §1798.155(b).

Miscellaneous Requirements and Exclusions

-          Businesses must provide at least two methods by which consumers may make the requests for info about their PI detailed above, e.g. a phone number and web address. §1798.130.

-          The 45-day deadline for a response to a consumer request for info about their PI may be extended once by a business for another 45 days when reasonably necessary, provided the relevant consumer is notified of the extension within the initial 45 days. Id. A 90-day extension is also available based on the complexity and number of requests a business receives, as are exceptions to, and even payment terms concerning, this obligation. §1798.145(g).

-          Businesses' PI disclosures must cover the 12-month period preceding receipt of the consumer’s request. §1798.130.

-          Businesses must include in their privacy policies, and update every 12 months as necessary: (i) their consumers’ rights; (ii) methods of submitting requests; and (iii) the categories of PI they have collected, sold, and disclosed for business purposes in the prior 12 months. Id.

-          Businesses must ensure their relevant personnel are adequately informed of the Act’s requirements, and know how to help consumers exercise the rights it provides. Id.

-          Businesses are not obligated to provide a consumer with info on the sales or disclosures of that consumer’s PI more than twice in 12 months. Id.

-          Businesses must “respect” a consumer’s opt-out for at least 12 months before requesting that the consumer revisit their decision and authorize the business’ sale of the consumer’s PI. §1798.135.

-          A consumer may opt out via a proxy. Id.

-          The Act does not apply to:

o   Consumer information that is “deidentified or in the aggregate consumer information.” §1798.145. ("Deidentified" is defined at §1798.140(h) and "aggregate consumer information" is defined at §1798.140(a).)

o   The collection or sale of PI “if every aspect of that commercial conduct takes place wholly outside of California.” §1798.145(a). Meaning: (i) the business collected the PI while the consumer was outside of CA; (ii) no part of the sale of the PI occurred in CA; and (iii) no PI collected while the consumer was in CA was sold. Id. The Act cautions that this exception does not permit a business to store PI (e.g. on a device) while the relevant consumer is in CA, only to collect that PI once the consumer (and their stored PI) leaves CA. Id.

o   Evidentiary privileges. §1798.145(b).

o   Medical information governed by the Confidentiality of Medical Information Act, or protected health information collected by a covered entity governed by certain HIPAA rules. §1798.145(c).

o   PI collected, processed, sold, or disclosed pursuant to the Gramm-Leach-Bliley Act, where such law conflicts with the Act. §1798.145(e).

-          A business is not liable for violations of the Act by its service providers if the business didn’t know (or have reason to believe), when it disclosed PI to a service provider, that the service provider intended to commit such a violation. §1798.145(h). A service provider is similarly not liable for such violations by the businesses it serves. Id.

-          A business is not considered by the Act to have sold PI when the relevant consumer directs the business to make such disclosure or “uses the business to intentionally interact with a third party.” §1798.140(t)(2)(A).

-          Contract provisions that purport to waive or limit consumer rights under the Act are contrary to public policy, void, and unenforceable. §1798.192.

GDPR Overlap

The Act’s implicit intent is “to further the constitutional right of privacy and to supplement existing laws relating to consumers’ personal information.” §1798.175. To this end, where other sweeping PI statutes such as GDPR conflict with the Act, “the provisions of the law that afford the greatest protection for the right of privacy for consumers shall control.” Id.

Conclusion

Though lengthy, this synopsis of the Act is not exhaustive. While the Act provides additional—and potentially pivotal—requirements and exceptions for businesses, their service providers, and third parties in relation to consumer PI, this post may serve as a guide to certain highlights of this new law and a primer for the internal discussions the Act should stimulate within entities of all (covered) stripes.

GDPR Versus (Traditional) UX

Often, corporate entities hail user experience (“UX”) as an essential product feature. In fast-evolving tech markets, many believe, it is the Web tool with the smoothest ride—the most frictionless UX—that absorbs and retains the most users. As a result, many platforms place a heavy premium on minimizing the steps between what the user wants and what the user gets. The fewer pages, options, or hoops to jump through in between, the better.

The General Data Protection Regulation (“GDPR”) purposefully disrupts this strategy.

Easily the most significant data privacy regulation in the last 20 years, the GDPR, whose compliance deadline is May 25, 2018, revolutionizes the way organizations must handle consumer information. Pivotally, the European Union (“EU”)-generated law requires any entity that collects, monitors, or targets EU residents’ data to provide such data’s subjects with broad access to and control of their information. The GDPR further requires covered entities to report data security breaches to local regulators; no longer is doing so merely a “best practice.” Perhaps the GDPR’s most monumental edict, however, lies in its muscle: entities that violate the GDPR’s strict provisions are liable for fines of up to €20 million or 4% of global annual turnover—whichever is greater.
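For scale, the upper fine tier works out as below. This is simple arithmetic on the GDPR’s Art. 83(5) ceiling, with an illustrative turnover figure of our choosing.

```python
def max_gdpr_fine_eur(global_annual_turnover_eur: float) -> float:
    """Ceiling of the GDPR's upper fine tier: the greater of EUR 20 million
    or 4% of global annual turnover."""
    return max(20_000_000.0, 0.04 * global_annual_turnover_eur)

# A company with EUR 40 billion in annual turnover faces up to EUR 1.6 billion:
# max_gdpr_fine_eur(40e9) returns 1_600_000_000.0
```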

The GDPR’s purpose is no secret. It is intended to disrupt monolithic data companies such as Google and Facebook, forcing them to boost their privacy and security practices to a level that EU regulators believe adequately protects the consumers that provide the endless data such companies peddle.

So: with UX on one side and increasingly complex data consent, access, and control requirements on the other, what will mega-data companies do?

On April 18, Facebook invited a host of journalists to its new Building 23 at the social media giant’s Menlo Park HQ. There, Facebook revealed its GDPR compliance plan to the reporters. And the reporters were, reportedly, underwhelmed. Their chief criticisms:

-          Facebook’s user consent prompt is placed beneath an “X” in a “big blue button.” This “X” prominently invites users to skip past the GDPR’s bases for requiring legal, personal consent over their information.

-          Pages describing Facebook’s control of sensitive information—information at the crux of financial value, personal privacy, and privileged knowledge, such as sexual preference and religious and political views—feature an “Accept And Continue” button in “pretty blue” and an “ugly gray” “Manage Data Settings” button. The former, which defaults to Facebook’s preferences, is selectable whether or not the user scrolls through the rules. This crucial page is “obviously designed to get users to breeze through it by offering no resistance to continue, but friction if you want to make changes.”

-          In the U.S., user interactions with political groups and events pages trigger each user’s placement in “overarching personality categories” that Facebook sells to advertisers. The only way to opt out is to “remove any info you’ve shared in these categories.”

-          Global facial recognition is enabled by default.

-          To reject Facebook’s Terms of Service, users must locate a “see your options” hyperlink that is “tiny” and “isn’t even a button.” (The “I Accept” button, however, is “big.”) This “see your options” link leads to a “scary permanent delete” button and “another tiny ‘I’m ready to delete my account’ hyperlink.” If a user selects this option, but wants to download their data first, this process can take hours. And the downloaded data’s portability has significant holes.

-          Facebook may not collect sensitive data (e.g. sexual or political info) from, or serve ads to, users between 13 and 15 years old—unless the child obtains parental consent. A child can obtain this consent by providing Facebook with an email address, whose owner then grants consent via email. No further controls aim to determine whether that email’s owner is actually the child’s parent or guardian.

In sum: Instead of scaling back on UX to ensure that users—i.e. the providers and, per the GDPR, the proprietors of their data—understand what data Facebook elicits from users, how Facebook uses that data, and how users can control both of these processes, Facebook squeezed the GDPR’s requirements into its longstanding UX-first template.

GDPR or not, Facebook still “pushes” users “to speed through giving consent…with a design that encourages rapidly hitting the ‘Agree’ button.” Their platform “makes accepting the updates much easier than reviewing or changing them.”

Facebook and companies of its data-caliber made their bones on smooth UX. This methodology founded the bonds between users and these companies’ platforms, underwriting their success. But in an effort to continuously smooth users’ ride, UX-optimizers glossed over some weighty details. By enabling—read: training—users to hit “Agree” without reading the terms and conditions governing the services at play, data propagators obfuscated the true cost-benefit analysis underlying their products, depriving users of a reasonable opportunity to make an informed decision re: whether they could responsibly press “Post.”

The Cambridge Analytica/Facebook controversy is the latest indicator of this dissonant status quo. On April 10 and 11, Facebook CEO Mark Zuckerberg apologized to the Internet-surfing world for his company’s untrustworthy custodianship of user data. On his watch, political marketers scraped user data, aggregated it, and built a media machine of epic proportions and historic effectiveness.

In the wake of this scandal, Internet searches for “delete Facebook” reached a five-year high. This compounded a troubling trend for Facebook at the close of 2017, when the company lost daily users in the US and Canada for the first time ever. And after the U.S. Federal Trade Commission confirmed its investigation of the company, Facebook’s stock dropped precipitously, shedding over $100 billion in value to match its lowest point since mid-2017.

The Cambridge Analytica revelations spotlighted yet again the reality of social media and many other online platforms: if you use them, your data may be forfeit. From Snowden to Yahoo to Uber to Target, this is not a new lesson for consumers who find themselves increasingly aware of the shady marketability of their data.

Aleksandr Kogan, the psychology professor hired by Cambridge Analytica to scrape millions of Facebook users’ profiles, agrees. He noted recently that users’ awareness that their data is improperly traded was a “core idea” underlying Cambridge Analytica’s practices—with a twist.

“Everybody knows,” Kogan said he and Cambridge Analytica believed, “and nobody cares.”

Now, post-fracas, Kogan believes the latter part of this theory was “wrong.” People not only know how their data is manipulated, but they care, too.

This uptick in user cognizance provides a pivotal impetus for Facebook, Google, and other blue-chip data stores to leave superficial UX, made of bubble letters and candy-colored buttons, behind. To invest in true UX via true transparency. To place a premium on educating their users on the inner workings of the relationship between human and platform. To smooth UX not by shrouding choice, but by building trust.

That is, after all, the new preferred experience.

Otherwise, regardless of Mr. Zuckerberg’s congressional apologies, Prof. Kogan’s revisions, and whether the GDPR’s impending fines are as damning as planned, users now know what happens to their data. Who is misusing it. And, UX or not, what to do about it.

Update: Cambridge Analytica announced on May 2 that it will file for bankruptcy. Its Facebook controversy has "driven away virtually all of the company’s customers."

ICO Contracts: Choice of Law, Venue Selection, and How Fraud Upends It All

Intro

ICOs are multiplying. Likely siphoning early-stage VC funding, initial coin offerings raised $4 billion in 2017. Bitcoin, the standard bearer of cryptocurrencies worldwide and the most common ICO currency, hit an all-time high nearing $18,000 in mid-December. With commensurate speed, lawsuits and regulator crackdowns have followed.

In particular, a series of lawsuits surrounding the startup Tezos may provide some guidance on ICO contracts. That is, not the smart contracts that administer the cryptocurrency-for-ICO token exchange at the core of certain ICOs, but the paper contracts which (hopefully!) set forth the terms and conditions of an ICO exchange, including limitations of liability, tax responsibilities, venue selection provisions, and more.

Background

Tezos threw a phenomenally successful ICO: $232 million raised by co-founders and spouses Arthur and Kathleen Breitman, for an incomplete blockchain-based platform, in July 2017. Tezos’ haul shattered records for funds raised in an ICO—especially considering that these funds were ostensibly raised via bitcoin and ether, two currencies whose value continues to trend (substantially) up, pushing the estimated value of the ICO’s proceeds to $1.3 billion.

Those Suits

Tezos faces at least five lawsuits, all class actions, filed in state and federal courts from Florida to California. One of these suits, captioned Gaviria v. Dynamic Ledger Solutions, Inc., et al., Case No. 6:17-CV-01959-ORL-40-KRS, attached to its complaint the Tezos Contribution and XTZ Allocation Terms and Explanatory Notes (“Tezos Terms”). The Tezos Terms, according to the complaint, memorialize the terms of the Tezos ICO’s fundraising offer—and are “unenforceable for a variety of reasons.” Gaviria, at 14.

Early Guidance

While the Tezos suits have yet to be resolved, and the validity of their arguments yet to be tested, guidance may already be gleaned for the fast-moving ICO space. In particular, the Tezos suits offer a lesson for ICO contract drafters on choice of law and venue selection provisions.

Choice of Law & Venue—Meet Fraud

Tezos, like certain other ICOs, sought to adjudicate litigation concerning its enterprise in a foreign jurisdiction. Via the very last provision of the Tezos Terms, any disputes “arising out of or in connection with” Tezos’ ICO are restricted “exclusively and finally [to] the ordinary courts of Zug, Switzerland.” Gaviria, at Exhibit A. Tezos’ choice of law was Swiss as well. Id.

Organizations running ICOs, like many other enterprises, don’t want to travel far to litigate, produce witnesses, and transport evidence. Hence, venue selection clauses. Also like many other organizations, those running ICOs seek regulatory havens. Jurisdictions they think align with the claims they might make (and field) should litigation arise. In fact, Kathleen Breitman told Reuters in June that Tezos chose to incorporate the Tezos Foundation in Zug since Switzerland “has a regulatory authority that had a sufficient amount of oversight but not like anything too crazy.” Each party’s assessment along these lines informs its agreement’s choice of law clause.

Generally, courts afford venue selection clauses significant deference, even when the chosen jurisdiction is a non-U.S. state. After all, the parties assumedly negotiated these clauses prior to signing the agreement. Today, the majority of federal courts (including those of the 2nd, 4th, 7th, 8th, 9th, 10th and 11th circuits—which include New York, Florida, and California) strictly enforce forum-selection clauses. The Supreme Court of the U.S. blessed this trend, ruling that “forum-selection clauses should control except in unusual cases.” Atlantic Marine Construction Co. v. United States District Court for the Western District of Texas, 571 U.S. 488 (2013). The same applies for choice of law clauses. The Restatement (Second) of the Conflicts of Laws provides that choice of law provisions are presumptively enforceable.

That said, how can Tezos be sued—multiple times—in California and Florida, at opposite ends of a country whose laws Tezos sought to avoid altogether?

Because fraud wasn’t part of the agreement.

Fraud features heavily across the Tezos litigation. For example, each in their own way, the Tezos suits allege that the utility tokens (i.e. markers of purchased services or access) that Tezos distributed to its “donors” in exchange for their “donations” during the Tezos ICO were actually unregistered securities, sold in violation of the Securities Act of 1933. Gaviria, at 31. By misleading ICO participants about the unregistered securities status of these tokens—a “material fact” highly relevant to the ICO participants—Tezos “fraudulently induced [the ICO class] to participate in the ICO.” Id., at 34. 

Fraud is kryptonite for forum selection clauses in federal court. Decades ago, the Supreme Court ruled that where enforcement of a forum selection clause would be “unreasonable and unjust, or that the clause was invalid for such reasons as fraud or overreaching,” it should not be enforced. The Bremen v. Zapata Off-Shore Co., 407 U.S. 1 (1972). As for choice of law clauses, fraud can defeat those too. Carnival Cruise Lines, Inc. v. Shute, 499 U.S. 585 (1991).

Therefore, by claiming that the Tezos Terms were “induced by fraud and overreaching” (Gaviria, at 24), the plaintiffs at play may succeed in superimposing their own venue selection—Florida, for instance—over Tezos and its Swiss preferences.

Conclusion

ICOs operate for now in a regulatory gray-space. While crypto-entrepreneurs consider the securities status of their tokens, publish ambitious marketing materials, and hunt for ICO participants, they must also consider the jurisdictional impact their decisions might have on their ICO contracts—regardless of the law and venue they select.