As California privacy law takes effect, advocates seek more protections

Jason M. Shepard, Ph.D.
Oct 29, 2019

Originally published in Fall 2019 edition of California Publisher

By Jason Shepard

Even before California’s new digital privacy law takes effect on Jan. 1, privacy advocates are pushing for a 2020 ballot initiative for even tougher protections while federal legislation stalls in Congress.

The California Consumer Privacy Act, passed in 2018 and signed by Gov. Jerry Brown, is the nation’s most stringent digital privacy law.

Silicon Valley-based technology companies are uncertain how the new law will affect their business models, but it is expected to have an outsized effect because most U.S. tech companies are based in California.

California’s law contains provisions similar to those in Europe’s privacy laws, where individual privacy rights are generally stronger than in the United States.

The California law gives residents the right to know what personal data is being collected, sold or disclosed about them, and to access that data. Residents also have a right to opt out of the sale of their personal data and can request that companies delete it.

The law applies to companies with gross revenues over $25 million, those with personal data on more than 50,000 consumers, or those that earn more than half of their revenue from selling consumers’ personal data. The law includes several exemptions, including for medical data and for newspapers and periodical publications.

Companies can be fined up to $7,500 for each intentional violation and $2,500 for each unintentional violation.

As digital media permeates our lives, many entities, including tech companies, advertisers and the government, use people’s digital footprints to predict and influence their behaviors. Most citizens have little idea how much data is collected about them and by whom, nor do they fully understand how their personal data is monetized by corporations.

Facebook, the world’s largest social media network with more than 2.4 billion active users, has been central to several recent controversies.

In September, Facebook said it suspended “tens of thousands” of apps run by about 400 developers for potentially misusing personal data.

Facebook’s suspensions came months after the Federal Trade Commission voted to approve a $5 billion settlement with the company over its mishandling of personal data, the largest FTC fine in history. The FTC determined Facebook violated a 2011 settlement in which it pledged to improve privacy protections.

Under the terms of the settlement, Facebook must document its decisions about data collection and closely monitor third-party apps that collect user data. A federal judge still needs to approve the settlement.

Facebook is still reeling from revelations that it played a role in several scandals related to the 2016 presidential election.

Cambridge Analytica, a now-defunct political consulting firm founded by conservative political figures Steve Bannon and Robert Mercer, was at the forefront of political communications in the 2016 election.

The company touted itself as a leader in pairing behavioral psychology and big data collection through digital engagement to change people’s behaviors.

In one controversial move, Cambridge Analytica used data about 87 million people, collected by a Facebook app, to microtarget voters in support of Donald Trump’s presidential campaign.

A fascinating Netflix documentary released this summer, The Great Hack, follows several individuals who helped expose Cambridge Analytica’s work, including professor David Carroll, investigative reporter Carole Cadwalladr, and former Cambridge Analytica employees Christopher Wylie and Brittany Kaiser.

Wylie told the filmmakers Cambridge Analytica was a “full scale propaganda machine” that used digital data to “build a psychological profile of each voter in all of the United States.”

During the 2016 presidential campaign, Kaiser said the company identified possible swing voters — which they called the “persuadables” — in specific precincts in four swing states: Michigan, Wisconsin, Pennsylvania and Florida.

Cambridge Analytica then designed surreptitious personalized content to trigger those particular voters and “bombarded” them on Facebook with blogs, websites, videos, articles and ads, “until they saw the world the way we wanted them to — until they voted for our candidate,” Kaiser said.

Kaiser told the filmmakers Cambridge Analytica’s practices were “psychological operations” using “weapons-grade communication techniques.”

The FTC’s investigation into Cambridge Analytica is ongoing. In a court filing, the agency alleged the company deceived Facebook users by collecting data about them without their consent and then paired that with other collected data to build personality profiles of them to influence their voting habits.

Election meddling is just one motivation for calls for greater privacy protections online. Data breaches, identity theft and invasive advertising are also growing concerns.

Several bills introduced in Congress in 2019 garnered hearings this spring and summer, but both the House and Senate failed to move forward on votes.

Tech companies are lobbying for federal legislation that would provide one federal standard and could override, or preempt, California’s law.

That’s not stopping advocates in California from pushing for stronger protections in the form of a ballot initiative in 2020.

San Francisco real estate developer Alastair Mactaggart told the Los Angeles Times the 2018 law is a “great baseline,” but said, “I think there are additional rights that Californians deserve.”

Mactaggart is funding a campaign to put tougher privacy rules on the ballot. The measure would give residents greater control over “sensitive personal information,” such as data about their race, health and GPS location, and would impose “opt-in” rather than “opt-out” requirements for the collection of children’s data.

The measure would also allow residents to file civil lawsuits for violations without having to prove individualized harm, and it would create a new state agency to police data privacy.

Mactaggart told the Times he decided to push for tougher standards after he attended a cocktail party with tech engineers. He said a conversation with one guest left him shocked at how much data companies collect about citizens.

“He said, ‘If people just knew how much we knew about them, they’d be really worried,’” Mactaggart said.

Jason M. Shepard, Ph.D., is chair of the Department of Communications at California State University, Fullerton. His primary research expertise is in media law, and he teaches courses in journalism and in media law, history and ethics. Contact him at jshepard@fullerton.edu or on Twitter at @jasonmshepard.
