Bit by bit, San Francisco has become a place where it’s assumed that tech companies — many of which are based here — are tracking your every move. Until now, it has largely been up to you to find ways to stay off their collective corporate radar.
But instead of having to dig deep into the privacy settings of your phone or an app to opt out of sharing your personal information to do something as pedestrian as, say, rent one of the now-ubiquitous electric scooters, imagine never having to opt in in the first place.
As city officials this spring craft a “privacy-first policy” mandated by voter-approved Proposition B, supporters hope its lofty ambitions will start to become a reality this summer. The local regulations — among the first emerging in cities nationwide — could fill potential holes in the separate landmark statewide privacy law that takes effect in 2020. San Francisco’s initiative might also accelerate efforts to pre-empt cities and states through the adoption of federal standards, which the technology and business lobbies could be expected to water down.
Already there are signs that the city could move to the forefront of enforcing limits on data collection and reshaping our relationship with technology companies.
Disrupting the Disrupters
City leaders have made bold, but so far not very specific, claims about their ability to limit the personal-information free-for-all that is at the heart of the business model for data brokers, many startups and other digital enterprises.
Since passage of the ballot measure in November, supervisors have introduced legislation to restrict or ban surveillance and facial recognition. But the idea of being able to use online services in San Francisco without providing any personal details is the most controversial proposal. It is also potentially the most transformative and disruptive to an industry that has reaped riches from “disruption.”
As officials work to add teeth to the larger privacy framework, critics warn that aggressive regulations could chase companies that make their money mining Big Data out of San Francisco, or into the courts to battle regulators.
“I think it’s going to be very difficult to change the business model that underlies a lot of current services, because it enables things to be free,” said Kelsey Finch, an attorney at the Future of Privacy Forum, a nonprofit think tank and advocacy group based in Washington, D.C. “It’s hard to have both free products and services and not provide any data. And that will be a logistical hurdle. It may be a political hurdle.”
Brian Hofer, who chairs the Oakland Privacy Advisory Commission, agrees. “If the restrictions are perceived as being too cumbersome for the private industries, I think there’s going to be a lot of pushback,” he said.
Strict regulations would have a “far-reaching impact in a lot of different technologies,” said Cynthia Cole, a privacy attorney with Palo Alto-based Baker Botts, which represents more than half the Fortune 500 companies. She cited geolocation and biometric data as rapidly expanding fields that could be drastically affected and whose stakeholders could respond accordingly.
“I think you could have a dissuasive impact on companies saying, ‘Are we going to put our offices in San Mateo or Oakland, or are we going to put it in San Francisco?’” Cole said. “And they might say, ‘OK, well, San Francisco has these laws that make it even more stringent, even stricter for us with respect to our business model. So, we’re going to just implant ourselves right outside San Francisco instead.’”
Challenging Big Tech
In December, District 1 Supervisor Sandra Lee Fewer introduced an ordinance to require stores to post signs and notify the city if they are using surveillance cameras to monitor, track or collect data on shoppers. Standard security cameras were exempted.
The proposal, which included unspecified “administrative penalties” for noncompliance, was sparked by a new breed of stores like partially automated Amazon Go, which have cameras that “can sense if a customer reaches out and grabs an item and puts it back down,” said Fewer aide Angelina Yu.
“We felt that it was important to be transparent and upfront that that type of data was being collected and analyzed,” she said. “It’s inferring a lot more data than just a simple transaction.
“We’re hoping this will dovetail with a lot of the efforts around ‘privacy first,’” said Yu, adding that Fewer’s office planned to meet with Amazon. “We are at this time where it’s kind of a Wild West and there’s an entirely new range of technologies and spaces where privacy questions are really impacting the real world and our everyday movements and transactions.”
In February, Fewer backed off advancing her measure “at this time” but did not say why.
Taking Aim at Surveillance
District 3 Supervisor Aaron Peskin, who sponsored Proposition B, followed Fewer’s proposal in late January by introducing the “Stop Secret Surveillance Ordinance.” It would require that the Board of Supervisors approve any city department request to acquire new surveillance technology, or use current surveillance equipment in a new way to monitor or track residents and visitors. It would also ban facial recognition by city departments, setting up San Francisco to be the first U.S. city to do so.
The proposal is in line with what Sameena Usman, with the Bay Area Council on American Islamic Relations, was hoping for last year, when her group endorsed Proposition B. Usman told the Public Press in October that she saw the city Charter amendment as “a stepping stone” toward a surveillance ordinance. Though Peskin’s plan does not directly cite the privacy-first policy, a fact sheet distributed by his office called it “an implementation vehicle” of Proposition B.
Peskin legislative aide Lee Hepner called it a way to “build on the mandate of the voters last November to protect their personal information, to protect them from unwarranted intrusion into their private worlds.” Down the line, he said, supervisors might also look at banning facial recognition at private events that receive city permits for the use of public space.
Taking Aim at Business Models
But privacy advocates in City Hall envision an expansive agenda that, if pursued, could become a significant headache for businesses, especially local tech giants that grow by making acquisitions. Hepner said Peskin’s office is “trying to look at data privacy regulation as somewhat of an antitrust tool.” The goal would be to prevent the accumulation of massive amounts of data by large companies that purchase startups solely for that purpose.
“If you are a permittee of the City and County of San Francisco and you are acquired by Amazon or Google or Uber or Lyft, that should force a reevaluation of your permit … to see how policies are changing and shifting relative to the exchange and use of people’s personal information,” Hepner explained.
The city’s policy “does absolutely threaten the business models of some companies,” said the Electronic Frontier Foundation’s director of grassroots advocacy, Shahid Buttar, who evaluated Proposition B as it was being drafted last year. “Companies will have to let the users decide how they are to participate and how they are to be monetized, as opposed to imposing their own interests on users that want to use their platforms.”
The threat of relocation could be private industry’s primary negotiating tactic.
“There are ways to put pressure on cities that you don’t have that option on a federal level or less so even on a state level,” said attorney Cole, who has numerous gig-economy and other tech clients that expect to be affected by San Francisco’s regulations.
Seattle has developed one of the nation’s more robust citywide privacy regimes. In 2015, it created privacy principles to guide city government and companies with city contracts.
Some private entities have an attitude of “we take data, we use it, we monetize it, we share it, we do what we want with it because we’re a large company and we don’t have to negotiate that,” said Ginger Armbruster, Seattle’s chief privacy officer.
“It becomes complicated when we deal with large companies who say, ‘These are our data practices, love it or leave it.’ There’s really no room for discussion,” she added. “There are times when some of those data practices can be at odds with what we’re hoping to do with public data. If we had a choice, we would rather do X, but here we are doing Y because we need to provide the service.”
Companies Over-Collect Data
Armbruster said transportation-focused companies are especially guilty of over-collecting data not relevant to the services they provide. She said some of her conversations with startups have bordered on comical.
“We ran into that all the time. Like, ‘Why are you collecting this information?’ ‘Well, we just want it.’ ‘Well, I can understand why you want it, but, you know, help me with what you’re doing with it, we have to be specific,’” Armbruster recalled. “And you know, we didn’t feel like we have such a mature process for privacy that we should be talking to a private entity that doesn’t have that figured out. But those are the questions you have to ask, and you have to make some real trade-off decisions about that.”
Oakland’s privacy commission, which advises the city on collection methods and storage of residents’ data, has seen the same patterns in which a private partnership “promises to make our life better, whether it’s measuring parking spaces or traffic … and there’s always some sort of trade-off,” said Hofer.
“Usually they offer us this service for free or at very low cost, but it’s in exchange for our data, and we want to know how that data’s being used, where it goes, what the purpose is, and put some reasonable guidelines into place,” he added.
“Some of these proposals come to us pretty thin,” Hofer said. “There’s not a lot of analysis. They haven’t really studied the impact, they’re just trying to figure out how to get large volumes of data and go sell it to their customers, to third parties. The tech mentality is just ‘if we can collect it, let’s collect it.’”
“Whereas we’re coming a little bit more from the legal civil liberty side: ‘Should we collect it?’”
Hofer said he hopes privacy-first policies cause companies to ask themselves, “What data do I need to achieve that purpose, and what is the minimum amount of retention to hold the data until I can delete it safely?”
Big Data, Big Profits
For tech companies, of course, data equal profit. Facebook, Google, Experian, CoreLogic, Acxiom and other data harvesters have been joined by thousands of large and small brokers that buy and sell, or give third parties access to, personal information. International Data Corp., which provides market research for technology and telecommunications companies, has projected that in 2020, Big Data and business analytics will generate more than $210 billion in revenues worldwide.
“If you’re providing a service to the public — an alternative transit service, for instance — that’s great and we want you to provide that service,” said Hepner. “But if the profit motive of your business is the acquisition of people’s personal information, or the potential misuse and abuse of that personal information, and your business model is actually like a data broker and facilitating data-broker monopolies, then that’s something that we definitely want to put up some safeguards.”
In his February State of the State address, Gov. Gavin Newsom congratulated the Legislature for passing the nation’s first state privacy law, saying that “companies that make billions of dollars collecting, curating and monetizing our personal data have a duty to protect it. Consumers have a right to know and control how their data is being used.”
He then proposed a “data dividend” for Californians, “because we recognize that your data has value and it belongs to you.”
“California’s consumers should also be able to share in the wealth that is created from their data,” he said.
A push for standardized, potentially more conservative, national legislation could also be a way for technology giants to rein in far-reaching local or state regulations.
“My hope is that we will get to federal legislation before we get 48 different privacy state laws,” Kalinda Raina, head of global privacy at LinkedIn, said at a January event in San Francisco marking national Data Privacy Day. “It’d be wonderful to see one law that is operated at the federal level that is consistent and brings the ability for companies to plan throughout the U.S. as to how they’re going to comply.”
Cole’s clients are waiting to see the texts of both the San Francisco and California laws. She’s also tracking the evolution of, and differences among, local and state privacy laws across the country, including measures in San Jose, Denver and New York. “It’s starting to get a little difficult to keep up,” she said.
To date, 11 states are considering privacy laws similar to California’s. In Washington state, one piece of legislation calls for restricting companies that use personal data for profiling and facial recognition.
The Question of Enforcement
“We already have a lot of laws in San Francisco that govern different types of information,” said Bill Barnes, a spokesman for the city administrator, highlighting voter-approved Proposition D in 2006, which strengthened restrictions on when the city could disclose private information. “The city’s treatment of private information is generally fairly robust.”
The city administrator is also working in the shadow of the state attorney general, who is hammering out the details of the California Consumer Privacy Act, which takes effect on Jan. 1, 2020.
Buttar, of the Electronic Frontier Foundation, which defends civil liberties in the digital realm, said that although “Big Tech is quite active in pushing back against the regulatory schemes that the state Attorney General’s Office is building,” he hasn’t heard about similar lobbying efforts at the city level, at least not yet.
He pointed out one crucial detail of the state law that has yet to be determined: whether individuals will have a “private right of action” to sue over violations, or whether enforcement will rest solely with the attorney general.
“If the private enforcement goes off the table and the AG’s office is then the sole source of enforcement, it will likely get inundated by reports of violations and abuse from users around the state,” Buttar said.
San Francisco has to wrestle with a similar question: Will enforcement fall to the city attorney?
“How will the compliance mechanisms work? What will the complaint processing look like, what kinds of remedies will or will not be offered?” said the Future of Privacy Forum’s Finch, who built a tool Seattle uses to evaluate privacy risks for its open-data program. “It’s very important to be clear with the people of San Francisco what those will be.”
The city administrator’s spokesman said it was too early to comment. “As we draft this with the City Attorney’s Office, we’ll obviously look at everything,” said Barnes.
To Buttar, “The most important aspect of both of these reforms at the state and the municipal level is the empowerment of users and their agency and autonomy, and their right to choose how their information is used. If that’s all we get out of these measures, even that would be a pretty big step forward.”
A version of this article also appears in the spring 2019 print edition of the Public Press. Reporting was supported by the Fund for Investigative Journalism.