*Complete Control, Feeling Safe, Being Safe (Facebook)

(* Denotes one bliki appendage, below.) A transcript of Senator Deb Fischer (R-Nebraska) questioning Facebook founder and CEO Mark Zuckerberg on April 10 shows an exchange over how much control users have over their data, and whether private user data was exposed.

(A New York Times story today found Facebook sharing users' data, and users' friends' data, with 60 device manufacturers.)

The C-SPAN clip is six minutes long, here.

Fischer: Thank you Mr. Chairman, thank you Mr. Zuckerberg for being here today, I appreciate your testimony. The full scope of Facebook users' activity can paint a very personal picture, I think. And additionally you have those two billion users that are out there every month, and so we all know that's larger than the population of most countries. So how many data categories do you store, does Facebook store, on the categories that you collect?

Zuckerberg: Senator, can you clarify what you mean by data categories?

Fischer: Well, there are some past reports that have been out there that indicate that Facebook collects about 96 data categories for those two billion active users. That's 192 billion data points that are being generated at any one time, from consumers, globally. So how many does Facebook store out of that? Do you store any?

Zuckerberg: Senator, I'm not actually sure what that is referring to.

Fischer: On the points that you collect, uh, information. If we call those "categories," how many do you store of the information that you were collecting?

Zuckerberg: ... Senator, the way I think about this is there are two broad categories. This probably doesn't line up with the specific report you were seeing, and I can make sure that we follow up with you afterward to get you the information you need on that.

The two broad categories that I think about are content that a person has chosen to share, and that they have complete control over; they get to control when they put it into the service, when they take it down, um, who sees it.

The other category is data they're connected to that makes the ads relevant.

You have complete control over both... You can turn off the data related to ads, and you can choose not to share any content, or control exactly who sees it, or take down the content in the former category.

Fischer: Does Facebook store any of that?

Zuckerberg: Yes.

Fischer: ... How much do you store of that? All of it? Everything we click on? Is that in storage somewhere?

Zuckerberg: Senator, we store data about what people share on the service, and the information that's required to do ranking better, to show you what you care about in News Feed.

Fischer: Do you store uh text history? User content? Um, activity? Device location?

Zuckerberg: ... Senator, some of that content, with people's permission, we do store.

Fischer: Do you um disclose any of that?

Zuckerberg: Yes, uh, Senator, in order for people to share that information with Facebook, I believe that almost everything you just said would be opt-in.

Fischer: Right, and the privacy settings, it's my understanding that they limit the sharing of that data with other Facebook users, is that correct?

Zuckerberg: Senator, yes. Every person gets to control who gets to see their content.

Fischer: And does that also limit the ability for Facebook to collect and use it?

Zuckerberg: Senator, yes, there are other, uh, there are controls that, uh, determine what Facebook can do as well. So for example, people have a control about ... face recognition. If people don't want us, uh, to be able to help identify when they're in photos that their friends upload, um, they can turn that off. And then we won't store that kind of template for them.

Fischer: And there was uh some action taken by the FTC in 2011. And you wrote a Facebook post at the time saying that a public page on the internet "used to seem scary to people." But "as long as they could make their page private, they 'felt' safe sharing with their friends online." "Control" was "key." And you just mentioned control. Senator Hatch asked you a question and you responded there about "complete control." So, you and your company have used that term repeatedly. And I believe you use it to reassure users, is that correct? That you do have "control" and "complete control" over this information?

Zuckerberg: Well, Senator, this is how the service works. It's the core thing that Facebook is, and all of our services: WhatsApp, Instagram, Messenger.

Fischer: So is this then a question of Facebook being about "feeling" safe, or are users actually safe, is Facebook "being" safe?

Zuckerberg: Senator, I think Facebook is safe. I use it, and my family uses it, and all the people I love and care about use it all the time. These controls are not just to make people feel safe; it's actually what people want in the product. The reality is, I mean, just think about how you use this yourself. If you take a photo, you're not always going to want to send that to the same people. Sometimes you're going to want to text it to one person, sometimes you might send it to a group. I bet you have a page. You'll probably want to put stuff out there publicly so you can communicate with your constituents.

There are all these different groups of people that someone might want to connect with, and those controls are very important in practice for the operation of the service. Not just to build trust, although I think providing people with control also does that, but to make it so people can fulfill their goals for the service.

The FTC's 2011 settlement with Facebook says:
The FTC complaint lists a number of instances in which Facebook allegedly made promises that it did not keep:
  • In December 2009, Facebook changed its website so certain information that users may have designated as private – such as their Friends List – was made public. They didn't warn users that this change was coming, or get their approval in advance.
  • Facebook represented that third-party apps that users installed would have access only to user information that they needed to operate. In fact, the apps could access nearly all of users' personal data – data the apps didn't need.
  • Facebook told users they could restrict sharing of data to limited audiences – for example with "Friends Only." In fact, selecting "Friends Only" did not prevent their information from being shared with third-party applications their friends used.
  • Facebook had a "Verified Apps" program & claimed it certified the security of participating apps. It didn't.
  • Facebook promised users that it would not share their personal information with advertisers. It did.
  • Facebook claimed that when users deactivated or deleted their accounts, their photos and videos would be inaccessible. But Facebook allowed access to the content, even after users had deactivated or deleted their accounts.
  • Facebook claimed that it complied with the U.S.- EU Safe Harbor Framework that governs data transfer between the U.S. and the European Union. It didn't.
The proposed settlement bars Facebook from making any further deceptive privacy claims, requires that the company get consumers' approval before it changes the way it shares their data, and requires that it obtain periodic assessments of its privacy practices by independent, third-party auditors for the next 20 years.



*Update 1: Nearly two months after this hearing, investigators showed reporters at the New York Times and Wall Street Journal that Facebook was indeed storing and sharing our page likes, and that users had less control over their data than Facebook implied. As Dow Jones-owned MarketWatch reported on July 1: "The company disclosed it was still sharing information of users' friends, such as name, gender, birth date, current city or hometown, photos and page likes, with 61 app developers nearly six months after it said it stopped access to this data in 2015."




This work by AJ Fish is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
