Interviewers, in Asking "Do Most People Care" About Privacy, Imply They Don't
It takes a person in a position of strength to say privacy matters, because a built-in cognitive bias leads us to perceive anyone who does so as either someone with something to hide, or neurotic.
Without broader cultural knowledge of a privacy framework, such as GWU law professor Daniel Solove's four-part privacy taxonomy, the necessity of privacy is a slow case to build in the public's mind, and one that can be rolled back in a single day of favorable trading on the stock market.
Such was the case for Facebook (FB) this week, when a business reporter took a sound premise and, before our eyes, used rapid questioning to lead the viewer to a faulty conclusion, which was quickly repurposed into a sound-looking yet faulty premise from which to draw more conclusions.
Bloomberg TV's Emily Chang, as a reporter on investment television, may not admit privacy matters, because investors want growth and brawn in their stock market news, not neurotic concerns about privacy.
Chang implied that not a single person other than Om Malik has left the FB platform, which he quit a year ago. (We all imply, when we urge others to "quit Facebook," that nobody has done so yet.) She also implied through her questioning that people don't "care" about the privacy breaches, from the premise that they aren't aware of those breaches. It was irksome to watch, because there have been countless headlines about data misuse and privacy concerns all summer long, and all spring before that.
Chang implied:
| sound premise | leap type | faulty conclusion |
| --- | --- | --- |
| After news broke that Facebook planned to invade our financial privacy, its stock price rose by the largest percentage since just before the Zuckerberg congressional hearings in April 2018. | large leap | Therefore, Facebook saw absolutely no dropoff in user engagement. |
From there, Chang daisy-chained that faulty conclusion into a faulty premise, from which to draw another conclusion:
| faulty premise | leap type | faulty conclusion |
| --- | --- | --- |
| Facebook saw absolutely no dropoff in user engagement. | small leap | "People", we assume Facebook users, or Facebook investors, don't "care" that user data was misused by Facebook, as it was revealed to have been in the Cambridge Analytica scandal. |
That faulty conclusion is daisy-chained again, silently leaving the news viewer with the false notion that "people" or "most people" just "don't care" about privacy or data misuse.
| faulty premise | leap type | faulty conclusion |
| --- | --- | --- |
| "People", we assume Facebook users, or Facebook investors, don't "care" that user data was misused by Facebook. | small leap | "People", we assume the population-at-large, don't care that it's perfectly legal if their data is misused through *secondary use* without their consent, sold, misappropriated, collected, aggregated or weaponized. And "people" have no desire for personal data privacy laws to take effect. |
Another cluster of leaps happens mid-interview, when Chang says most people aren't aware user data was misused, and therefore they won't care if it's misused in the future, whether they become aware or not. That chain is mapped out in the table following the transcript below.
Interview video at https://www.bloomberg.com/news/videos/2018-08-06/facebook-is-one-of-the-least-trustworthy-online-companies-om-malik-says-video. Transcript:
Emily Chang: Facebook (FB) shares rose the most since April on Monday, off reports the company is forging deeper relationships with banks to offer customer service products via its Messenger chat app.
FB has for years worked to make Messenger a natural place for consumers to communicate with businesses, aiming to replace email.
In response to the Wall Street Journal report, FB didn't deny pursuing these bank partnerships, but did clarify that any customer information it gathered won't be used for advertising. Could this spark additional concerns around customer privacy?
Joining us to discuss, back with us, Om Malik with True Ventures. So investors are excited about this, because the stock is up, but it certainly raises the question: can we trust them?
Om Malik: No. One big thing people forget: this is, in my opinion and the opinion of many others, one of the least trustworthy companies on the internet right now. Because everything they say is not true. Six weeks go by, and what they said is not true. And it has been a pattern. I think one of your competitors wrote a long piece about the lies and lies and unending lies of Mark Zuckerberg and FB. I think we have to kind of pause and stop believing these people. I think [inaudible] the regulators have to stand up and say: you know what, every single action they take should be put through the wringer, because we don't know what they're going to do with this information. They never say it, and they only tell you the truth if and only if they are caught lying.
EC: So, to be fair, they've hired thousands of new people to handle, uh, you know, look at fake news, to work on online harassment, to work to prevent election meddling. They're working on technology so that AI can also help with this job. Privacy concerns around FB date all the way back to your GigaOm days. And the company made a lot of apologies, but also worked to right some of the wrongs and change settings. Does any of that matter? Is it their intentions that you think are impure?
OM: After watching this company for a long time, the only way to describe it is like this: you have a donkey and you paint stripes on it, it doesn't become a zebra. And I think it's the same situation with FB. It is not going to change as a company, because its entire core is based on collecting as much information and data as possible and then brokering it for advertising. That's it, that's their core business. As long as they're doing that, they will continue to play loose and easy with people's privacy.
EC: But investors love it.
OM: Yeah, but investors also punished Twitter the other day for saying "we are removing bots." Investors just want to see the numbers going up, without understanding that there are consequences, not just for society but for the company. Eventually, as trust erodes in the FB brand, where will people go?
EC: So we certainly saw some pullback in new users in Europe, and there is this looming question of whether FB is running out of new users. You have been off FB now for a year. But I wonder, you are more tech-aware than most, do most people care about this?
OM: Not yet. Like I think they just -
EC: Will they?
OM: They will, and then they will switch to [Facebook-owned] Instagram, and then it will be another three or four years of Instagram kind of playing loose and easy. And then people are going to wake up to something else.
I think the reality is that we have to start monitoring these companies more closely, now more than ever. Because, you know, looking out five years from now, when video and the visual internet are all around us, creating fake news and having fake, you know, information out there that looks like real information can actually have a much deeper impact, because what your eyes see is more effective than what headline you read. And this is what I feel: legislators now are worrying about the past too much, whereas we need to be thinking about controlling these companies in the future.
EC: So what should the regulation look like?
OM: The regulation should be all around data: how they use data, what they do with that data, having clarity around how they're using it, in language that average humans can understand. And also being able to localize what that information can be used for. I think it is actually about negating this company's capabilities, its abilities to use and misuse that data.
EC: Is there a world you could see, with the right regulation or the right laws, where you would get back on Facebook? Why not?
OM: No, I'm just done with it. The reason I left FB was not just privacy concerns. The reason I left FB was that I was living my life based on how I felt about other people. And I think that, to me, was the worst thing.
The mid-interview chain of leaps, mapped out:

| premise | leap type | conclusion |
| --- | --- | --- |
| sound premise: Facebook's stock rose on reports the company "asked large U.S. banks to share detailed financial information about their customers, including card transactions and checking-account balances" (WSJ: Facebook to Banks: Give Us Your Data, We'll Give You Our Users) | large leap | faulty conclusion: "People" don't care that Facebook misused personal user data in the past. And "people" won't care if Facebook misuses user data in the future. |
| faulty premise: "People" don't care that Facebook misused personal user data in the past. And "people" won't care if Facebook misuses user data in the future. | small leap | very faulty conclusion: People don't care about their own privacy or others' privacy, and have no desire for laws to protect their privacy. |
| very faulty premise: People don't care about their own privacy or others' privacy, and have no desire for laws to protect their privacy. | small leap | very very faulty conclusion: "People don't care about privacy." |
-------------------------
Further Reading:
Facebook web traffic has dropped by half in the last two years. cnbc.com
This work by AJ Fish is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.