Congress Is Missing the Point on Facebook (From Bloomberg.com)

Americans need a data bill of rights.


I’m getting increasingly baffled and disappointed by the scandal-cum-congressional-ragefest surrounding Facebook. Instead of piling on Mark Zuckerberg or worrying about who has our personal data, legislators should focus on the real issue: How our data get used.

Let’s start with some ground truths that seem to be getting lost:

  1. Cambridge Analytica, the company that hoovered up a bunch of data on Facebook users, isn’t actually much of a threat. Yes, it’s super sleazy, but it mostly sucked at manipulating voters.
  2. Lots of other companies – maybe hundreds! – and “malicious actors” also collect our data. They’re much more likely to be selling our personal information to fraudsters.
  3. We should not expect Zuckerberg to follow through on any promises. He’s tried to make nice before to little actual effect. He has a lot of conflicts and he’s kind of a naïve robot.
  4. Even if Zuckerberg were a saint and didn’t care a whit about profit, chances are social media is still just plain bad for democracy.

The politicians don’t want to admit that they don’t understand technology well enough to come up with reasonable regulations. Now that democracy itself might be at stake, they need someone to blame. Enter Zuckerberg, the perfect punching bag. Problem is, he likely did nothing illegal, and Facebook has been relatively open and obvious about its skeevy business practices. For the most part, nobody really cared until now. (If that sounds cynical, I’ll add: Democrats didn’t care until it looked like Republican campaigns were catching up to or even surpassing them with big data techniques.)

What America really needs is a smarter conversation about data usage. It starts with a recognition: Our data are already out there. Even if we haven’t spilled our own personal information, someone has. We’re all exposed. Companies have the data and techniques they need to predict all sorts of things about us: our voting behavior, our consumer behavior, our health, our financial futures. That’s a lot of power being wielded by people who shouldn’t be trusted.

If politicians want to create rules, they should start by narrowly addressing the worst possible uses for our personal information – the ways it can be used to deny people job opportunities, limit access to health insurance, set interest rates on loans and decide who gets out of jail. Essentially any bureaucratic decision can now be made by algorithm, and those algorithms need interrogating way more than Zuckerberg does.

To that end, I propose a Data Bill of Rights. It should have two components: The first would specify how much control we may exert over how our individual information is used for important decisions, and the second would introduce federally enforced rules on how algorithms should be monitored more generally.

The individual rights could be loosely based on the Fair Credit Reporting Act, which allows us to access the data employed to generate our credit scores. Most scoring algorithms work in a similar way, so this would be a reasonable model. As regards aggregate data, we should have the right to know what information algorithms are using to make decisions about us. We should be able to correct the record if it’s wrong, and to appeal scores if we think they’re unfair. We should be entitled to know how the algorithms work: How, for example, will my score change if I miss an electricity bill? This is a bit more than FCRA now provides.

Further, Congress should create a new regulator – along the lines of the Food and Drug Administration – to ensure that every important, large-scale algorithm can pass three basic tests (disclosure: I have a company that offers such algorithm-auditing services):

  1. It’s at least as good as the human process it replaces (this will force companies to admit how they define “success” for an algorithm, which far too often simply translates into profit);
  2. It doesn’t disproportionately fail when dealing with protected classes (as facial recognition software is known to do);
  3. It doesn’t cause crazy negative externalities, such as destroying people’s trust in facts or sense of self-worth. Companies wielding algorithms that could have such long-term negative effects would be monitored by third parties who aren’t beholden to shareholders.

I’m no policy wonk, and I recognize that it’s not easy to grasp the magnitude and complexity of the mess we’re in. A few simple rules, though, could go a long way toward limiting the damage.

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

To contact the author of this story:
Cathy O’Neil at coneil19@bloomberg.net

To contact the editor responsible for this story:
Mark Whitehouse at mwhitehouse1@bloomberg.net


The social-media giant’s economic model emerged unscathed despite hours of questions from senators.


Mark Zuckerberg did just fine in his first turn in the Congressional hot seat. He was confident. He capably tackled many of the queries proposed last week by Bloomberg columnists. The 33-year-old billionaire appeared humble throughout much of the hearing, with only a few smug smiles. 

The best news for Facebook Inc. was that Zuckerberg ably deflected any challenges to the beating heart of its economic model: its hungry data collection and the fine-tuned targeted advertising based on that data. Zuckerberg’s success is a win for anyone primarily concerned with the company’s market value. But it’s a loss for the rest of us.

Facebook will keep betraying users’ trust as long as its business is based on unrestrained hoovering of as much user data as possible and on crafting ever-more innovative ways for advertisers to harness that information for commercial goals. Technically, Facebook’s users agree to this arrangement and can sidestep it, but the consent is hardly informed and the option to avoid it hardly real.

This inherent conflict was on display during two of Zuckerberg’s exchanges on Tuesday. The first was with Senator Roy Blunt, the Republican from Missouri. He asked Zuckerberg a series of questions about what information the company can collect on its 2 billion users and use for advertising, including whether the social network can pinpoint that a person who posts on Facebook from his work computer in the morning is the same person who uploads a photo to his Facebook smartphone app at night.

The answer, as Zuckerberg surely knows, is yes. Facebook brags to advertisers that it can provide "cross device" targeting, as it is called. The company can also track people nearly everywhere they go online, and it can see what apps people have installed on their phones.

Facebook also collects information on "offline" activity, as Blunt noted, including users’ location as they roam around the real world. Companies can also match information on what you purchase in stores — that box of cereal at the supermarket, for example — with Facebook account information. Inexplicably, Zuckerberg tried to say he wasn’t completely sure about Facebook’s data collection policies, and that one of his underlings could follow up later. The Facebook CEO knows what his company does, but perhaps he couldn’t acknowledge that his company relies on assembling detailed dossiers on billions of people.

This exchange mattered because Blunt and others revealed the flaw in Facebook’s bargain with users. The company gives us a service we find valuable, and in exchange we agree that Facebook will harness our information to make money. Zuckerberg said everyone who uses Facebook consents to what they share and has complete control over it. The trick is that few people really understand what they’re giving, or are capable of truly controlling it. Zuckerberg seemed to concede as much after a lawmaker brandished a stack of papers said to be Facebook’s data collection and ad policy disclosures to its users.

Technically, Facebook’s users can turn off targeted advertisements or disable sensitive features such as image recognition in photos. (I couldn’t figure out how to do the latter, and I write about technology for a living.) Zuckerberg believes he’s giving users control, but he’s giving them the illusion of control. And that means the consent of Facebook users is not informed. 

Senator Richard Blumenthal and other lawmakers tried to get Zuckerberg to change the rules of engagement between Facebook and its users. Facebook right now operates on a take-it-or-leave-it basis: users give the company broad permission to collect whatever information its operators want, for whatever reasons they have, and protecting that information is more of a case-by-case process. Blumenthal, the Democrat from Connecticut, proposed flipping that around, forcing Facebook to explicitly ask permission for each piece of personal information it wanted to harvest and use, and to explain why.

Maybe people would find this system too cumbersome to be practical. And regardless, there is no way Zuckerberg can agree to it. If everything on Facebook functioned only with an informed "opt in" from users, the company’s business wouldn’t work. (Yes, a new European law forces companies to collect only the information they need to provide a service, and to obtain clear consent to collect and use personal information. Facebook has been wishy-washy about whether its implementation of the European law will also apply to Facebook users outside of Europe.)

Facebook could voluntarily change the rules of the game. It could elect to turn off location tracking of users by default, to stop collecting information on people’s activity away from Facebook without express permission, and to give people even more information that shows how advertisers target them for each Facebook ad they see.

Those changes could dramatically curtail Facebook’s power and its revenue — and that’s the point. None of the good changes the company announced in recent weeks will truly hurt Facebook because it hasn’t revised its economic engine: all that data, and unfettered use of it without informed approval of Facebook’s citizens. Only a dramatic data diet can curb the worst downsides of Facebook. It’s time for Facebook to really change.

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

To contact the author of this story:
Shira Ovide at sovide@bloomberg.net

To contact the editor responsible for this story:
Mike Nizza at mnizza3@bloomberg.net


via Bloomberg.com https://bloom.bg/2F5eFDe

April 12, 2018 at 01:22AM