Mark Zuckerberg is probably wearing out his flashcards as he prepares to be the sole subject of two congressional hearings, where lawmakers will take turns grilling the Facebook CEO about the policies that let an app developer cart away and inappropriately share data on as many as 87 million Facebook users. The first is a joint hearing of the Senate Judiciary and Commerce committees on Tuesday. The next day, Zuckerberg will answer questions from members of the House Energy and Commerce Subcommittee.
Without any specific, wide-reaching legislation on the table but lots of public anger toward Facebook, expect members of Congress to ask difficult and sensational questions designed to put the executive on the defensive. They’ll have no shortage of transgressions to focus on. There’s the fake news that spread on the platform throughout the 2016 campaign and the ways in which Russian operatives have used Facebook to try to manipulate U.S. voters. Or how Facebook, the second-largest online ad company in the world, allowed advertisers to market explicitly to people’s bigotries, as its software suggested ad-targeting terms like “Jew haters” and “threesome rape.” And then there’s the reason Zuckerberg finally relented to calls that he testify: the scandal involving political-data firm Cambridge Analytica, wherein the Facebook data of tens of millions of people was harvested off the platform and allegedly used to help Trump’s campaign.
All of this happened on Zuckerberg’s watch—with virtually no federal laws that would have protected Americans’ privacy and the health of the country’s information ecosystem. Depending on what happens at Tuesday’s and Wednesday’s hearings, Facebook might not stay unregulated for very long. The only major legislative proposal currently drafted and seeking support is the Honest Ads Act, championed by Democratic Sens. Amy Klobuchar and Mark Warner and Republican Sen. John McCain. That bill would require political advertisements on Facebook to list who paid for them, just like political ads on radio, print, and television do, and would levy fines on social media companies that don’t follow the rules. On Friday, Facebook executives said in a blog post that the company does support the Honest Ads Act, although Zuckerberg told Wired last month that he doesn’t expect that bill to pass. Considering the public blowback his company is currently facing, that legislation might be the easiest thing for Congress to pull off the shelf to show Americans that Facebook won’t pass through this controversy without consequences.
In broad strokes, most of the concerns members of Congress raise will likely center on questions of user privacy, election integrity, and the company’s indisputable impact on how Americans communicate and get their news. (Some Republicans will probably also ask whether Facebook is in some way biased against conservatives.) Throughout both hearings, lawmakers will likely highlight reporting on how Facebook’s internal culture and policies reveal that the company was well aware of how its platform was being misused, yet decided not to take substantial steps to fix it. All the while, Facebook has remained one of the most valuable companies in the world, with control over how billions of people get information and stay connected to their personal and professional communities.
If the committee members really want to move the ball forward in terms of what we know about Facebook, here are some questions they might consider asking Zuckerberg while he’s under oath.
Facebook collects a tremendous amount of data on its users—and Facebook owns that data, which it shares with all kinds of third parties through all kinds of partnerships. And the data Facebook collects on users isn’t limited to data pulled from Facebook: There’s also information from Facebook’s “like” buttons installed on websites across the internet, data Facebook lets advertisers bring on board in order to target Facebook users with more precision, and call and text message history that Facebook pulls off people’s phones. And that data has been used to target people in pernicious ways, such as housing ads programmed to reach people based on their race, or job listings targeted to people based on their age—discriminatory practices that potentially violate civil rights laws.
Lawmakers could ask Facebook how much data it really needs on users in order to make its targeted-advertising system work. Is it really necessary for the company to see every single website someone visits, or have a list of everyone a user calls or texts, in order to help advertisers personalize a wedding dress ad for a single woman in a long-term relationship who follows Facebook pages of bridal magazines? (Probably not!) There’s also the question of how long Facebook and other internet companies need to store data on each user. If I’ve been on Facebook for 10 years, does the company really require information on the Air Jordans I was eyeing 10 years ago? Or the doctor’s offices I looked up? Congress members should think about what kinds of data-minimization practices Facebook could adopt and how Facebook can take technological steps to anonymize the data it does hold. Apple, for example, uses a technique it calls “differential privacy,” which adds statistical noise to the data it collects so the company can learn about user behavior in aggregate without being able to identify any specific user. That could be a model Facebook could emulate. Or maybe there’s another, better one.
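To make the idea concrete, here is a minimal sketch of “randomized response,” one of the oldest and simplest differentially private mechanisms. This is an illustration of the general principle—noise added to individual answers, with the true rate recovered only in aggregate—not a description of Apple’s actual implementation; all function names and numbers here are hypothetical.

```python
import random

def randomized_response(truth: bool, p: float = 0.75) -> bool:
    """Report the true answer with probability p; otherwise flip a coin.

    Any single user's report is plausibly deniable, so no individual
    can be identified from their answer alone.
    """
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports: list, p: float = 0.75) -> float:
    """Invert the noise: observed_rate = p * true_rate + (1 - p) * 0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p) * 0.5) / p

# Simulate 100,000 users, 30% of whom have some sensitive attribute.
random.seed(0)
truths = [i < 30_000 for i in range(100_000)]
reports = [randomized_response(t) for t in truths]
print(round(estimate_true_rate(reports), 2))  # close to 0.30
```

The point for regulators: the aggregate statistic an advertiser might care about survives, while the raw per-person record never has to be stored.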
There’s more: Currently, when any of the data that Facebook collects escapes Facebook’s walled garden—whether through developers breaching their contract with the company or through hackers who defeat Facebook’s security—there’s no federal law that requires Facebook to alert the customers who are affected. That data could end up on the black market or, as happened with Cambridge Analytica, in the hands of political operatives that users never consented to sharing their personal information with. Facebook didn’t alert users that Cambridge Analytica was in possession of wrongfully obtained data, even though it knew about it for years, and although the company says it’s now going to alert everyone affected, without a federal data breach notification requirement, there’s nothing preventing the company from deciding to keep users in the dark again. Elected officials should prod Zuckerberg for more information about why the company failed to contact its users, and see if he has any sound arguments for why the federal government shouldn’t force him to do so in the future.
Although Zuckerberg said Friday that his company does support the bipartisan bill that would require online political ads to disclose who paid for them, that’s really just a tiny, initial step in terms of the kinds of policies Facebook needs in order to keep the integrity of our elections and political discourse intact.
Facebook is still fighting a battle against fake news and foreign interference that has served to stoke political divides and rile American voters over some of the most polarizing social issues in the country—often thanks to fake accounts cooked up in Russian office buildings spewing memes and ads targeted to Americans on police brutality, immigration, religious freedom, and abortion. Many of the Russian pages and posts made by Kremlin-backed operatives have aimed to deter voter turnout, and Congress should demand an investigation into whether any of that actually worked. With all the data that Facebook collects on users, it should be in a good position to be able to conduct such an assessment.
Creating laws about what is and is not “real news” is tricky, since a clumsily worded law could threaten to stifle parody and other important forms of political speech protected under the First Amendment. But still, broadcasters have been subject to rules aimed at ensuring people have access to the political information needed to vote and engage in political life—like the fairness doctrine, which required broadcasters to devote some programming to controversial political issues, and which required that when a broadcaster endorsed a candidate, the news outlet give airtime to the other candidates in the election. The idea was, in part, that if broadcasters use public airwaves, they have a responsibility to make sure the public’s information needs are met. The fairness doctrine was abolished in the 1980s, and it wouldn’t map onto Facebook anyway, since the company doesn’t transmit over the broadcast spectrum. But lawmakers should ask Zuckerberg how his company plans to be accountable to its millions of American users and consider policy remedies to encourage—or require—Facebook to act in ways that aren’t harmful to the public interest.
To that end, Facebook profited off the spread of news that it knew was fake and off advertising that profiled people in malicious ways, and Congress should push Facebook to explain how it justifies this kind of activity. Lawmakers should also demand clarity on how Facebook’s news feed works: How does fake news go viral and how does Facebook decide what people see or don’t see when on the website? In a similar vein, Zuckerberg should explain Facebook’s advertiser review process and what his company is doing to make sure the ads people see aren’t laced with deceptive messaging, especially with political ads.
Finally, Mark Zuckerberg needs to account for Facebook’s past behavior. It’s unacceptable that for years the company allowed app developers to cart off data of users who didn’t consent to it, and it’s not clear that Facebook even complied with the privacy rules the Federal Trade Commission imposed on it in 2011. The FTC is now investigating Facebook for potentially breaking rules that required the company to get explicit consent from users before sharing their data. Even if Facebook did get users’ OK through the terms of service they agreed to, the current consent regime on the internet is broken: No one reads the terms of service, and anyone who objects to any part of a company’s contract has only two options—accept the terms regardless or don’t use Facebook at all. Questioning the company’s past blunders won’t necessarily lead to new regulations, but it will shine light on whether its leadership can be trusted to be stewards of our personal data now—and whether it can be trusted to carry that responsibility in the years ahead.
April Glaser is a Slate technology writer and co-hosts the podcast If Then.
Slate · by April Glaser · April 6, 2018