Mark Zuckerberg has made many apologies on privacy – Vox

Back in October 2003, then-college freshman Mark Zuckerberg exploited lax computer security in Harvard’s online dorm directories (known as “facebooks,” after the physical books of students’ headshots that were distributed on campus in the pre-digital era) to assemble a vast collection of photos of students’ faces, which he used as raw material for a web project he called Facemash. Facemash was essentially a clone of the then-popular Hot or Not site: it showed users the photos of two students side by side and asked which one was more attractive.

The project, intended as more of an inside joke than a wide-release software product, spread virally through the student community, where it quickly prompted outrage and was taken down. On November 3, 2003, Zuckerberg got notice that he would have to appear before a disciplinary panel (“Ad Board,” short for administrative board in Harvard jargon), and on November 19, he was cleared to continue attending school after apologizing and promising essentially not to do it again.

“Issues about violating people’s privacy don’t seem to be surmountable,” he wrote in an email statement to the student newspaper, the Crimson. “I’m not willing to risk insulting anyone.”

Fifteen years and many billions of dollars later, Zuckerberg is one of the wealthiest and most powerful people in the world. And he again finds himself hauled before a disciplinary body, this time the United States Congress, in order to once again apologize for reckless conduct and cavalier violations of privacy. And odds are that, once again, he will not face any concrete consequences beyond a solemn promise to do better.

Zuckerberg isn’t responsible to anyone for anything
In a full-page ad that ran in a number of national newspapers in the immediate wake of the Cambridge Analytica scandal’s initial revelation, Zuckerberg wrote that “we have a responsibility to protect your information. If we can’t, we don’t deserve it.”

The theme of responsibility is one he returned to several times during the interview with Vox’s Ezra Klein that he sat for as part of his crisis communications push:

“Our responsibility here is to make sure that the time that people spend on Facebook is time well spent,” he said, as opposed to simply being a lot of time.
“I do think a big responsibility that we have is to help support high-quality journalism,” he observed in answer to a question about Facebook ad targeting’s impact on the media industry.
“There’s no doubt that our responsibilities to amplify the good parts of what people can do when they connect,” he said when asked about Facebook’s role in amplifying anti-Rohingya propaganda in Myanmar, “and to mitigate and prevent the bad things that people might do to try to abuse each other.”

The theme of responsibility recurred in his congressional testimony:

We didn’t take a broad enough view of our responsibility, and that was a big mistake. It was my mistake, and I’m sorry. I started Facebook, I run it, and I’m responsible for what happens here. So now we have to go through every part of our relationship with people and make sure we’re taking a broad enough view of our responsibility. It’s not enough to just connect people, we have to make sure those connections are positive. It’s not enough to give people a voice, we have to make sure people aren’t using it to hurt people or spread misinformation. It’s not enough to give people control of their information, we have to make sure developers they’ve given it to are protecting it too. Across the board, we have a responsibility to not just build tools, but to make sure those tools are used for good.

It is in some ways genuinely refreshing to hear a CEO speak this way, in terms of responsibilities and moral obligations that transcend the narrow dogma of shareholder value and Milton Friedman’s shallow remark that “the social responsibility of business is to increase its profits.”

But on another level, no matter how many times Zuckerberg says he’s responsible for this or that, it doesn’t change the fact that he’s not actually responsible to anyone for anything. And that’s the problem.

King Zuckerberg’s corporate dictatorship
When Facebook staged its initial public offering six years ago, it implemented a dual-class share structure under which Zuckerberg personally controls a majority of the voting stock even though other investors own the majority of the company’s financial value. In regulatory terms, Facebook is a “controlled company,” exempt from certain standard investor protections in exchange for disclosing fully that anyone buying Facebook stock is buying into a controlled enterprise.

“One of the things that I feel really lucky we have is this company structure where, at the end of the day, it’s a controlled company,” Zuckerberg told Klein earlier this month. “We are not at the whims of short-term shareholders. We can really design these products and decisions with what is going to be in the best interest of the community over time.”

This truly is a powerful privilege, and one that Zuckerberg has probably made some personal financial sacrifices in order to obtain, since the dual-class structure likely depresses the value of Facebook stock somewhat. But you can see that he wields this privilege in some ways as a rhetorical bludgeon.

In a more conventionally structured company, he would be genuinely responsible to the board of directors and to the shareholders to make them money. And it would then be obvious that to the extent the interests of the shareholders clash with those of “the community” — a community that, for all intents and purposes, includes the entire population of the developed world — it’s the responsibility of the community’s elected representatives in Congress to make rules that align those incentives.

Instead, Zuckerberg claims that precisely because he’s not responsible to shareholders, he is able instead to answer his higher responsibility to “the community.”

And he’s very clear, as he says in interview after interview and hearing after hearing, that he takes this responsibility very seriously and is very sorry for having violated it. Just as he’s been sorry ever since he was a first-year college student. But he’s never actually been held responsible.

Responsibility implies consequences
Online social networks obviously pose some novel legal and regulatory issues. But broadly speaking, the question of how to ensure that companies discharge their responsibilities is not a brand new one.

Companies involved in the provision of health care are responsible — not just morally but legally and financially — to abide by the terms of the Health Insurance Portability and Accountability Act of 1996. That law hasn’t eliminated all privacy violations in the health care space, by any means, but when violations occur, they are punished, and the punishment gives actors in that space real reason to avoid them. Financial institutions, similarly, must comply with the privacy rules set out in the Gramm-Leach-Bliley Act. GLBA compliance has thus become its own somewhat tedious mini industry, with lawyers and specialized GLBA compliance firms you can hire.

Similarly, Congress has traditionally recognized the role of journalism in American society. The US Postal Service delivers periodicals at a discount rate, and the Federal Communications Commission’s television station licensing requirements include a vague but meaningful “public interest” standard that is generally held to require both the production of local newscasts and the airing of major national news events. A station that too flagrantly violated these norms would be accountable, legally and financially, to a regulatory body.

Enough is enough
Once upon a time, the US government wisely believed that it would be a bad idea to subject promising young internet startups to the bureaucratic morass involved in things like HIPAA or GLBA compliance.

But the young internet startups are all grown up now, and can easily afford to hire vast armies of lawyers and compliance experts who will help them avoid breaches that lead to massive fines. There is no longer a need to treat Facebook like a delicate flower whose agility will vaporize if it is held legally accountable for its actions.

That means disclosure rules for advertising, it means financial consequences for privacy violations, it means firm antitrust action to restrain further acquisitions and try to uphold some semblance of competition in this marketplace, and it means taking a close look at whether the development of ever more sophisticated ad targeting algorithms is being done in a way that serves the public’s interest in creating a robust media infrastructure.

Fifteen years ago, Harvard’s Ad Board faced a bright kid who seemed well-meaning despite a serious fuck-up. Today, Congress faces a billionaire corporate titan whose recklessness has dire consequences for the lives of millions of people around the world. It’s time for real responsibility.

Vox · by Matthew Yglesias · April 10, 2018
