Will the Facebook Papers be the Catalyst for the Big Tech Reboot?

*Originally published at Tech Policy Press*

Are the Facebook Papers brought forward by whistleblower Frances Haugen (evidence that the company’s leaders repeatedly and knowingly ignored the public good while enabling harm and violence in pursuit of profit) enough to finally inspire the democratic action necessary to better align Big Tech with the public interest?

The Cambridge Analytica scandal didn’t do it. Previous whistleblower Sophie Zhang’s damning revelations about the way Facebook is used by governments to manipulate the public weren’t enough (for reasons explored in A tale of two Facebook leaks, an article in Columbia Journalism Review). Even the storming of the U.S. Capitol on January 6th, 2021 has failed, so far, to spur reform. Nothing has yet motivated the collective resolve to resist the threat to democracy posed by a tech industry bound only to its mandate to maximize shareholder value, regardless of its effects on society, and beholden to the whims and dictates of unaccountable CEOs.

Against this backdrop, a trio of Stanford University professors, Rob Reich, Mehran Sahami, and Jeremy M. Weinstein (a philosopher, a technologist, and a policymaker, respectively), builds a case in System Error: Where Big Tech Went Wrong and How We Can Reboot for citizen activism, inspiring public engagement to address the perilous state of our democracy. The authors argue convincingly for several positions: that democracy is demonstrably preferable to technocracy; that democracy’s role is to act as a guardrail against the worst outcomes; and that the tech industry cannot currently self-regulate, despite its insistence to the contrary.

The authors muse that democracy is the preferred form of government for a few reasons. Considering forms of governance against their capacities for “identifying and mitigating the harms and suffering we want to avoid,” they argue, “democracies have generally excelled: avoiding mass starvation; preventing nuclear war; eliminating extreme poverty and suffering.”

But this is not necessarily the view among tech executives. Reich shares an anecdote from a small dinner of unnamed Silicon Valley elites musing about a new state “to maximize science and tech progress powered by commercial models.” When Reich put the question of governance to the table, the response was telling: “Democracy? No. To optimize for science, we need a beneficent technocrat in charge. Democracy is too slow, and it holds science back.”

The authors contend that free expression and individual dignity, both central to democracy, are competing values that cannot be optimized for equally: “If a fervent commitment to free speech is threatening democracy by allowing misinformation and disinformation in our public square, it is also threatening individual dignity.” The Big Tech-enabled “marketplace of ideas” is hopelessly submerged in a flood of algorithmically amplified disinformation, where countering bad speech with good speech is as futile as un-yelling “fire” in a crowded theater, and efforts to correct misinformation are as pointless as an earnestly appended retraction after the stampede for the exits has already begun.

System Error lays out policy prescriptions to back up its theories:

  • Bringing technologists into the public sector as “tech teammates” through flexible mechanisms outside of formal civil service channels;
  • Reanimating the zombie Office of Technology Assessment (OTA), an office of the United States Congress founded in 1972 and defunded by Newt Gingrich in the mid-1990s, which could provide a channel for informing politicians and policymakers on technology, a role now played mostly by Big Tech lobbyists; and
  • Enabling “adaptive regulation” through so-called “regulatory sandboxes” to address the chasmic lag between the “move fast and break things” (or “race-to-MVP and lock in network effects”) pace of emerging technology and the successful passage of regulations.

A regulatory sandbox involves a proof of concept, “forking” the existing body of law most relevant to the innovation at hand, provisionally granting the innovator permission to deploy the technology while devising regulations for the new system for a year, and reconvening to weigh the benefits and drawbacks. This type of system could provide some of the agility that policymaking sorely lacks in a democracy that desperately needs to keep up with the existential threats against it, threats enabled by the very technologies it has been ill-equipped to promptly govern.

System Error suggests that public attention and participation are required to jumpstart the necessary policy changes required to safeguard democracy. Just as Industrial Revolution-era labor laws were sparked by the Triangle Shirtwaist Factory fire, or healthcare industry guardrails were partially inspired by the Tuskegee Experiment, are the revelations of the Facebook Papers the catalyst necessary to finally regulate the tech industry? 

Regulation is especially urgent because democracy is in so perilous a position, as the information ecosystem has already been so degraded by disinformation and polarization. The authors hint that things could get worse, thanks to synthetic media technologies such as OpenAI’s GPT-3, which may make it possible to flood the internet with a cacophony of machine-generated content.

But we don’t actually need GPT-3 (or similar tools that could soon be readily available) to supercharge and automate a deluge of disinformation. Facebook’s algorithms already amplify and render viral the paltry efforts of people exploiting the economics of social media, like the small cadre of Macedonian teenagers who churned out false stories ahead of the 2016 U.S. presidential election. Today’s volume of disinformation, algorithmically amplified, is already drowning democracy by flooding the public square with a targeted and artificially boosted firehose of bile. The number of bad actors can remain relatively low because the platforms amplify extremist messaging. If this is already happening without GPT-3-style tools, what happens when the automation comes from both propagandists and platforms?

Importantly, the authors emphasize that these problems are not just about Facebook, and not just about Mark Zuckerberg, because “there are literally hundreds of would-be Zuckerbergs who are in the pipeline.” 

Therefore, the time is now. If the Facebook Papers can be used to engage the public, perhaps we can transform the attention into collective action through our democratic institutions. The bright spots are ample: there is a bipartisan appetite for regulating Big Tech, though the political right and left are motivated by different values (contrast Elizabeth Warren’s larger push for stakeholder capitalism in her Accountable Capitalism Act with Josh Hawley’s concern about censorship of conservative voices). There is a brief window open during President Joe Biden’s administration during which there are motivated personnel in place to make moves toward meaningful regulation. Key appointments include Alondra Nelson as deputy national science advisor (also, notably, “President Biden elevated the national science advisor to the Cabinet for the first time”); Lina Khan, a specialist in antitrust and competition law, as Chair of the Federal Trade Commission; and, just announced last week, Meredith Whittaker, a core organizer of the 2018 Google walkout, as Senior Advisor on AI to the FTC; among others.

With reform-minded leaders in place, there are also substantial efforts underway to build a bench. There are new initiatives to encourage technologists to enter public service, from New America’s Public Interest Technology University Network to programs such as the Congressional Innovation Fellows, the Presidential Innovation Fellows, and the just-launched U.S. Digital Corps. These programs will create a path for people who want to work on these issues, and universities are training the talent. For instance, System Error’s authors at Stanford University created one of the most popular courses on campus, “Ethics, Public Policy, and Technological Change,” which will likely influence computer science curricula more broadly and potentially the overall education of future tech industry employees.

The Facebook Papers can be the catalyst for the public attention that System Error contends is required. The policy prescriptions for reining in Big Tech are persuasive; the bipartisan appetite to legislate anything at all amidst the gridlock should not be squandered. The personnel are in place. The public has been inspired. The Big Tech Reboot must come now.

Rebekah Tweed


Rebekah Tweed is a leader in Responsible Tech and Public Interest Technology careers, talent, and hiring trends. She is the creator of the Responsible Tech Job Board, the Program Director at All Tech Is Human, and the Assistant Producer of A BETTER TECH, the 2021 Public Interest Technology (PIT) Convention & Career Fair hosted by New York University and funded by New America’s PIT-University Network, where she manages the career fair and senior talent network and curates the job board and career profile gallery. Rebekah is also the Co-Chair of the IEEE Global AI Ethics Initiative Editing Committee and a member of the Arts Committee. Previously, Rebekah worked as the Project Manager for NYC law firm Eisenberg & Baum, LLP’s AI Fairness and Data Privacy Practice Group, where she examined technology’s impact on society, organizing and promoting virtual events to build public awareness around algorithmic discrimination and data privacy issues in New York City and beyond. Connect on LinkedIn and Twitter.


What does the Responsible Tech ecosystem look like?

Observations from Rebekah Tweed, Program Director at All Tech Is Human

Originally tweeted by Rebekah Tweed (@_bekah_) on October 1, 2021.

I’m the Program Director at All Tech Is Human and also Assistant Producer of A BETTER TECH, a public interest technology convention & career fair coming up on Oct. 14-15, hosted by NYU and supported by New America’s Public Interest Technology University Network. I’ve been studying Responsible Tech careers, talent, and hiring trends for about a year, since I created the Responsible Tech Job Board.

I’m going to reflect briefly on the Responsible Tech ecosystem through the lens of what I’ve learned over the past year sifting through the more than a thousand jobs I’ve posted from hundreds of employers. These jobs are focused on limiting the harms of technology, diversifying the tech pipeline, and ensuring that technology is aligned with the public interest.

Learnings from the job board and conversations with hiring managers 

  • Our Responsible Tech Job Board curates and surfaces unique roles that were previously hard to find.
  • Social change happens from the inside, from the outside, and through reimagining something new. The Responsible Tech movement is advancing within industry, through outside advocacy groups and research organizations, and through the launch of alternative tech models.

I want to start by highlighting industry, which is a key component of the overall ecosystem. Most big tech companies have by now built in-house departments around responsible tech under various names like Responsible Innovation, Ethical AI, Responsible AI, Ethical and Humane Use of Technology, Algorithmic Accountability, etc. Over the past year, I’ve seen these departments grow and, in some cases, balloon (and in one case, implode); but interestingly, more and more companies outside of tech are building these types of teams too as they increasingly amass data and utilize AI. Senior talent will be in increasingly high demand to fill these roles.

I want to point out some of the recent responsible tech job openings that caught my eye because they aren’t within traditional tech companies.

For instance, in the recent past:

  • American Express hired a Director of Data Ethics.
  • Mayo Clinic just hired a Senior Data Science Analyst for AI Ethics.
  • PayPal hired a Head of Responsible AI.
  • Workday is currently looking for a Machine Learning Trust Program Leader.
  • BP hired a Digital Science Tech Associate for Digital Ethics.
  • H&M has a robust Responsible AI and Data team and hired a few roles last year.
  • Walmart recently hired a Senior Director of Digital Values.

At the very least, this is a signal of the maturation of the responsible tech field as it’s developing beyond traditional tech companies to include corporations in other sectors. Almost all of these companies start with senior hires but then have to staff up over time to actually build out these teams, so I anticipate many more opportunities in the coming months at every level. 

Responsible Tech Roles Across the Ecosystem

Of course, different parts of the professional ecosystem appeal to different people: some people absolutely do not want to be a part of the tech industry, while others would rather avoid academia, government, or particular NGOs. But because the field benefits from principled and passionate people at every outpost in this ecosystem, I make every type of opportunity to get involved in the field available on the Responsible Tech Job Board.

This is why you’ll find postings from NGOs like the World Economic Forum and the U.N. alongside tech roles in local governments and various mayors’ offices; opportunities in the U.S. federal government like TechCongress and the Presidential Innovation Fellows; jobs at tech policy think tanks like the Brookings Institution’s Center for Tech Innovation; and openings at the many nonprofits like the Algorithmic Justice League, ADL Center for Tech & Society, and New America, as well as roles with the philanthropies that fund the work, like the Patrick J. McGovern Foundation, Omidyar Network, and the Ford Foundation. These sit alongside ethical AI start-ups like Parity, Spectrum Labs, Fiddler, Virtuous, and Anthropic. Global consultancies also serve a very important function in the ecosystem, so you’ll find roles from firms like Deloitte, PwC, EY, Accenture, and Avanade. There are also tons of academic roles, as professors and within university research institutes; and yes, large multinational corporations and big tech companies as well.

Even though it’s a relatively nascent field, there is already a very robust ecosystem, so I try to touch on all of it with the job board. 

Organizations within the Ecosystem

Industry is a vital part of the ecosystem, but honestly the countless non-profit organizations doing work on the ground are the beating heart of Responsible Tech. All Tech Is Human just released a searchable list of hundreds of organizations and their resources in the Responsible Tech Guide, to make it easier for people to make sense of the entire field and see where they themselves can most effectively get involved.

Much of our programming is built around clarifying these murky pathways into the field of responsible tech and empowering people to carve out a specific path for themselves, because formalized paths have not yet fully developed. The field is no longer in its infancy, by any means, but it’s still very malleable, and everyone participating within the field right now is a part of firming up these pathways that will eventually feel like foregone conclusions.

The fact that there isn’t one single preferred path can be a challenge for people planning the next step in their careers — or the first. In fact, the most common hurdle named by the more than 200 mentees in our inaugural mentorship program is “Knowing where to start.”

Academia and Responsible Tech

The interdisciplinary nature of the field can also pose a challenge in a university setting, where colleges within universities are siloed and often structured as competitors or incentivized to be territorial. This is something I’ve found remarkable about New America’s Public Interest Technology University Network (PIT-UN), which is funding my work at NYU with A BETTER TECH. The PIT-UN is working to incentivize collaboration across 43 universities and counting, and across schools within universities that typically compete for resources: funding, job placements, students. PIT-UN is effectively funding collaborative projects that encourage knowledge sharing and joint efforts for the purpose of building the entire field.

For All Tech Is Human, as a part of the more than 200-page Responsible Tech Guide that we released on September 15th, we put together a whole document focused just on university research institutes; we’ve featured 67 of them, and more are forming all the time. These programs straddle disciplines, housed in schools as varied as computer science, engineering, information science, sociology, philosophy, arts & humanities, law, policy, design, and business, across entire universities.

In some cases, these Research Institutes are the point of entry for a University to begin offering graduate certificates or a new degree program, perhaps in response to the strong demand from students for these kinds of academic offerings.

Tech Journalism’s Role in the Ecosystem

At this point, I’d like to acknowledge the incredibly important role of journalists in developing this ecosystem, because these are the people raising public awareness about these technologies and their effects on society. Their articles invite the general public to think critically and reach an audience that academic research alone just can’t.

Tech journalism helps move the needle culturally, not only forcing industry to account for shortcomings at the risk of public backlash, but also nudging institutions of higher education to include course offerings and degree programs that match the interests and values of their student bodies, which are shaped in part by the cultural conversations driven by journalists.

(My own flashpoint was Kashmir Hill’s bombshell for The New York Times in January of 2020, “The Secretive Company That Might End Privacy as We Know It,” about a then-unknown start-up, Clearview AI, which solidified an interest in the field that was already forming out of my own experiences.)

Your Role in the Responsible Tech Ecosystem

The responsible tech ecosystem is full of people with unconventional and non-linear backgrounds, so don’t be afraid of getting involved if that’s you. All Tech Is Human is focused on filling in the gaps: helping people get from whatever point they find themselves at right now, with whatever experience and background from whatever part of the world, to finding opportunities, engaging in some volunteer experience, and matching with mentors who can give valuable guidance, and ultimately getting plugged in with opportunities to build a network (because somewhere in our 1,900-member Slack, you’ll find what and who you need). I see these connections being forged every day. I found my first paid job in the field through the All Tech Is Human community, as well as my second job, and my third. This is the foundation of our programming: helping people find the information that they need and the community that they’re seeking, in order to build a network and gain opportunities to develop a lasting career in Responsible Tech.