You, Me, and All of Us

On that Facebook whistleblower interview and why it gives me (some) hope for this hellish industry

“There were conflicts of interest between what was good for the public and what was good for Facebook,” said Frances Haugen, a former Facebook product manager turned whistleblower, revealing her identity for the first time in a 60 Minutes interview that aired last Sunday. Last month, under an alias, she released internal research to lawmakers, regulators, and the news media, and filed a complaint with the US Securities and Exchange Commission, exposing that Facebook, to no one’s surprise, knew about the harms it was causing: misinformation, hateful content, and the ways Instagram had been worsening body image issues. In the interview, Frances went on to say something all too familiar to many of us working in tech: “I knew what my future looked like if I continued to stay inside of Facebook, which is, person after person after person has tackled this inside of Facebook and ground themselves to the ground.” She was part of the Civic Integrity team that was, unsurprisingly yet hauntingly, disbanded after the 2020 US presidential election and before the January 6 insurrection.

Like many of us, I’ve been struck by this story, not merely because it’s yet another report about Facebook, or because we’re all wondering whether that outage yesterday was caused by some kind of internal sabotage or an experiment to see whether more people would get vaccinated. I’m struck by it because of how much this story hits on the high-voltage wires that hide underneath not only Facebook but the industry at large. This story mirrors back to us what we already know as tech makers, the very thing that keeps us up at night: that there are clear conflicts between what’s good for the company’s pocket, what’s good for our individual pockets, and what’s good for the public. Maybe you feel the same lump in your throat that I feel as I write this, questioning whether it’s even possible to work in this industry without participating in harm of some kind. Honestly, with the way the industry is currently structured and incentivized, I’m not so sure that we can. But Frances Haugen’s story is a warning, one that gives me an unexpected kernel of hope, a hope I’d like you to take with you, consider, and explore in your own workplaces and product spaces, especially in this pressurized time of year when many of you are planning for 2022.

As tech makers, when it comes to determining what a tech product or company should do next, we are incentivized to look at the relationships involved, focusing on an equation of two entities: product and user, employer and employee, current and projected metrics, engineer and designer, you and me. But in the context of shaping technology, a relationship is not, and should not be, limited to two entities. There is a third entity: all of us. It’s the entity that is most significantly impacted by the individual decisions that you and I make as tech makers, impacts that ripple out across the world and rarely, if ever, boomerang back to our own workplaces. Enter Frances.

Throughout my career, I’ve seen an ‘all of us’ type of expression blast across marketing campaigns, conference stages, shiny happy employee videos, all-hands meetings, emails. But what I haven’t seen as much is the perspective of ‘all of us’ explicitly written on the page of our planning documents, budget proposals, product roadmaps, metrics analyses, design requirements, and performance review criteria. Too often, the perspective of ‘all of us’ lives and dies within the internal research reports that get overlooked. It lives within the heavy hearts of the researchers who must carry the burden and the risk (let’s not forget Timnit Gebru’s story) of trying to raise alarms and “influence stakeholders” to slow down and consider the perspective of ‘all of us’ when making decisions. (If you haven’t yet, today, and every day, would be a good time to demonstrate real support and appreciation for the researchers you work with.)

I, too, have been guilty of ignoring the ‘all of us’ perspective, naively assuming that a researcher, or someone, somewhere in the company, already thought about ‘all of us,’ that surely we wouldn’t be moving forward on this project if it was going to cause harm, right? Maybe you, too, have faced this same experience, assuming that ‘all of us’ has already been considered. Or maybe we’ve ignored this perspective because we don’t make the time to talk about it, because we’re too exhausted, because there’s been another re-org and we don’t know who our team is anymore, because it feels risky to talk about it, because it may threaten our own career growth and financial stability, because it feels uncomfortable to challenge the existing power structures, because it puts our very world into question, a world that skews this whole damn relationship by tricking us into believing that it’s okay to put profits above all of us. Whatever the case, in an unregulated industry, it’s not enough to assume that the perspective of ‘all of us’ is already taken into account; we must actively, and explicitly, make it known, seen, and heard.

Soon after The Wall Street Journal published details from the internal research reports that Frances Haugen released, including the story of a teenage girl who had spent three hours a day on Instagram and said that this had led her to develop an eating disorder, Facebook announced that the company would pause development of an Instagram experience for tweens. Adam Mosseri, Head of Instagram, stated, “We’re going to take the time to work with parents, policymakers, regulators, experts, to demonstrate why this project is valuable, and how it helps keep teens safe.”

Valuable to whom?  

To Facebook’s profits and power in the world? Yes. To the corporate career trajectory of the teams working on this product? Maybe. To the parents and tweens? Whew, it feels a bit bold to say that there’s a possibility of demonstrating value for them, particularly given what we know (and there’s still much we don’t know) about how the existing Instagram experience is already causing harm for younger generations. Are parents, policymakers, regulators, mental health experts (or really anyone who exists outside of the company’s incentive structure) involved in how Instagram is shaped today? And I’m not talking about quick user studies and surveys, but rather in-depth, seat-at-the-table discussions and decision-making. And if so, why isn’t this information shared with all of us? It shouldn’t take people like Frances Haugen taking on personal risk, nor should it take harm already being done, for ‘all of us’ (parents, policymakers, regulators, experts, and yes, the tweens themselves) to have a fucking seat at the table in determining how technology is shaped–especially products that have reached one billion monthly active users. Because at that point, it is no longer about how the technology is being shaped but about how our lives are being shaped.

Even if the product or service you’re working on doesn’t seem as controversial or broad-reaching as Instagram for tweens, 2022 presents itself as an opportunity for all of us to embrace a more transparent process, to budget the time and money to invite more external experts into our discussions, to listen, to let it inform our work, to let it take technology and the industry to a better place than where it is today. Make a plan to assess the gap between the ‘all of us’ sentiments that may exist within your marketing campaigns but not within your product discussions, documents, and roadmaps. Create an ‘all of us’ section at the top of your product, design, and engineering documents, one that outlines the benefits, risks, and unknowns that an external expert–one that exists outside of your incentive structure–could help you navigate. Review your OKRs, career ladders, and performance criteria with a magnifying glass, examining how the language and metrics could present a conflict of interest between what’s good for the public and what’s good for the company. Advocate with and for your research teams by not letting their reports get overlooked or ignored. And if this all feels impossible to explore in your current situation, and it very well might, if, as Frances Haugen puts it, you see person after person after person tackling this and grinding themselves to the ground, I hope you remember that your own health matters as you strategize your exit.

Today, Frances Haugen is scheduled to testify before Congress about Facebook’s impact on younger generations. I’ll be watching, hoping for no more oblivious, time-sucking questions like the one Senator Blumenthal asked Facebook, “Will you commit to ending Finsta?” As I watch, I’ll be wondering what this story will mean, not only for Facebook, but for the future of this wild industry, and more importantly, what it will mean for the tweens and teens who are growing up in this wild time. Let’s hope, for the sake of all of us, that we let this decade be known not for what has been scaled but for what has been tamed.

Check out The Tech Worker Handbook, created by Ifeoma Ozoma, a collection of resources for tech workers who are looking to make more informed decisions about whether to speak out on issues that are in the public interest.

Curious to read more about how to advocate for people in a profit-driven world? I wrote another piece about it, written in partnership with dscout and Hmnty Cntrd.

Have a burning question you want me to explore in this newsletter? Submit it here, anonymously.