Alison Holman on Misinformation and Your Mental Health (Ep. 245)
Alison Holman, Professor of Nursing at UC Irvine, joined Joe Miller to discuss how misinformation affects your mental health.
Dr. E. Alison Holman, PhD, FNP is a Professor of Nursing at the Sue & Bill Gross School of Nursing and the Department of Psychological Science at the University of California, Irvine. Her program of research examines early predictors of trauma-related health problems to inform the development of secondary interventions and prevent trauma-related morbidity and mortality. She has led several community-based studies of coping with collective traumatic life events (e.g., firestorms, terrorism) funded by the National Science Foundation and the Josiah Macy Jr. Foundation. After the September 11th terrorist attacks, her team conducted a 3-year prospective, longitudinal study of coping in a nationally representative sample to examine the psychological processes affecting mental and physical health following this national collective event. She received a Robert Wood Johnson Foundation Nurse Faculty Scholar award to expand her 9/11 study and examine stress-related genetic and biological processes that affect cardiovascular health, and she runs a multi-site longitudinal study on genetic susceptibility to stress in rehabilitation outcomes following stroke, funded by the National Institute of Nursing Research. Her team is now working on a large national study of coping with the coronavirus outbreak.
University of California, Irvine, 2020. Study Links Rising Stress, Depression In U.S. To Pandemic-Related Losses, Media Consumption. [online] Available at: <https://news.uci.edu/2020/09/18/study-links-rising-stress-depression-in-u-s-to-pandemic-related-losses-media-consumption/> [Accessed 12 October 2020].
Let’s end the cliché, “these things can’t happen overnight,” shall we? We’ve been talking about hate speech on the internet for over 20 years! Mark Zuckerberg has had a predilection for tolerating hate speech for as long as we’ve known him. Facebook has been a safe harbor for bigots since it launched.
Look at this:
March 18, 1999 — The New York Times posts a lesson on its Learning Network about hate speech. One of the questions asked students “whether or not the Internet should censor Web sites promoting [groups that promote hate speech].”
November 19, 2003 — The Harvard Crimson quotes two female students — one African-American and one Latina — outraged about Facemash, Zuckerberg’s first Frankenstein, created so students could rank each other’s attractiveness.
November 15, 2007 — Going on thirteen years ago, and nearly four years after Facebook’s launch. Christopher Wolf, a former chair of the International Network Against Cyber-Hate, writes a New York Times op-ed entitled “The Web Fuels Hate Speech.”
You know the rest.
Monika Bickert, Facebook’s VP of Content Policy, announced Monday that Facebook will no longer permit Holocaust denial on the platform. Well, it’s about time! The German Marshall Fund reports nearly 2 billion engagements with misinformation coming from Facebook pages — roughly three times the rate at this time in 2016.
Misinformation is more popular than ever.
Here’s the kicker, though — read all the way down to the last paragraph of Facebook’s announcement that it would now ban Holocaust denial: “Enforcement of these policies cannot happen overnight. There is a range of content that can violate these policies, and it will take some time to train our reviewers and systems on enforcement. We are grateful to many partners for their input and candor as we work to keep our platform safe.” — Monika Bickert, Facebook’s VP of Content Policy
With all due respect, Monika, it hasn’t been “overnight.” It has been over two decades of this. When Mark launched Facemash in 2003, hate speech on the internet was already a thing. Facebook launched in 2004. Wasn’t that enough time to put the right controls in place?
Fantastic, though, Facebook. Glad you’re awake. Now how about some of these other issues? You’ve set the precedent. By the way, why didn’t you flag President Trump’s false statement about his COVID status? Twitter did. He said he was immune because he’d had it. You really have to be kidding.
But Facebook’s not the only one. YouTube is still on the fence about whether it will ban QAnon, even though the group is “tearing people apart,” as Travis Andrews writes in the Washington Post. YouTube CEO Susan Wojcicki says the company is now labeling QAnon content “borderline.”
Borderline? Saying COVID-19 comes from people eating bat soup, or that Democrats are running a sex-trafficking operation, isn’t “borderline.” Claims with no basis in fact — no evidence in context — are misinformation. Period. Especially when they affect our discourse, the entire fabric of our society, as they have.
If it’s satire, flag it as satire. But lies designed to impact elections and incite violence are not “borderline.”
They aren’t free speech, either. It is impossible for hate speech to be free, just as it is impossible for enforced segregation — “separate but equal” — to be equal. That was the argument the great Thurgood Marshall made before the Supreme Court in Brown v. Board of Education, a holding even Amy Coney Barrett describes as a super-precedent, one of those exceptional cases that should never be overturned.
The same logic applies here. Brown applied the Fourteenth Amendment. Here we apply the First.
But let’s look at Microsoft. Microsoft’s making progress. The company won a court order to take down Trickbot, a botnet that posed a threat to election security.
So if companies like Facebook, Twitter, YouTube, and Microsoft want to do something overnight, they absolutely can. File the motion. Iterate quickly. Be agile. Isn’t that a buzzword now? Agile Project Management? Let’s get agile with this.