Emily and Alex talk to UC Berkeley scholar Hannah Zeavin about the case of the National Eating Disorders Association helpline, which tried to replace human volunteers with a chatbot--and why the datafication and automation of mental health services are an injustice that will disproportionately affect the already vulnerable.
Content note: This is a conversation that touches on mental health, people in crisis, and exploitation.
Hannah Zeavin is a scholar, writer, and editor whose work centers on the history of the human sciences (psychoanalysis, psychology, and psychiatry), the history of technology and media, feminist science and technology studies, and media theory. Zeavin is an Assistant Professor of the History of Science in the Department of History and the Berkeley Center for New Media at UC Berkeley. She is the author of "The Distance Cure: A History of Teletherapy."
References:
VICE: Eating Disorder Helpline Fires Staff, Transitions to Chatbot After Unionization
… and then pulls the chatbot.
NPR: Can an AI chatbot help people with eating disorders as well as another human?
Psychiatrist.com: NEDA suspends AI chatbot for giving harmful eating disorder advice
Politico: Suicide hotline shares data with for-profit spinoff, raising ethical questions
Danah Boyd: Crisis Text Line from my perspective.
Tech Workers Coalition: Chatbots can't care like we do.
Slate: Who's listening when you call a crisis hotline? Helplines and the carceral system.
Hannah Zeavin: The Third Choice: Suicide Hotlines, Psychiatry, and the Police
New York Times hype: Using "AI" to diagnose mental disorders based on voice recordings, April 2022
Fresh AI Hell:
ChatGPT for city government work
https://www.japantimes.co.jp/news/2023/06/06/national/yokosuka-adopts-chatgpt/
ChatGPT as manuscript pre-screener for publishing companies:
https://www.publishersweekly.com/pw/by-topic/digital/content-and-e-books/article/92471-ai-is-about-to-turn-book-publishing-upside-down.html
https://twitter.com/jgnoelle/status/1665729550922252288
A lawyer tried to use ChatGPT for “research”:
https://storage.courtlistener.com/recap/gov.uscourts.nysd.575368/gov.uscourts.nysd.575368.32.1.pdf
Legal scholar (and former MAIHT3K guest) Kendra Albert breaks down the story about a lawyer using ChatGPT for court proceedings:
https://dair-community.social/@kendraserra/110441210244994168
ChatGPT as investment advisor:
https://venturebeat.com/ai/jpmorgan-plans-for-a-chatgpt-like-investment-service-are-just-part-of-its-larger-ai-ambitions/
Professor asks "ChatGTP" whether it wrote students' assignments, then gives them a 0 if it says yes.
https://twitter.com/Linkletter/status/1658591316991938560
You can check out future streams at http://twitch.tv/dair_institute.
Follow Emily at https://twitter.com/EmilyMBender and https://dair-community.social/@EmilyMBender
Follow Alex at https://twitter.com/alexhanna and https://dair-community.social/@alex
Music: Toby Menon.
Production: Christie Taylor.