In this episode, host Angeline Corvaglia explores the concept of parasocial AI with Sonia Tiwari, an AI consultant and parasocial learning researcher. They delve into the emotional bonds people form with AI characters and chatbots, discussing the origins and implications of these one-sided relationships. The conversation covers how AI chatbots can be designed to appeal to users, the potential mental health risks, and the need for responsible usage of AI, especially among children and teenagers. They emphasize joint media engagement and the importance of caregiver and educator awareness to mitigate risks.
00:00 Introduction and Guest Introduction
01:03 Understanding Parasocial Relationships
03:40 Parasocial Interactions and AI
07:50 Designing AI Characters
13:06 Ethical Concerns and Safeguards
15:36 Practical Advice for Parents and Educators
20:59 Conclusion and Final Thoughts
Special thanks to Sonia Tiwari for taking the time to be a part of this episode!
Follow Sonia on LinkedIn https://www.linkedin.com/in/soniastic/
Episode sponsored by Data Girl and Friends
They are on a mission to build awareness and foster critical thinking about AI, online safety, privacy, and digital citizenship. Through fun, engaging, and informative content, they inspire the next generation to navigate the digital world confidently and responsibly, empowering young digital citizens and their families.
Find out more at: https://data-girl-and-friends.com/
List of Data Girl and Friends materials about some of the topics covered:
- Educational workbook, “What is AI?”
- Educational workbook, “Algorithm Insights Adventure” https://data-girl-and-friends.pagetiger.com/algorithm-insights-adventure/1
- Educational video, “What does AI understand?” https://vimeo.com/916925937
Contact us for any questions or comments: https://digi-dominoes.com/contact-us/
Transcript
Harry is my friend. And so when something bad happens to Harry, you feel emotionally invested in the story: oh, how is he going to recover from it? Or, you know, when Dumbledore dies, you feel as if your own teacher, a loved mentor, passed away. So my first introduction to that kind of emotional connection was in a good way: you feel emotionally attached to characters, and it's one-sided.
The other […] Anything can be weaponized. […] Liz Lemon is like literally my […] characters who might be your […] it actually started in like […] it's still one-sided, right? […] And there's even like a […] you're being valued. And it's […] hard. So this becomes like a […] Why not just talk to someone, even if it's a fictional chatbot, that would make you feel comfortable about whatever you're going through? And so now parasocial interactions have taken on a whole new level as a mental health issue, which is what we learned in the news story that we discussed a while ago, right?
Yes, this is a chatbot. Even […] is next level. How do they manage to make people get emotionally attached even though they know that it's not real? Is that something that's built into it? How does that work?

Yeah, so some of it is by design. Like I said, anything can be weaponized, right? So with good character design principles, one part is visual.
So if it's […] things, for example, for […] when you see something cute […] they have this like nurturing […] capture attention. In Japan, the […]

So the same concept, going back to that earlier example: you know, social media can be used by a climate activist for doing something good, and social media can be used by a cosmetic company to sell crap to a bunch of teenagers. And this is just an example of one type of character for a single age group.
[…] for teenagers, as in that […] for everyone. It's like, […] know what I understand. You […] Do other things, stay focused on a purpose. We're trying to solve a complex systemic problem with simplistic solutions. By saying, well, what were the parents doing, the parents should pay more attention — well, that's a simplistic solution to a systemic problem, right? It's like, parents are not even aware.
School systems are not aware. […] don't know I did that because […] interested in what it might […] a child's conversation. And […] purpose in mind, that if you are […] uh, in the form of an AI chatbot to write a history report, do it with a class, with your teachers; ask questions about things you learned in a textbook that you would like to hear responses on. So: some kind of context, some kind of purpose, and some kind of community, a trusted adult who is like constantly monitoring this whole thing.
That's this […] will have to do before we […] you can give the character a […] that person's voice. They can […] are much more powerful than […] talked about the child saying, "Why can't you be my…" I played that video for my son, and I also shared the news story about the teenager and Character AI, because he has been observing secondhand, listening to me research and talk about AI. I have demonstrated some of the math tutorials with, um, ChatGPT Omni, with the video and voice.
demonstrations that. He was […] Character AI. She said that […] hard to protect your kids or […] is. There's a huge gap, and […] doesn't take very much time. It really takes half an hour max to understand what the feel of this thing is. And then you can have conversations about it, and that's already the beginning; then slowly you can maybe even learn together about it. But just knowing what they're doing and talking to them about it is a huge, huge first step.
[…] of your head for yourself as […] that. But a response is a more […] is the perfect way to close. Please let us know what you think about what we're talking about in this episode and the others. Check out more about us and subscribe at digi-dominoes.com. Thank you so much for listening. I'd also like to thank our sponsor, Data Girl and Friends. Their mission is to build awareness and foster critical thinking about data […] and digital citizenship through fun, engaging, and informative content.