388. AI, algorithms, and social media: How to protect your child from digital harm and advocate for change with Imran Ahmed

Listen on Apple Podcasts
Listen to this Episode on YouTube

Imran Ahmed, founder and CEO of the Center for Countering Digital Hate, joins me to talk about how social media platforms, algorithms, and AI systems are designed and what that means for our children’s mental health and safety.

Together we explore:

  • How social media algorithms are built to maximize attention, and why emotionally extreme content is often amplified.
  • What research reveals about how quickly self-harm and eating disorder content can be served to young users.
  • How AI platforms can respond dangerously to vulnerable teens when guardrails are not properly in place.
  • Why this is not just a “screen time” issue, but a systemic design and accountability issue.
  • The difference between pulling the “emergency brake” and creating meaningful long-term change.
  • What parents can realistically do at home to build digital resilience, foster trust, and partner with their children in navigating online spaces.

This episode isn’t meant to create more fear, but to offer greater clarity. My hope is that parents walk away feeling informed, empowered, and better equipped to both advocate for safer systems and strengthen the relationship that ultimately protects kids most: the one they have with you.

LEARN MORE ABOUT MY GUEST:

🔗Center for Countering Digital Hate 

FOLLOW US ON SOCIAL MEDIA:

📱IG: @counterhate  FB: Center for Countering Digital Hate  YouTube: @CCDHate 

📱IG: @drsarahbren YouTube: Securely Attached 

ADDITIONAL REFERENCES AND RESOURCES:

💻Protecting Kids Online – Download the guide for parents

🔗Deadly by Design: TikTok pushes harmful content promoting eating disorders and self-harm into young users’ feeds 

🔗Fake Friend: How ChatGPT betrays vulnerable teens by encouraging dangerous behavior 

🔗Resist and unsubscribe – Scott Galloway 

👉 Parenting in the age of AI, algorithms, and constant connectivity can feel like a lot. If you’re feeling unsure, reactive, or overwhelmed, Upshur Bren Psychology Group offers therapy and parent coaching to help you feel grounded, clear, and confident as you support your child and manage your own stress. Schedule a free 30-minute consultation or go to upshurbren.com to learn more and find the right support for your family.

CHECK OUT ADDITIONAL PODCAST EPISODES YOU MAY LIKE:

🎧Listen to my podcast episode about secure attachment vs. social media with Dr. Miriam Steele

🎧Listen to my podcast episode about the do’s and don’ts for introducing screens to your toddler

🎧Listen to my podcast episode about teaching kids healthy tech habits free of guilt or power struggles with Ash Brandin

🎧 Listen to my podcast episode about the hidden dangers of EdTech with Andy Liddell

🎧 Listen to my podcast episode about rewiring the way our kids interact with screens with Alé Duarte

Click here to read the full transcript

Four tweens standing together outdoors, all looking at smartphones, reflecting how social media captures children's attention in groups.

Imran Ahmed (00:00:00):

I don't agree with it at an ideological level. I don't agree with bans on social media. I don't agree with age limits necessarily, but I think they're like an emergency brake. You pull it when something's gone terribly wrong.

Dr. Sarah Bren (00:00:18):

Parenting in today's digital world can feel overwhelming. With emerging technology like AI adding to the already fraught world of social media, algorithms, and big tech, it's no wonder that parents can feel like there is no way to protect their children from online harm. Hi, I'm Dr. Sarah Bren, a clinical psychologist and mom of two, and I'm the host of the Securely Attached podcast. Each week I sit down with leading experts in psychology, medicine, neuroscience, and child development to translate complex clinical research into practical, grounded parenting insights that you can use in your daily life. And this week I'm joined by Imran Ahmed. Imran is the founder and CEO of the Center for Countering Digital Hate, leading efforts to hold social media and AI companies accountable when their systems harm children and undermine our shared civic life. He is a recognized authority on the social and psychological impacts of digital platforms and the manipulation of our information environment.

(00:01:18):

He regularly appears in the media and in documentaries and advises policymakers on common sense safeguards that put people first. In this conversation, we talk about how algorithms are built to capture attention and why emotionally extreme content often gets amplified. We also talk about what research reveals about how quickly vulnerable young users can be exposed to harmful material. We explore why this isn't just a screen time issue, but really a systems and accountability issue, and what meaningful change could actually look like. But most importantly, we talk about what parents can do: how to build trust instead of getting pulled into power struggles, how to partner with your child rather than policing them, and how to tap into one of the most powerful protective factors you have: your relationship.

(00:02:10):

Hello, Imran. It's so wonderful to have you here. Welcome to the Securely Attached podcast.

Imran Ahmed (00:02:23):

Hi, it’s lovely to be with you.

Dr. Sarah Bren (00:02:26):

I'm very, very much looking forward to this conversation with you. To start us off, could you tell us a little bit about the evolution of the work that you do at the Center for Countering Digital Hate, but also your entry into this world?

Imran Ahmed (00:02:45):

Yeah, well look, first of all, thank you for having me on. It's lovely to be talking to you and to your audience. The work that I do at CCDH started in a moment of crisis and grief, actually. From our name you can tell, the Center for Countering Digital Hate, that hate was the original thing that we looked at. I was a special advisor in the British parliament, and my colleague Jo Cox, who was a 41-year-old mother of two, was shot and stabbed to death by a terrorist who'd been radicalized online. And in 2016, I kind of realized that something odd was happening to our society and to our world, in that social media and the internet had become the main place where we share information, where we establish what we call our social mores, our norms of attitude and behavior, where we negotiate our values, and even where we negotiate what we call the truth.

(00:03:53):

And those environments worked in different ways to the real world. The mathematics of what information gets fed to billions of people, rather than just seen by a few people, was completely different online. The economics were different, the algorithms were different, and that was reshaping our world. And that kind of got me into looking at this from both a mathematical and a sociological perspective, trying to understand it. And over time, as I started to explain to people what I was finding, other people would come to me and say, well, this is actually affecting so much more than just the amount of hate in the world and conspiracy theories and lies. It's affecting kids' mental health. And I myself am a fairly recent parent, my kids are under two, but still, I'm a new parent and it's something I've wanted all my life. I love kids, and I'm the eldest of seven kids as well, so I've always had that kind of brain of thinking about what's good for younger people. And that in particular, our work on kids, has really transformed my understanding of why it is that we need to take action on social media and what kind of action we need to take to protect our society and protect our future, because that's what our kids are. So that's us.

Dr. Sarah Bren (00:05:17):

That's critical, and I'm glad that you are doing this. I'm a mom, my kids are eight and six. I'm also a user of social media and other tech spaces and AI. And I'm plugged in very much to parents of kids of all ages in my work as a clinical psychologist. And so what I have been feeling myself, in my own home but also in just the milieu of parenthood, is this mounting sense of, yes, anxiety, but it feels bigger than that. It feels like it's bordering on overwhelm, on helplessness, almost to the point of such overwhelming helplessness that we're bordering on paralysis and maybe even apathy: this is too big, I can't stop it, I guess I can't do anything, so I tune out and avoid. Or it's, I don't know what to do. And so while I think my biggest goal for this conversation is to be very hopeful and give real agency to parents, I think we also have to name it: this feels so big, the speed at which all of this stuff is expanding around us, and it is so all-consuming of ourselves and our kids. Where do we begin?

Imran Ahmed (00:06:49):

So how about we start this way? I don't want this conversation to be one just about harms. I don't want to terrorize people. I want to talk about why it is that CCDH does work exposing harms. I spend most of my day seeing things that most people would not want to see, whether it is content that may inspire terroristic acts, or eating disorder content, or AI platforms that are willing to help a kid kill themselves. And these are grim and scary things. But the reason we do it is because information technology is so, so valuable. And I know that from my own life. I'm 47 years old. When I was three years old, my dad brought home, and I suspect it was stolen, he was a bit of a local hoodlum, a computer with 32 kilobytes of RAM. It was a BBC Micro, and I programmed in a language called BASIC. I used to write little programs on it. So I've always had computers around me. By the time I was 18, and in the UK medicine's an undergraduate degree, so I went to medical school, my medical encyclopedia was on a CD-ROM, and I thought that was a miracle, like a single disc.

Dr. Sarah Bren (00:08:20):

Yes, I remember CD-ROMs.

Imran Ahmed (00:08:21):

And so I had this CD-ROM and I could see my anatomy stuff and my physiology and my microbiology. It was really exciting. Six years later, by the time I went to Cambridge University to study social and political sciences, my college was fully networked with high-speed internet. And Cambridge looks a bit like Harry Potter anyway, with all those spires and ancient buildings. And then you've got this thing where you click a button and it's like a digital owl goes and fetches any book you want. That's how Cambridge's online library works. It brings it back and you're like, oh my God, I'm not sure if I'm Harry Potter or if I'm Iron Man, but this is so cool. I have access to all the knowledge. And that's why it's great, because it opens the world up.

(00:09:15):

And social media at its best, in its original promise, was meant to make the world so small you could fit it on a screen, so that you could speak to someone from Tulsa or Timbuktu or a town in Northeast England and they'd be as close to you as anyone else. It would reduce barriers. It would make us see and understand and know and empathize with others. It would actually make us a better humanity. But that's not exactly what's happened, is it? And we all know that. What we are trying to identify is the ways in which harms are created. And there is an industry that we think is really negligent in the way that it is administering this awesome power that they've created. And I thank them for creating that power. And I'm really regretful that we are able so easily at CCDH to evidence how that power has been used for harm in so many ways. And so I do want to talk about the harms, but I wanted to begin by reminding people it's worth our effort to make these spaces better, because the promise they have is incredible.

Dr. Sarah Bren (00:10:33):

And they are not going anywhere. Yes, they have promise, but they also have such deep-seated utility in our world that the reality is we're dependent on them. I am a firm believer in a harm reduction versus an abstinence model in almost anything; we know that plays out profoundly in the mental health world. We need to first educate ourselves on how to be digitally sophisticated consumers of technology, and then with that knowledge we're armed more effectively to support our kids learning how to do that. I really do think we have to teach kids how to be safe online, and that's just scraping the surface. We need them to know how to discern and navigate, and not just by paying attention to what's happening around them, but by paying attention to what's happening internally. How is this making you feel? Trust your internal cues. If this doesn't feel comfortable, what can you do? How do you know that? We could talk a lot about that on this episode too, because I do think that's another piece of agency for parents. But from your end, you're navigating these true capital-H Harms, really serious things that are unacceptably bad. But then there are also these lowercase-h harms: wading too deep into a current that's just too fast for a kid, cyberbullying, the sort of subtle social erosion of connection and relationship and social skills. These are also harmful, but it's this subtle, pervasive thing. What are you seeing?

Imran Ahmed (00:12:25):

So to unpack the ways in which the harm is created, I want to be really cautious not to overwhelm people. I don't know if you were scared of the dark when you were a kid, or if any of your kids are scared of the dark. I was scared of the dark. And when you're a kid and you're scared of the dark, you put your duvet, your blanket, over your head and you hide because you don't want to see it. And the way that I remember my parents dealt with it was to slowly coax me to peek out from underneath there, and then help me to identify the noises I was hearing and the things that were scaring me, that I thought were monsters, and tell me it was the plumbing making that noise, or my dad snoring in the other room, whatever else. It wasn't something terrifying. They slowly helped me to understand what those things were. So let's try and do that a little bit.

(00:13:32):

There are three things that I think are really important to think about. One is the way that platforms are designed. Fundamentally, platforms are designed to be addictive. These are social media platforms whose only way of thinking about us, the users, is: how many minutes every day can I get them to watch? And based on the number of minutes that you watch, they can serve up a certain number of ads per minute. That's how they make their revenue: a tiny micro-transaction for each advertisement to each person, which accrues over time. The way that they've built these platforms is to maximize the amount of time that we spend on them, and they're incredibly good at doing it. The average teenager in America spends 4.7 hours a day on social media. That is an almost unprecedented shift in where our attention is placed as human beings.
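The attention-to-revenue arithmetic Imran describes can be sketched in a few lines. Only the 4.7 hours/day figure comes from the episode; the ad load and per-impression price below are purely hypothetical assumptions chosen for illustration:

```python
# Hedged sketch of the attention-based revenue model described above.
# Every constant except DAILY_MINUTES is an illustrative assumption,
# not any platform's real number.
DAILY_MINUTES = 4.7 * 60     # average US teen time on social media per day (cited in the episode)
ADS_PER_MINUTE = 1.5         # hypothetical ad load
REVENUE_PER_AD = 0.002       # hypothetical micro-transaction per impression, in dollars

# More minutes watched -> more ads served -> more revenue per user.
daily_revenue_per_user = DAILY_MINUTES * ADS_PER_MINUTE * REVENUE_PER_AD
annual_revenue_per_user = daily_revenue_per_user * 365
print(f"${daily_revenue_per_user:.3f}/day -> ${annual_revenue_per_user:.2f}/year per user")
```

Even at a fraction of a cent per ad, tiny amounts accrue into substantial per-user revenue over a year, which is why every extra minute of attention matters to the platform.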

(00:14:33):

So these platforms are, first, incredibly addictive. The second thing is that the content that they allow to be served to kids is a function of two things. One is what is the most addictive, and the most addictive content is the content that generates extreme emotional reactions. They've worked that out by studying us, because what they have is data on what kids click on. They're tracking how long you spend on a post, they're tracking how many times you like it, whether or not you comment on it. And they're adding all that information up into a user engagement score per post. Then they're saying, this post keeps people clicking and replying and watching, so they serve it to more people. And what they've realized is that the most engaging content, the content that predicts whether you're likely to stay on the platform and therefore maximizes the number of minutes you're there, is often the most harmful, because the things that we can't keep our eyes away from are usually things that anger us or make us scared, and so make us think, I've got to know more.
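The per-post scoring described here can be sketched as a tiny ranking function. The signal names and weights are hypothetical assumptions for illustration, not any platform's actual formula; the point is only that every tracked signal gets folded into one number, and the highest-scoring posts get amplified regardless of whether they are healthy for the viewer:

```python
# Hedged sketch of engagement-based ranking, with assumed weights.
from dataclasses import dataclass

@dataclass
class Post:
    seconds_watched: float  # total dwell time across viewers (tracked signal)
    likes: int              # tracked signal
    comments: int           # tracked signal

def engagement_score(post: Post) -> float:
    # Fold every tracked signal into a single per-post score.
    # Weights here are illustrative assumptions.
    return 0.01 * post.seconds_watched + 1.0 * post.likes + 2.0 * post.comments

posts = [
    Post(seconds_watched=1200, likes=10, comments=1),   # calm content
    Post(seconds_watched=5400, likes=90, comments=40),  # emotionally charged content
]

# Rank for amplification: the "most engaging" post is served to more
# people. Nothing in the score asks whether the content is harmful.
ranked = sorted(posts, key=engagement_score, reverse=True)
```

In this sketch the emotionally charged post scores far higher and wins the ranking, which mirrors the dynamic Imran describes: the optimization target is minutes of attention, not wellbeing.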

(00:15:50):

They generate those intense reactions. And then the final thing is that they're meant to have systems to make sure that content that is dangerous or harmful to kids is kept away from them. All sorts of content can be harmful to kids over time, like violence, for example, or pornography, but there is also genuinely, obviously bad content, like content about how to kill yourself, or content about why your body is disgusting and you should go on a 300-calorie-a-day diet, which no one should be sending to a 13-year-old girl or boy, to be frank. The safeguards that they've got in place to make sure that that content isn't amplified and monetized, and therefore encouraged, are not fully in place; those boundaries are actually very weak. So you've got these three things: the addiction business model, the algorithms that serve up the most emotionally affective content, and the lack of guardrails on what content is sent to kids.

(00:16:58):

And all three of those are lacking. The transparency isn't there, so we don't know how the systems work, and only organizations like CCDH, which have data scientists and experts in all sorts of fields, can give you the information that you need about what those harmful things are. And then there's a lack of enforcement. There's actually zero regulation of these platforms, which is kind of wild if you think about it. What other things do we give to our kids like that? If I said to you, hey, I'm going to give your kid this bar, and this bar has absolutely no regulation, and by the way, if this bar kills your kid, there's nothing that you can do to sue the manufacturer. Would you give that bar to your kids?

Dr. Sarah Bren (00:17:46):

No. Yeah, that's a very... like, no way.

Imran Ahmed (00:17:49):

Right? That's social media. You cannot sue those companies. I was in the Senate last week with parents who were explaining to senators that their kids had died as a result of things that happened on social media, and that something called Section 230 of the Communications Decency Act of 1996, a 30-year-old law, prevented them from taking action, because social media platforms have been given this really weird protection under US law where they can't be held liable for negligence. And that is kind of crazy to me. It is an injustice that needs to be fixed.

Dr. Sarah Bren (00:18:34):

And that's a big part of the work you guys have been doing, right? I mean, you're doing a lot of things, but one of them is promoting awareness of that and trying to create some advocacy around it, so we know it's a problem and can take some action to change it. I'm sure we could get into the advocacy work, which I think matters for parents. One thing we know is that when you feel really helpless and overwhelmed, feeling like you have agency is actually really important. And that agency can come in the form of self-education and helping educate your children on how to navigate these things, but it can also come at a much higher level, of trying to effect change in the systems.

Imran Ahmed (00:19:32):

I think that you've got to separate out what an individual parent can do. And I'm an individual parent too. I feel so empowered in my work. I'm the CEO of an organization with offices in London, in Brussels, in Washington, DC, in LA. I get to speak to the media, to podcasters, to senators, to celebrities, to a load of people about the work that we do. And then I go home to my 17-month-old daughter and I feel completely clueless and helpless. In fact, I very frequently feel ashamed of how little I understand her and her brain and her development. I always feel like an insufficiently good parent, and I'm sure that's a feeling a lot of people share as well. It's the one thing that all of us human beings share, no matter what color, race, anything else: parenting is a universal experience, and it's difficult. So I think about the two different parts of my life. In the part of my life where I feel empowered, advocacy is something that makes me feel like I'm changing things.

(00:20:48):

And so last week, or maybe two weeks ago, I was on the Hill in the Senate with Joseph Gordon-Levitt, who's an amazing guy and a parent himself, and who was talking to senators. We did a press conference with some parents who've lost their kids, urging senators to pass a bill that Lindsey Graham, a Republican from South Carolina, and Dick Durbin, a Democrat from Illinois, have introduced into the Senate, a bipartisan bill with multiple lawmakers signing on to it, Senator Klobuchar, Senator Hawley, and saying, look, let's get this done. Let's create a change in the liability for these companies so that if they do kill kids, if they know they're not doing the right thing, if they fail to put into place the correct processes to make sure they don't harm kids, we can hold them liable in the courts. You should be able to sue them if they kill your kid.

(00:21:44):

It's just that simple. You can sue someone because their coffee's too hot, which is why, when I moved to America, I was like, why do all the coffee cups say "this contains hot liquid"? If it didn't, I'd be really annoyed. But that's how America got consumer rights: by suing these companies. And I think you should be able to do that. So that's the kind of advocacy that we do. And to have effective accountability and to hold these companies responsible, you need to have transparency. So you need mandatory reporting guidelines. If you're a toy company or if you're a deli, you have to tell people what's in your ingredient list. You buy something in the store, a processed good, and it's got the ingredients, the calories, the fat, everything else. Do the same with social media; have some transparency rules. But then as a parent, you go home and you're trying to work out what to allow your kid to experience, and that's a completely different question, because until those systemic changes go into place, we still need to parent. And telling your kid, you can't have something, I don't think it's good for you, that's not necessarily the best way to get 'em to not do something.

Dr. Sarah Bren (00:23:01):

Maybe for your 17-month-old, but not for your 17-year-old. That certainly gets a lot harder.

Imran Ahmed (00:23:07):

My 17-month-old, she's got... I'm British, so we don't do that sort of thing, but she does that to me and I'm like, how do you already know how to do a Beyoncé "nah, nah" thing? It's wild.

Dr. Sarah Bren (00:23:27):

They still have, yes, they will still fight us. Their will is strong.

Imran Ahmed (00:23:30):

I've got a sassy 17-month-old. She's sassy. She's going to be a real, uh.

Dr. Sarah Bren (00:23:35):

It’s going to be a superpower for sure.

Imran Ahmed (00:23:36):

I'm really looking forward to it. But you wonder, how do you prevent them from being harmed? And I think the best way that we can do that is to try and navigate it together as parent and child. So one of the things that we say is that it's not the kids that need to have the duvet taken off their heads, it's us, the parents. So get informed, real fast. If you don't know any of the stuff that I talked about earlier on, if any of it was too fast or too complicated for you, we've got a very simple free guide on our website, protectingkidsonline.org, which you can download to understand a little bit more about what an algorithm is, how it works, and why these platforms don't enforce their content guidelines. It's expensive; they'd have to have human beings review the content and make sure it's safe for kids.

(00:24:30):

And then also some tips on how we create these mutual journeys of learning, where we are learning about what these kids are experiencing, the technology that they have from birth that we never ever saw as young people, and that we are learning together. And it’s kind of a symmetrical journey of us learning what they’re seeing, but then using our wisdom to contextualize it for them and help them have that richer level of understanding and resilience that we have as adults to enhance their ability to understand and navigate this world. And I think that’s where the magic happens, is in those symmetrical mutual journeys.

Dr. Sarah Bren (00:25:13):

That is so important and super aligned with how I think about this when I talk with parents. First of all, we'll link that guide in our show notes so people can find it easily, because it's a super valuable resource from people who really understand the backend of this stuff. But what you just said about that symmetrical process: this podcast is called Securely Attached. It's about the attachment relationship between a child and a parent, and how we leverage that as parents to orient our parenting strategies. When you partner with your child, you have so many more options and so much more impact than when, as we were sort of saying before, you partner against your child: no, you cannot have this thing, because I said so, or even, because it's not good for you. Instead, let's say we're talking about a 10-year-old who wants access to an app. Okay, let's sit down together and open it up and look at what it is. Let's read some things together about how people have navigated this app successfully and how it's been tricky, right? Look for resources that you can enter into together with your child. That partnership does a couple of things that are super important. One, it creates some safety around talking about this with your kid, because one of the things that happens when we just shut something down that our kid really wants is that they don't stop wanting it. They just go underground, and they don't include us in that process.

(00:27:05):

And as our kids get older and their access to things outside of our purview gets better, because their world gets bigger, that can become a really big gulf between what we are connected to them around and what they're alone with. Because what they want is not to go find harmful stuff; they want to talk to their friends, right, or play this new game. But when they stumble upon what we know is out there, and some real harm happens, or something almost really harmful happens, now they're thinking, okay, I'm alone in this, because I can't tell Mom and Dad that I've been doing this, but now what do I do? I have to sit with it. So that, I think, is a big risk of not going collaboratively towards this with our kids and saying, I'm learning with you, but let's create a sandbox that we play in together, not alone.

Imran Ahmed (00:28:12):

I just couldn't possibly agree with you more. And as you were talking, I was thinking back to being a child myself, and those times when you'd be in scary situations. When you're a kid, you first of all feel fearful of telling your parents, because you think it's my fault that I'm in this situation, and you then shut yourself off and you spiral, and you can actually then become dependent on these platforms. You're trying to fix it, but you don't know how to, and so you're engaging and engaging and engaging. And as I was saying that, I was just thinking, that's exactly what an algorithm does. An algorithm is an AI, basically. It's artificial intelligence; it's based on mathematics, but it's still an intelligence. It's asking, what will make this kid stay for longer? And as this kid spirals and spirals and spirals, that's what they're looking for. And they've worked it out not because they understand psychology, but because they can just measure the amount of time kids spend on the platforms. And so that's why that content is so valuable.

Dr. Sarah Bren (00:29:18):

They've overly reduced it to code, binary code, and then it doesn't have the capacity to say, but wait, what could this do to the human being that's interacting with me? That's where it's really lacking. And I think that's true for algorithms, and we see this in social media and the ways that this has caused deep fracturing. Obviously in the macro, big political world, we know we are super fractured, and I've heard you say social media algorithms really force us to the fringes of the argument on any issue, which creates a perception that, oh, everybody thinks this, but it's not true. But when you think about a kid's experience on a sort of smaller level, that same phenomenon is happening. If I'm a kid and I'm in a Snapchat with my friends at school and everybody is picking on one kid, it could be that most kids in this group do not want to hurt this kid.

(00:30:27):

But that kid's going to feel like everybody hates me, everybody. And then that kid's super alone, and they're going to go deeper and deeper and deeper into that aloneness. It creates these fractures, and these kids go into darker and darker places to try to grapple with those feelings of aloneness and the anger that comes with it, the fear that comes with it, the shame that comes with it. And there are a lot of places out there in our digital worlds that are designed to attract that loneliness, that fear. Those algorithms, right? They're pumping content. It can get very sticky very fast; it's quicksand. And kids don't navigate that well. If you're listening to this and noticing that familiar tightness in your chest, that mix of concern and overwhelm, I just want to pause for a moment and remind you, you don't have to navigate this alone.

(00:31:32):

Parenting in today's world is complex, between social media, AI, school pressures, and just the day-to-day demands of raising kids. It's easy to feel reactive, unsure, and stretched thin. At Upshur Bren Psychology Group, we work with parents who want to feel more grounded and confident, especially when the world feels fast-paced and high-stakes. Whether you are navigating anxiety, digital boundaries, power struggles, or simply trying to show up as the parent you want to be, we offer therapy and parent coaching designed to help you regulate your own nervous system, strengthen connection with your child, and make decisions from a place of clarity rather than fear. You don't have to have it all figured out before reaching out. If you'd like support, you can schedule a free 30-minute consultation by clicking the link in the episode description wherever you're streaming this podcast, or visit upshurbren.com to learn more about how we can help you and your family feel more secure and supported. That's U-P-S-H-U-R-B-R-E-N.com. Okay, now back to my conversation with Imran Ahmed.

Imran Ahmed (00:32:44):

When we think about all those facts and how difficult it is, I also think that you and I are fairly privileged parents. Even in America we're privileged, and that's the most privileged country on the planet. It's the wealthiest country the world's ever known, the most economically successful country, even if it's unequal, and it is very unequal, certainly compared to the country I grew up in. We have the time and the space and the resources and the education and the help to be able to navigate this with our kids, but a lot of parents don't. If you're working two jobs or three jobs because things are tough, it's not fair to ask someone to do that on their own. And it's profoundly disrespectful to the incredibly hard choices that some parents in America, and certainly a lot of parents around the world in less wealthy and less lucky countries than ours, have to endure. And that's why, in part, we've seen things like bans, which I don't agree with at an ideological level. I don't agree with bans on social media. I don't agree with age limits necessarily. But I think they're like an emergency brake you pull when something's gone terribly wrong.

(00:34:01):

And a lot of countries are introducing legislation. This is very trite, but I always say Europeans regulate, Americans litigate, but both things do the same thing. They create costs for harm, right? In America, I am a great student of the consumer rights revolution and how Americans got safer cars. Now, when I bought a car, because we had the kids coming, we could check every single aspect and rating of the safety of that car. There’s incredible transparency. We were able to find the safest one. Same with car seats, same with food, same with everything else that we buy for our kids. In Europe that’s often regulated; in America, it’s done through litigation, because companies do not want to expose themselves to the potential liability of being sued in a class action suit and sent into Chapter 11. So we are trying to create that accountability over time. But in the short term, some countries and states are acting. Florida, for example, has got an under-14 ban, Utah’s getting one, and there are a number of Republican states which are leading on this in America.

(00:35:20):

But other countries, like Australia, have got a kids’ ban on social media as well. They’re just saying, something’s gone terribly wrong, you need to fix your systems, and we’re not allowing you to carry on right now. And that’s why the emergency brake is being pulled by a lot of countries. And I really regret that we’ve gotten to that point. I feel like it’s a complete failure by both the platforms and the lawmakers that we got ourselves into a situation where the technology got out of control. Our lawmakers failed to do their damn jobs, which is to actually intervene and say, hey, hold on a sec, maybe you shouldn’t be behaving this way. And we are now in a position where Governor DeSantis, where governors all across the country, and where prime ministers and presidents of other countries are saying, pull the emergency brake.

Dr. Sarah Bren (00:36:10):

But I think what I’m hearing you say is yes, okay, you could pull the emergency brake but don’t confuse that as the solution.

Imran Ahmed (00:36:17):

Correct. Completely agree with you.

Dr. Sarah Bren (00:36:19):

That is an emergency response.

Imran Ahmed (00:36:22):

Exactly.

Dr. Sarah Bren (00:36:22):

It’s like you called the ambulance, but the ambulance has to actually get the EMTs out to do the support system. Someone’s got to give CPR and stop the bleeding and then heal the wounds. We have to actually fix the systems because whether we like it or not, banning social media does not mean that kids will not go on social media.

Imran Ahmed (00:36:50):

I think of it like with a car or a truck. If the brakes start failing everywhere, then they do a recall. Now, a recall is insanely expensive, it’s insanely disruptive, it hurts your brand really badly. But car companies do that all the time. They recall their cars and say, we’ve got to fix this. And then they fix the problem and they make sure it never happens again, because they don’t want to endure that cost of the recall again. And I think that’s what some of these bans are. They’re kind of a recall. They’re not saying, we don’t want to drive places. I live in Washington, D.C.; I need my car.

Dr. Sarah Bren (00:37:38):

Yes, that’s a very important element to this metaphor. I do fear that some people think the ban is the answer and once we’ve done it, good, we’ve done our job and we can just sit back again.

Imran Ahmed (00:37:55):

Let’s use the car analogy a little bit more. So I am a very careful driver, because I just am. I’ve never had a speeding ticket in my entire life. I’ve never had a parking ticket. I’m super, super cautious. My wife is from Oklahoma; she likes fast cars, she quite likes guns. For her birthday, I bought her, what’s it called, the Indy 500, the Speedway, NASCAR. I bought her a NASCAR experience so she can do laps in a NASCAR car on her own, and then with an actual driver who’s going to take her at full speed. So this is her thing. I actually want people to be able to enjoy their car either the way that I do, which is very cautiously, or the way my wife does, which is like a crazy person. And that’s the same with technology. Some people are really going to be into it.

(00:38:47):

They’re going to want to use it lots and lots and lots. I want people to be able to use it in the way that I do, in the way that parents and kids want to use it. But I’m telling you, that NASCAR, that thing is checked, it has standards, it has liability. And I want the same for social media. I want us to have great social media. I want us to have great internet technology. I want us to have great access to information, and to know that it’s being tested and checked, that there’s liability, and that they actually care about safety. I know I bought that experience for my wife in a car that’s going to go at an insane speed. I have never loved nor liked anyone as much as I love and like my wife, and the thought of anything happening to her is beyond awful. And I trust she will be safe, not because of faith, but because of regulation, of laws, of liability, and because we know that they’ve worked over time to create safe experiences for people. It’s got to be the same with social media.

Dr. Sarah Bren (00:39:59):

I agree. And to take your car metaphor a step further, I love metaphors. So the work that you do, the work that the Center for Countering Digital Hate is doing, the work that a lot of other really wonderful people and organizations are doing to make sure that the car companies, the social media companies, big tech, is actually creating safe vehicles, that’s important work. And the people listening to this podcast may want to participate in that, right? So do go to your website, we’ll put links, and go figure out what we can do to support the people who are fighting that good fight. But at home, at the end of the day, I think where we can feel agency too is, just like we trust that the car companies are making safe cars, we also want to make sure that our kid gets a driver’s license before they get behind the wheel.

Imran Ahmed (00:41:00):

Exactly.

Dr. Sarah Bren (00:41:01):

And practices using their seatbelt, and isn’t fiddling with the radio and their phone while they’re driving, and doesn’t have 15 kids in the car while they’re going somewhere. We want to make sure that we’re doing our part. And again, I don’t want to make this feel like if something bad happens, it’s on the parents. It’s not. But where can we feel some agency? Most parents want to support their kids’ healthy use of technology, so what are actual strategic ways to do that? And I can link a couple of episodes of this podcast in the show notes where we specifically talk about strategies for talking about social media with your kids, talking about tech with your kids, helping your kids be educated consumers of technology. Because this, I think, is where the family-level work can be: in giving kids tools.

Imran Ahmed (00:42:04):

Look, I think this is important. Transformative work has to happen throughout society, and we need to have a serious conversation about this. What really, really annoys me is listening to tech executives talk about how this is about free speech. I’m like, dude, telling my daughter that she’s too fat and that she should go on a 300-calorie-a-day diet is not free speech. It is cruising for a bruising. And if you do that to thousands of kids every day, you should be in jail. There’s not a single parent in America that disagrees with that statement. They’re telling you, actually, it’s constitutionally protected free speech. But people will be sitting there going, yeah, but are you kidding me? Surely there’s something you can do about these companies. And that’s why systemic transparency, and holding them accountable with the same laws that apply to everyone else, negligence law, is vital. Now, first of all, I think the idea of teaching your kids to drive and getting their driving license is a great idea. I mean, I don’t think it should be a literal license, that kids should not be allowed to use social media until the government gives them one. You go to the DMV, the worst experience of my entire life. Everyone says America doesn’t have a state religion; bureaucracy is your state religion. Oh my lord. Getting my driving license was the worst experience of my life in America.

Dr. Sarah Bren (00:43:23):

Right. But even if you didn’t have to go to the DMV to get it, do you think that you would’ve still wanted to take some lessons? Of course, to familiarize yourself with the rules of the road.

Imran Ahmed (00:43:34):

And who does the first lessons? It’s the parent. Definitely not mommy, because mommy is a scary driver. Daddy will do the lessons. But at home, when it comes to social media, it’s both of us. And so you put a learner’s permit on and you have that conversation about what safety looks like, and it starts real slow. You don’t put your kid straight behind the wheel and go, hey kid, go for it. Very slowly: this button here does this, this pedal does this. And you slowly take them through it and allow them to slowly engage with that world and benefit from your wisdom. And the truth is, cars are all different. I grew up with manual gearshift cars, and in America it’s going to be an automatic. So it may be slightly different for your kid than the way the technology was when you were using it, but you still have enough experience, enough resilience; your intellect is such that you’re flexible in the way that you think about things. You can think about the basic principles of safety, and what good information looks like, and what healthy usage of technology looks like, and apply that to a new situation. And that’s why I say it’s an exploration that’s bilateral. It’s a symmetrical experience, actually, because it is kind of new, what our kids are facing.

Dr. Sarah Bren (00:44:54):

Yes.

Imran Ahmed (00:44:55):

You can’t analogize it perfectly to what we did before, but you can bring your wisdom and your hard earned experience, those bruises, those scars.

Dr. Sarah Bren (00:45:06):

And your relationship. Exactly. That’s the thing. As much as big tech has built these really addictive algorithms that can really capture our kids’ attention, the one thing we have that’s stronger than that is our kids want our attention. And our attention on our children is potent, is powerful. They feel that. They want that just as much as they want that dopamine hit of the next YouTube video. So spending quality time with our kids where we’re approaching this together actually has deep weight and a lot of value.

Imran Ahmed (00:45:52):

And that’s where the limits of what we can do at CCDH end, and where what you do in this podcast and in your work comes in, which is helping parents to build up a trust bank with their kids across multiple areas of life. Because it isn’t just technology that kids need to trust you to help them navigate fully. It is driving a car. It is everything. It’s about the whole of life.

(00:46:21):

And I think really hard about that trust bank with my daughter and how to make sure I maintain it. I come from a country where we are not known for our emotional intelligence. We’re a little bit emotionally kind of distant, but also how to create that mutual vulnerability, the love, the trust, all that stuff. I will be watching your podcast to understand that, to help me to navigate spaces that I genuinely do understand as an expert, but still require that trust bank that’s built up across years and across so many different aspects of parenting to be able to fully exploit.

Dr. Sarah Bren (00:46:59):

Yeah. Yeah. And thank you. I think it’s an important distinction, because going back to the thing we were saying at the beginning, I don’t want this to feel so overwhelming that we get stuck. So when you feel overwhelmed and you feel stuck and you feel like this is so big, it’s important to just take a moment, zoom out, and then zoom in to: what can I do in my home today? Not to make the massive changes in the world that you’re working to make, but just in this moment, how do I build this relationship with my child so that they’re coming to me when things get stressful, confusing, scary, overwhelming?

Imran Ahmed (00:47:57):

I think we’ve both explained why we want to do this and talked a bit about the solutions. I think it is worth just very briefly reminding parents that I’m not talking about minor harms here, and the stuff that we study is not small. There is a genuine threat to kids. So you might be thinking, well, we’ve just explained what parents can do, so why are all these governments pulling the emergency brake? Let me give you two examples from two studies that we’ve done that should at least tell people that this is not a nice-to-have thing. This is a must-have. We all kind of need to do this with our kids, and need to support advocacy, and need to support, where necessary, more drastic action. So one study that we did is called Deadly by Design. It tells you in the name it’s not going to be great.

(00:48:57):

For this one, we set up accounts on TikTok in four different countries: the United States, the United Kingdom, Canada, and Australia. We told TikTok we were a 13-year-old girl, and that was it. Then we recorded the first half hour of what it served to us, and we had researchers go through and categorize that content. Within 2.6 minutes, those brand-new accounts of a 13-year-old girl were getting self-harm content. What do I mean by self-harm content? Content like a video of a razor blade with the words “I miss the touch of you on my skin,” set to music. Within eight minutes, they were getting eating disorder content. In that first half hour, on average every 39 seconds, our accounts were getting potentially harmful content. And we named half the accounts something like Susan, and half the accounts Susan lose weight, because there’s research to show that kids with mental health problems often put that into their bios or their usernames. The accounts that were called Susan lose weight, without any other indicator of vulnerability, received 12 times as much self-harm content in that first half hour as the standard accounts. So the algorithm could recognize vulnerability, and it would flood it with self-harm content.

Dr. Sarah Bren (00:50:22):

That’s chilling.

Imran Ahmed (00:50:23):

They know this now, and they’re still not doing sufficient work to fix it. And to me, that is criminal in its irresponsibility. Another report we did is called Fake Friend. Earlier in 2025, there was a big push around platforms like ChatGPT, which is made by OpenAI, which everyone’s heard of. It’s a generative AI platform. I want to be clear, we use ChatGPT internally. In my normal office when I’m at home, I have three screens. One is my browser, the second is my messaging, emails and stuff like that, and the third is just AI platforms. So I have Gemini, I have Anthropic, I have OpenAI, and I have a note-taker. I believe in those technologies. I think that they’re transformative for productivity. They are another sea change in the amount of great information available to me. But some kids who are using them, and we know this from past cases like Adam Raine, the kid in California who took his own life, are using them in ways that can be really destructive and harmful to themselves.

(00:51:41):

And we wanted to see whether or not these platforms would give dangerous advice to teenagers using them. So we set up accounts on ChatGPT and created three profiles of kids: one with an eating disorder, one with mental health problems and suicidal ideation, and one with substance abuse problems. And we found that ChatGPT was willing to advise our 13-year-old kids how to “safely” cut themselves within two minutes. Within 40 minutes, it was listing pills that they could take at home for an overdose, and quantities, and it generated a full suicide plan and goodbye letters after 65 minutes. For the eating disorder one, it was willing to tell a 13-year-old how to hide eating habits from their family, suggesting appetite-suppressing medications. I don’t mean GLP-1s like Ozempic; I mean other, dangerous things that they could take to suppress their appetite, and a 13-year-old shouldn’t be taking Ozempic either. It was also creating dangerous diet plans. And then for the kid with substance abuse problems, within two minutes it was telling them how they could get as drunk as possible, telling them how to mix drugs. And I don’t mean acetaminophen or Tylenol; I’m talking about ketamine, hallucinogens, cocaine, hard drugs. A 13-year-old shouldn’t be taking those drugs. And within 40 minutes it was explaining how to hide intoxication at school. So these platforms should know that they could be used in this way, and kids do use them this way.

Dr. Sarah Bren (00:53:26):

They know. They do know.

Imran Ahmed (00:53:26):

But they haven’t put the guardrails in place. And there’s a bit of me, I’m in New York at the moment, and I used to work at Merrill Lynch, which is an investment bank up the road from where I am right now. I know that these companies are ultimately designed to create shareholder wealth. It costs money to put safety in place. But I actually think, when people hear statistics like this, and I know because our research drives legislation and lawmaker action all over the world, including in the United States, it is so counterproductive to release these things without spending the money to put the safety in place. And these guys are mega-billionaires. They could spend the same amount of money that they spend on, I don’t know, private jets every year on safety, and they would have platforms that wouldn’t trigger regulatory reactions. So there’s a tiny bit of me that’s like, why aren’t you thinking about long-term business success? It doesn’t make sense at a business level, let alone at a human level.

Dr. Sarah Bren (00:54:43):

Right? Well, it does if you think you will never be held accountable.

Imran Ahmed (00:54:48):

Yes.

Dr. Sarah Bren (00:54:49):

And that is, I think, hopefully the sea change that you are trying to effect: that the accountability will eventually catch up to them and will hurt the wallet enough that it will inform the decision making. Because ultimately, that’s what talks.

Imran Ahmed (00:55:06):

And that’s why so many billionaires don’t like me.

Dr. Sarah Bren (00:55:11):

Yes. Well, I as a mother do. Thank you. So keep going.

Imran Ahmed (00:55:18):

Thank you.

Dr. Sarah Bren (00:55:19):

Thank you so much.

Imran Ahmed (00:55:19):

It’s so important, and thank you so much for giving me the opportunity to talk about this with your parents. And I hope that people don’t walk away too scared, because actually I think that the answers are so straightforward. It is about trust, it’s about symmetrical conversations, and it’s about supporting advocacy and supporting lawmakers who are doing the right thing. And there are lots of people who want to do the right thing, but they should hear from you. Tell them you’re a parent, and their job is to have your back, not a billionaire’s back. Your back, my back, all of our backs.

Dr. Sarah Bren (00:55:53):

And we do, we vote with our pocketbook. There’s a big movement that I just saw recently. I forget the name of the guy who’s doing it. It’s like Unsubscribe.

Imran Ahmed (00:56:07):

Scott Galloway. Prof G, yeah.

Dr. Sarah Bren (00:56:09):

Yes.

Imran Ahmed (00:56:10):

Yeah, yeah.

Dr. Sarah Bren (00:56:12):

He’s interesting. And he’s trying to disrupt the consumer economy by getting everyone to unsubscribe from all of their recurring subscriptions as a protest to just shake things up and actually impact the pocketbooks of the big companies.

Imran Ahmed (00:56:30):

My favorite kind of protest is the one where I end up with more money at the end of the year.

Dr. Sarah Bren (00:56:35):

That’s right.

Imran Ahmed (00:56:37):

This is a pretty easy protest to me.

Dr. Sarah Bren (00:56:38):

I know. But literally, my thought process when I saw it: first I was like, unsubscribe from what, exactly? So I dug a little deeper, and he’s like, from anything, right? It’s really the mass act, the domino effect, of everyone unsubscribing from everything, but especially the big ones: Apple, ChatGPT, Amazon Prime. And as a mom, Amazon Prime, I have joked before, I’ve made this joke before, but I think it’s true, that postpartum moms probably single-handedly put Bezos on the moon, because nobody buys more on Amazon than a postpartum mom who’s nursing in the middle of the night, just shopping for the stuff that they need, the diapers. Oh, I forgot about this. Oh, I’ve got to get that for school. It’s like a lifesaver for mothers. And it would be so hard to get rid of it, but I see mothers doing it. I see parents doing it, saying, I’m going to stop. And that would be impactful. So, just a thought.

Imran Ahmed (00:57:55):

Look, I think everything that we can do, we should do, because this to me is the consumer rights battle of this century: how do we negotiate a less toxic relationship with Silicon Valley, which has become increasingly indifferent to the harm that it causes. Silicon Valley in the 20th century yielded some of the greatest advances in American technology and in our economy; they were responsible for the miraculous growth in American wealth and economic success. In the 21st century, that relationship has become extractive, where they are taking from us and they’re not sharing, where they are increasingly exploiting rather than enriching our lives. And I think that we need to renegotiate our relationship with them. And so I think it will take a lot of tools. It will take lawmakers, it will take individuals using their advocacy, their voices, but also their pocketbooks as well.

(00:58:55):

And things like saying, I will not buy products from people who behave in this way, because this is the single biggest consumer rights battle of this century. My office is actually in Dupont Circle in Washington, D.C., which is where Ralph Nader, who some people may know, the consumer rights champion of the 1970s and eighties in America, pushed for automobile safety standards and all sorts of things. I think this is going to be as profound and important a battle as that was. And we now have a much healthier relationship with a lot of different American industries. I want the same to be true about technology as well.

Dr. Sarah Bren (00:59:35):

And one final thought. I know my podcast producer is like, stop talking, but this is so important. There are a few core things that I’m thinking of, just a few action items a parent could do that I think would be helpful. One, we’ve talked about doing with, right? And in order to do with, bring back the family computer. Put it in the living room, put it in the kitchen, have this shared space. That’s what we had when we were growing up, and it had a function. It wasn’t an alternative to being alone in your room scrolling all night, because we didn’t have that. But we can go back to some of these older ways, where technology was still embraced and enjoyed and included in our lives, but it just wasn’t privatized to the little bedrooms and the bathrooms. It’s like, no, we do this here, together.

(01:00:31):

We make it public, make it more a part of family life, and then educate ourselves and our kids about big tech. I will often say one of the best ways to get out of power struggles with your kids around tech use is to not have it be you versus them, but you and me versus the problem. And that means helping our kids actually become aware of who wants you on your phone, who is benefiting from you being on your phone. And not in a moment where they want to be on their phone; that’s not the time to have this conversation. But in a non-tech-focused moment, when we’re going for a walk or we’re driving in the car, some calm, connected moment, helping them think reflectively: do I want to spend this much time on this? Is that my first choice? Or is it that I went on to do one thing and completely lost track of time? Helping them become more aware of what they want and what happens, and allowing them to have feelings about the control that a big company wants to have over them, because kids hate to be controlled. So helping them understand who’s trying to control them, how do they feel about that, do they want to be controlled in that way, and what would they maybe do differently to prevent that control from happening, gives kids agency that doesn’t have to be pitted against you. And I think that’s a very effective strategy that can be used in a lot of different ways. So those are just some things that I’ll leave you with. And also maybe quit your subscriptions, which I know is hard, but…

Imran Ahmed (01:02:16):

I’ve been taking notes throughout that entire last bit. So thank you.

Dr. Sarah Bren (01:02:22):

This is where I can help. I feel like the politics and the advocacy is bigger than where I can step in, but when I can give parents information that’s rooted in child development, parental mental health, child mental health, and attachment safety, that’s where I can help.

Imran Ahmed (01:02:42):

And that’s the thing with parents: you have to give yourself a break sometimes. And I’m saying this because it is axiomatically true. Every single parent who’s listening to us talk right now is already doing the right thing, because they’re trying to get good information for themselves. And that’s what we all want: good information. And that’s a collective battle that we’re going to have as a society, as individuals, as parents, and as a community of people who are talking to each other. So thank you for playing such an enormous role in that, and I really enjoyed being here today.

Dr. Sarah Bren (01:03:13):

This was a wonderful conversation and I really appreciate you taking the time to come on here. Thank you. If you enjoyed listening to this conversation, I want to hear from you, share your thoughts and your feedback with me by scrolling down to the ratings and review section on your Apple Podcasts app or whatever app you’re listening on. And let me know what you think of this episode or the show in general. Your support means the absolute world to me, and just a simple tap of five stars can make a real impact in how this show gets reached by parents everywhere. So thank you so much for listening and don’t be a stranger.



