Earlier this summer, the UK’s Daily Mail newspaper reported that the first word of a toddler in the UK was Alexa — the name of the Amazon Echo smart speaker. Apparently, the one-year-old had been closely observing his parents’ commands to the Echo, and at some point, he caught on that he too could get the same reaction from her.
A report released last fall by Common Sense Media found that American children under the age of 8 spend an average of 2 hours and 19 minutes per day staring at a screen of some sort. Much of that time is spent watching videos on YouTube, served up by an algorithm that parents have come to trust. But YouTube’s algorithm is optimised for watch time rather than what is good for the child. This has resulted in the production of some strange and often disturbing copycats of our children’s favourite cartoons. There have been several stories in the last year detailing the dark places YouTube’s recommendations have taken children to.
In this episode, we’re going to discuss trust and the Internet. When it comes to technology being used at home and in the classroom, who — or what — are we entrusting our children to, and what is the role of parents and teachers in guiding children through this new age?
Quotes from the episode
“The VCR, which is actually kind of pre-computers, was the first programmable object that most people had in their homes…Most people threw up their hands at that point, were like, I don’t want anything to do with this devilry. You know, they’d be one kid in the family to program it for all the adults.” — Author James Bridle
“You see a lot of like mums and dads sharing these posts and you’ve got all the kids going ‘Oh mum, like what you’re doing, you know, that’s not true’. But they just seen an important looking Facebook page that’s posted it.” — Student Kate Stevens
“They (young children) look at technology and have no boundaries; they have no preconceived notions about what the technology should and shouldn’t be able to do. So I continue to be inspired by their no-bounds thinking about what technology should be able to do for us.” — Emily Lai, Pearson’s VP of Impact Evaluation
“We need to say to our kids…show me your world. This is part of your world. And it’s a part of the neighborhood that I haven’t been to yet, so just like if our kids were hanging around in a neighborhood that we didn’t know, we would say, hey, let’s go walk around the neighborhood together, so I feel comfortable that you’re over there.” — Jill Hodges, Fire Tech Camp
Rajni Sood Laurent, mother from Washington D.C.
James Bridle, writer, artist and technology thought leader
Kate Stevens, Student
Jill Hodges, Entrepreneur and founder of Fire Tech Camp
Emily Lai, Pearson’s Vice President of Impact Evaluation
This week’s unsung hero in STEM is cosmetic scientist Florence Adeoju, founder of awesome lipstick brand MDMFlow.
James Bridle’s original Peppa Pig blog post
James Bridle’s Peppa Pig Guardian Article
Common Sense Media’s Children’s Media Use in America report
Wired article on YouTube’s disturbing bootleg content
Stanford’s CS+X initiative
James Bridle’s New Dark Age book on Amazon
HOST: If you’re a parent — or anyone who spends a lot of time with a preschool-aged child — then chances are you’ve heard of Peppa Pig.
Peppa is the star of a popular British TV show that airs in 180 territories around the world. She is a female pig that lives with her pig family — her parents and her brother George — and has a diverse set of mammalian friends. The show follows them as they perform normal everyday activities like seeing a doctor, going swimming, or riding their bikes. It’s pretty tame.
Children obsess over it, and parents generally accept it as safe, trusted entertainment for their kids.
TAPE MAURYCY: Mom! I want to watch Peppa!
HOST: That’s Maurycy Sood-Laurent. He’s two and a half years old, and he loves Peppa.
FATHER: Why do you love Peppa the pig?
MAURYCY: Because I LOVE it.
HOST: Maurycy lives with his parents, Rajni and Daniel, and his older sister, Ani, who’s 7, in Washington D.C. And media has always played a big part of the family’s lives.
RAJNI SOOD LAURENT: Pretty much the first thing both of my children ask for when they wake up in the morning, sadly, has become, “Mom, can we watch TV?” If it’s a school day, I can usually divert their attention and get them to focus on something else, but it’s a little bit harder with the two and a half year old. He likes to get a little more feisty at his age — the terrible twos — and push for us to let him watch TV before he gets to school.
HOST: That’s Rajni Sood Laurent, Maurycy’s mother. She used to work in international development, and has now cut back her hours so that she can focus on her kids.
RAJNI SOOD LAURENT: We started out with him watching things really on the computer, through YouTube — little videos, nursery rhymes, things that were probably in our minds appropriate for kids his age, to entertain, kind of engage them a little bit. Right now, Peppa Pig is his top favorite. That has been going on for months on end. We have heard Peppa Pig — my ears started to hurt a little bit from hearing Peppa, as cute as it is.
HOST: For the most part, Peppa Pig is harmless, age-appropriate entertainment that has light-hearted lessons for its young audience.
RAJNI SOOD LAURENT: Both of them have picked up phrases from Peppa Pig. You know, in the way that they say those phrases — because Peppa comes from England, they will say things with a British accent.
HOST: But there’s just one problem — and it’s a big one.
Just like adults, kids tend to watch Peppa — and other content — on the Internet, primarily on YouTube. And because Peppa episodes are pretty short, at five minutes each, the viewer has to click on another link every five minutes to keep watching.
Now, in theory, YouTube makes this easy by suggesting related content. But not all of that content is what it seems. Sood Laurent explains what happens when her kids click on some of these suggested links.
RAJNI SOOD LAURENT: They take you to different places and, you know, in YouTube they click on some other remake that’s totally inappropriate…
HOST: Here’s the difference between an actual Peppa Pig video and a fake one: in one popular episode, Peppa learns about dental hygiene. The episode shows Peppa and her brother brushing their teeth and visiting a dentist, a kind, respectful elephant. But in an unofficial version, Peppa’s visit to the dentist starts with her being tortured in the dental chair, before she turns into various Iron Man robots and starts dancing. It’s something out of a fever dream — and, for a young child, the stuff nightmares are made of.
RAJNI SOOD LAURENT: I like to think that they’re still young enough that they really don’t quite understand what it is they’re watching, but it’s kind of scary because it’s still entertaining enough for them to keep watching it, and to watch it over and over again.
HOST: What’s happening is that people are taking advantage of children’s attention — and parents’ trust in brands like Peppa — to create their own knock-off content, without any of the thoughtfulness, intentionality, and child-development expertise that went into the real programme.
Writer, artist, and general technology thought leader James Bridle has written extensively about Peppa Pig, and what it means for the internet. His blog post on the fake Peppa videos on YouTube, titled “Something is wrong on the internet,” has been viewed millions of times, quoted in newspapers around the world, and has even led to official questioning in the European Parliament.
JAMES BRIDLE: Millions and millions of children — and often very small children — watch lots of YouTube; there’s very small kids watching hours and hours and hours of this stuff. That’s obviously a huge revenue driver because of ads on all of those videos, and it’s YouTube, so people upload videos trying to get kids to watch them, and then they get money from Google. That’s led to some very odd effects, because kids aren’t very discerning and because Google’s algorithm decides what those kids see through autoplays and recommendations, people game that system. So they make videos that are quite like other videos. You know, if one video is popular, people make lots of videos like that. If there’s a particularly popular cartoon character or Disney movies or whatever it is, there’s a lot of knockoffs, so you get this kind of huge sea of barely differentiated content.
HOST: YouTube’s recommendation system doesn’t differentiate between the real videos and the knock-offs. And that’s a big part of the problem: in addition to the content being inappropriate and, sometimes, downright scary for children, both the legitimate videos and the knock-offs look alike to YouTube’s opaque algorithms.
But it’s not just the machines that are fooled. It can be just as difficult for humans — whether parents or children — to tell the real videos from the fakes, at least at a glance. And busy parents especially, who may or may not be watching with their children, end up putting their trust in this “babysitting algorithm.” And because these algorithms decide what to recommend, they’re deciding both what we watch — and what gets created.
JAMES BRIDLE: You get videos that appear to be entirely generated by software; they’re just mashing together different bits of video and kind of uploading them to YouTube automatically. Or you get people acting out these videos that no longer really make any sense, because the title is just a mishmash of popular video titles, and they try to cram all this stuff into one video and it doesn’t actually make any sense to humans at all. But for me, what it really gets down to is systems that operate at this scale. No one ever considered that this would be the outcome of making a user-submitted video channel. This was not what anyone planned. The impossibility of knowing who’s making all of these things or where they’re coming from — they’ve essentially become kind of ruled by computation, by the algorithm.
HOST: Bridle has written about all of this in a new book titled New Dark Age. What Bridle argues for in his book — and his writings on Peppa Pig — is that all of us should think deeply about what exactly it is that we are trusting when we accept the daily convenience of technology in our lives.
In this episode, we’re going to discuss trust and the Internet. When it comes to technology being used at home and in the classroom, who — or what — are we entrusting our children to, and what is the role of parents and teachers in guiding children through this new age?
This is Nevertheless, a podcast about learning in the modern age. Each episode we shine a light on an issue impacting education and speak to the women creating transformative change. Supported by Pearson and hosted by me, Leigh Alexander.
Earlier this summer, the UK’s Daily Mail newspaper reported that the first word of a toddler in the UK was Alexa — the name of the Amazon Echo smart speaker. Apparently, the one-year-old had been closely observing his parents’ commands to the Echo, and at some point, he caught on that he too could get the same reaction from her. This story is just one of many about how much modern families have integrated technology into daily life. But there’s still a lot of confusion — and, among experts, debate — about the impact of this integration.
The American Academy of Pediatrics is one of the most trusted voices on children and tech use, especially screen time, and its official position is that children under the age of two should have their screen time limited — and always in the presence of a parent. But this advice is hard to follow. A report released last fall by Common Sense Media found that American children under the age of 8 spend an average of 2 hours and 19 minutes per day staring at a screen of some sort. Nearly all families surveyed — 98% — have TVs at home, and the same number have mobile phones, and they use them. 42% of parents said that they had their TVs going “always” or “most of the time”, and that’s a problem because parents engage less with their kids when the TV is on. But…how did we get here, to this point where we’re entrusting so much of our families’ time and attention to devices? And, was it inevitable?
For James Bridle, the answer may lie decades in the past.
JAMES BRIDLE: The VCR, which is actually kind of pre-computers, was the first programmable object that most people had in their homes.
HOST: Bridle sees the VCR as a turning point in how we use and think about technology at home. Because it’s not just how much time we spend using technology — by staring at a TV screen, for example — but how we think about it that has led us to where we are today.
And VCRs — videocassette recorders, in case anyone needs a refresher — gave users a deeper level of agency: they could choose what to watch, and when to watch it, which was a whole different way of thinking about tech. And the result of that agency?
JAMES BRIDLE: Most people threw up their hands at that point, were like, I don’t want anything to do with this devilry. You know, they’d be one kid in the family to program it for kind of everyone and all the adults, and most of the other kids would be like, NO, it’s ridiculous, it’s complicated. It was never that complicated, but people were afraid of it, and so there’s a huge element of fear as well. Kind of fear of looking stupid, a fear of not being able to do it. All of these reasons, I think, come to bear on why technology has kind of a concentrated power, and it’s hard to access, hard to understand.
HOST: In other words, from the very earliest days of programming, or “pre-programming”, as Bridle called the VCR, tech users were self-selecting into two groups: those who understood the machines, and those who didn’t. And, according to Bridle, it was often the adults, the sources of authority, who chose to step back. And that trend of stepping back has influenced how technology has continued to be designed: to maximise simplicity and ease of use. That’s true whether it’s a website, an app, our smartphones — or media programmes aimed at kids.
And that brings us back to these fake Peppa Pig videos on YouTube. After a user has watched a legitimate video, fake videos that have learned to game the algorithmic recommendation system pop up, and most of the time, they look exactly like legitimate videos. So what does the young Peppa fan do if he or she wants to continue watching?
JAMES BRIDLE: What YouTube does is training kids to click on the very first thing that comes along, regardless of the source, with the kind of inability to think critically about where the information might be coming from. But just treating it all as a kind of undifferentiated source, training us in the absolute worst behavior, the least kind of cognitive effort, when cognitive effort is precisely what we need as adults to navigate in a complex technological world.
HOST: What Bridle argues is that at the very moment when we all need to learn to be better critical thinkers, the algorithms of platforms like YouTube are conditioning impressionable young children to instead just click on whatever is easiest. Understandably, when Bridle first wrote this blog post, it struck a chord. He was, after all, highlighting a nightmarish scenario for parents and educators alike…but ultimately, the issue wasn’t just about malicious actors on the Internet targeting children, as bad as that is.
JAMES BRIDLE: What’s happening on Youtube is this fairly dark mirror to the way in which a lot of technological processes are being developed everywhere.
HOST: It reveals a much bigger problem: algorithms pushing content and influencing what we see and how we think, entirely behind the scenes, making it hard for us even to figure out what’s happening. And when we are targeted by “fake news,” or any content that preys on (or even creates) insecurities, it erodes social trust. And if there is nothing that is inherently trustworthy, then that also means that anything is possible, and not in a good way.
Many of us believed that the internet’s ability to surface more information would help us make better decisions. But the sheer amount of information out there, available at the click of a button, is creating not just information paralysis; it’s affecting the very way that we trust, or in this case distrust, our own ability to make decisions. It’s also changing how we think about knowledge itself.
Bridle calls this “computational thinking.”
JAMES BRIDLE: It’s just reinforcing the idea that knowledge is something that you can sort of acquire through accumulating information rather than by thinking critically oneself. The idea that if you just put the right search terms, keywords, into the machine, it’ll give you an answer. It’s the idea, carried out to its fullest extent, that computer tools are giving us the answers to things, rather than tools for asking questions with…
HOST: The challenge with analysing the effect of technology, though, is that it’s hard to know exactly what its impact is, because it’s all so new, and because every generation is being introduced to it in a different way.
So a two-year-old growing up with Peppa Pig videos is, ultimately, going to be affected differently from an adult who chose not to learn how to program his VCR, or from someone who grew up amidst the transition to our digital world…like twenty-year-old college student Kate Stevens.
KATE STEVENS: I remember my mom wouldn’t let me have Facebook till I was 14, but then once I got Facebook, about six years ago now, I was all over everything. I got Tumblr from like age 14, I got really into Tumblr. Twitter. I’ve kind of been using pretty much all the big ones for the past six years; they’ve been such a huge part of my life for a really long time now.
HOST: Stevens didn’t always appreciate her “late” start on social media, but over time, she has come to.
KATE STEVENS: When I was younger I thought it was really unfair and I really felt like I was missing out. But now I kind of, you see celebrities getting in so much trouble nowadays because of things they posted on social media when they were younger and things like that. And so I’m just really grateful that I kind of had the chance to, I guess, become more of a person and understand a bit more about the world before I was just throwing my opinions out there and like talking about whatever topic I wanted to I think.
People my age particularly are very aware of how social media can really impact you, because we’ve seen YouTubers who we watched when we were 14. We’ve watched their downfall from stuff they’ve said coming back to bite them in the ass.
HOST: So here’s another lesson about trust that Stevens and her peers have learned: sometimes, kids have to be protected…from themselves, and their own lack of judgement. But when they are, they may become naturally more literate — and more critical — users of information shared online.
KATE STEVENS: Those generations older than us, who were so used to traditional media, who um, will like hear a story on the radio or on TV and completely trust it because that’s their source of the news — I feel like they can sometimes take the same approach when they see something on Facebook or Twitter. They’ll be pretty quick to just share the article or share the post and…not to generalise, but you see a lot of like mums and dads sharing these posts and you’ve got all the kids going “Oh mum, like what you’re doing, you know, that’s not true”. But they just seen an important looking Facebook page that’s posted it. And so they don’t really, they kind of like take it as news. So I feel like there definitely is a difference between generations who are more used to traditional media and those more used to news spread online.
HOST: And for the most part, Stevens says that she and her peers learned this lesson not from what was taught in school, but just through being on the platforms.
KATE STEVENS: I guess you don’t learn as much about false information, apart from when we were writing our homework, our teachers would always tell us not to look on Wikipedia because anyone can edit that. And so I think we did learn a lot at school, when, you know, doing homework and writing essays, to always take it from reliable sources.
When you’re online as much as people from my generation are, when you know you’re constantly refreshing Twitter and you’re hearing what your friends have to say about it all the time, you’re seeing the comments and seeing what they have to say. I think you get really used to judging whether what you’re reading is kind of more reliable. Or, you know, if someone’s shared an article and all your mates are like, that’s just not true, what are you talking about — then you kind of learn, like, oh, okay, that’s probably not true, I kind of need to look a bit more into this before I start shouting my opinions about something.
HOST: That type of education Kate Stevens describes as missing is often called “media literacy.” Educators and parents are increasingly arguing that schools should be teaching it, and at earlier ages. But it’s not the only way in which educators are trying to incorporate more technology into kids’ learning.
Entrepreneur Jill Hodges is not a traditional educator, but she is trying to change how kids learn about technology. In her view, too much of the current focus is on teaching kids to code; maybe because she is not a programmer herself, she thinks that focus is too narrow.
JILL HODGES: My sister was one of the first PhDs in computer science out of Georgia Tech, and my dad was an engineer, so I came from a techie family. We were always early adopters on these things, so I came at it with a lot of enthusiasm, and as my own kids sort of reached the age of being interested in this, I realised that they didn’t have the same kind of access that we did; all the things we use have become very consumer focused.
HOST: It was this observation that led Hodges to start Fire Tech Camp. This programme provides technology education in the UK for kids aged 9–17, through holiday camps, after school programmes and weekend workshops.
JILL HODGES: We have this whole generation who just consumes stuff and doesn’t understand where it came from. My view is that the best way to get kids to use tech constructively and to have a constructive relationship with tech is to educate them about tech. It’s really about computational thinking. So it’s digital media, it’s robotics, it’s making. It’s sort of the evolution of craft and arts and how all those things come together, and it’s about creativity and it’s about really empowering kids with these tools. So that this is not a black box and they can say, you know, I’ve got a great idea, I can turn it into a film and I can start a YouTube channel. If we can get kids engaged with that and constructive with that, then they naturally become more critical consumers, right? Because they can say, ok, well when I did that, this is all the stuff that we had to have and I don’t really see where this other source is getting all that kind of stuff. So I think having kids be engaged in technology is really important.
HOST: Hodges’ main premise is that this type of tech education is neither just about teaching kids to code nor geared only towards budding computer scientists.
JILL HODGES: Technology is not a vertical, technology is a horizontal. No matter what career you have, the people who understand technology are going to have a better set of opportunities than people who can’t understand the technology. Not because technology is better, but because you’re limited if you don’t understand.
HOST: Hodges looks at Stanford University’s approach to teaching technology as a model for how to better integrate it across sectors.
JILL HODGES: They offer degrees which are called CS+X, where, no matter what you study, they encourage you to get a concentration in computer science. Again, it’s not just about the coding, but about sort of, ‘How do you pull these things together?’ If all you can do is code, then you’re not able to have that creativity either. You know, you have to be able to think of those stories. You have to be able to think of that content. You have to be able to understand those problems, think about how to solve those problems and then bring this tool set. Again, it’s horizontal, it goes to whatever kinds of work you’re trying to do.
HOST: Of course, teaching students to use technology more creatively is only one way that tech is being integrated into their lives. Increasingly, technology is being used to deliver the lessons themselves — and it can be especially useful in teaching “21st century skills.” Emily Lai, Pearson’s Vice President of Impact Evaluation, explains:
EMILY LAI: Some call them non-cognitive skills, some call them 21st century skills, but skills like critical thinking. Information literacy is a composite set of multiple sub skills, so it’s the ability to recognise, first of all when information is needed and to be able to sort of articulate specific questions that you’re trying to answer or that you’re trying to solve and then being able to identify the type of information that’s needed. And then once you’ve found the information how to evaluate the relevance of the information to the question you’re trying to answer and the credibility of the information that you’ve found. And then the ability to take a step back and use the information that you’ve gathered in an effective and an ethical way.
HOST: So how does Emily recommend parents and teachers help young children and students develop these skills, whether that’s in the classroom or at home?
EMILY LAI: I think maybe sometimes there’s an expectation that kids will pick things up through osmosis or something, and they don’t. Like learning anything, when you break it down into small bites and you make it explicit and you model it, talking through what you’re doing and why you’re doing it as you go, children can begin to observe that and sort of understand that line of thinking. When you pose questions to students that force them to slow down, like, how do you know that? Those types of really open-ended questions force kids to slow down and think about the warrants. In order to make claims and statements about things that are trustable, people need to offer evidence in support of those claims; that’s part of building and constructing an argument.
HOST: And sometimes the opportunities to develop information literacy skills are right before our eyes.
EMILY LAI: There are a hundred occasions in a day where there is an opportunity to resolve a debate or answer a question by consulting a source on the internet. Who won the Super Bowl in, you know, 1987? There’s just a million opportunities where you could jump on your phone or jump on your device to answer a specific targeted question. And some of them are not as closed-ended as that question about the Super Bowl; some of them are more open-ended, like should we, as a country, be doing X or Y? And if you jump on those opportunities at home and say, let’s do this, let’s do it together. And you sit side by side with your child, you walk through step by step what you’re doing and why you’re doing it. And you’re kind of thinking out loud, like, well, okay, I’m seeing this, I don’t think I’m going to click on this first thing because, look, it kind of…doesn’t really sound like a very credible source, I don’t know who this person is… Externalising that thought process, you know, making that visible to kids; they will pick up on that, and with repeated exposure to it, over time I think it will start to sink in.
HOST: But as Emily points out, as adults we can be guilty of underestimating the critical thinking skills of young children.
EMILY LAI: There’s been research looking at very young children as developing the ability to question the motives of adults, which is a very sophisticated thing for a young child, especially younger than the age of five, to begin to do. And so there is evidence that they’re capable of the kind of thought, the skepticism, sort of a theory of mind, that you would need in order to question whether every piece of information that’s being given to them is trustworthy. They look at technology, and they have no boundaries; they have no preconceived notions about what the technology should and shouldn’t be able to do. So I continue to be inspired by their no-bounds thinking about what technology should be able to do for us.
HOST: But there are a few common themes that seem to come up, including this: a lot of our attitudes towards children’s tech use are driven by adult fears. For Jill Hodges of Fire Tech Camp, it’s about…
JILL HODGES: Fear on the part of families. It comes from fear on the part of schools. And it comes from this, you know, especially as adults and people who are in charge being less facile with the technology than the kids are, so there’s this feeling that the kids can sort of run away and get into trouble…in ways that we don’t quite know how to protect them from.
HOST: While James Bridle adds,
JAMES BRIDLE: There’s just like, not being able to do it and looking stupid, but that becomes a kind of anti-intellectualism and also a feeling that it’s not your place…That this belongs to some specialized geek class…and I speak as someone who is of that geek class, but we definitely left these questions in the hands of that class for far too long.
HOST: And for Washington D.C.-based mother Rajni Sood Laurent, the main concern is her children.
RAJNI SOOD LAURENT: I am worried, honestly, that they’re spending…too much time… watching things, and not spending time doing other things. I hear a lot from my daughter, especially…saying she’s bored, you know, she doesn’t know what to do when she’s not in school, and she’s not with friends. “What do I do? How do I keep myself entertained?” And the default is, well, let just watch TV. Let me watch something…so the scare factor for me comes in both sides: it’s watching too much and not knowing how to engage in other ways and keep yourself occupied in other ways — but it’s also the quality of what they’re watching.
HOST: Everyone has these very legitimate fears, so the next question is, what do we do with them? Jill Hodges has a practical answer:
JILL HODGES: We shouldn’t dismiss those fears. I think we should absolutely address those fears, because there are some risks associated with it, but I think to me there’s actually a huge risk that in shutting this down, we just make the problem worse. We just drive it underground, and the kids are doing this stuff completely unsupervised. You know, I think that every family should be having these discussions. I sat down with my kids and, you know, we put Snapchat up on Apple TV and we said, show us how this location thing works. Show us how many friends you have. Show us where your friends are…we’re not asking them to show us every single thing they post. I’m sure they post a ton of silly stuff that they don’t want to share with us, but we do want to understand what is…the appeal, you know, why are you on there?
We need to say to our kids…show me your world. This is part of your world. And it’s a part of the neighborhood that I haven’t been to yet, so just like if our kids were hanging around in a neighborhood that we didn’t know, we would say, hey, let’s go walk around the neighborhood together, so I feel comfortable that you’re over there…And oh, by the way, this looks like kind of a scary neighborhood and maybe you should avoid this part of town…that’s the way we get kids to engage with us and that’s why we get them to engage with this world, and know a lot of these risks that people are so concerned about.
HOST: In addition to that practical advice, maybe there’s something about the way that we approach technology that needs to change as well.
JAMES BRIDLE: First of all, to kind of step back from this very intense computational thinking, to understand that there’s a huge amount of the world now that can’t be computed in this kind of very direct way, and then it’s actually to look at how else we increase our agency.
Because the problem of believing that everything is computable is that you give all the power to the computers and to the people who make the computers and write the computer programmes, but actually the agency belongs to all of us.
HOST: So ultimately, we — as adults, parents, educators — need to be at a point where we can have more trust in what we personally know and in our own ability to think, and to pass that on to the next generation of technology users. Because technology can be revolutionary, but it still requires human guidance to ensure that it’s working for us, and not the other way around.
Nevertheless is a Storythings production — Series Producer is Renay Richardson. Executive Producers are Nathan Martin and Anjali Ramachandran. This episode was produced and written by Eileen Guo. Music and sound design by Jason Oberholtzer and Michael Simineli. Supported by Pearson and presented by me, Leigh Alexander.
This week’s unsung hero in STEM is cosmetic scientist Florence Adeoju, founder of awesome lipstick brand MDMFlow. Florence mixed her passion for chemistry, biology and art to create a brand for black women. To follow Florence on Twitter, go to @flowsphenom.