YouTube CEO Susan Wojcicki (left) and David Prager in 2013. Photo courtesy TechCrunch.

Parenting in the age of YouTube Kids has its advantages – but it also gives parents like me new reasons to worry.
My 6-year-old daughter loves watching YouTube videos. My husband and I joke that she learned her ABCs and how to read from them, but this is probably not far from the truth. She knew all her letters before she turned 2, thanks to a 50-minute video she watched on repeat. (I nearly memorized it too.) Overall, my family has had good experiences with the YouTube Kids app as a learning tool, but we are still mindful of what our daughter watches.
A recent article in The Wall Street Journal discussed the possibility that YouTube would make sweeping changes to its platform, especially with regard to kids’ content, in light of a Federal Trade Commission probe. The FTC is investigating whether Alphabet Inc. – the parent company of YouTube’s owner, Google – has been collecting data from children under the age of 13 without parental consent. The groups that complained to the FTC also alleged that YouTube exposed children to inappropriate content.
With the FTC’s eyes already turned toward YouTube, the platform is considering two significant changes: moving all content related to kids age 13 and under onto the YouTube Kids platform, and disabling the autoplay feature on YouTube Kids so that a new video will not automatically start after the previous one ends. Autoplay has been a hallmark of YouTube: the recommended videos that queue up when a viewer’s selection ends have increased watch time immensely. But critics say this recommendation system has steered underage viewers toward inappropriate content, even when they started with innocuous fare.
Since its founding in February 2005, YouTube has grown into a media juggernaut; according to The Wall Street Journal, the company has said users watch a total of 1 billion hours of content each day. The proposed changes would be some of the biggest in the platform’s history, and they would likely cost the service viewing time – and the associated ad revenue. The technology podcast Reply All has reported that YouTube generates an estimated $10 billion in annual revenue, and kids’ programming almost certainly accounts for a large portion of that total.
Both Google Chief Executive Sundar Pichai and YouTube CEO Susan Wojcicki have admitted that YouTube has been slow, and sometimes unprepared, to handle inappropriate videos. For instance, footage of the mass shootings in New Zealand this past March was uploaded to the platform tens of thousands of times, and it took nearly 24 hours to remove the bulk of those copies. Earlier this year, YouTube disabled comments on many videos featuring young children after news broke that pedophiles were using the comment sections to point one another toward such content. Some observers criticized how long it took YouTube to act, but the decision represented a step toward more sweeping policy changes for the platform.
Wojcicki and her team have recently begun adjusting the rules that govern which videos the platform will promote. Under the new system, videos promoting hate speech or illegal and otherwise inappropriate content will sink beneath other uploads, making them harder for users to stumble across passively. As Wojcicki put it in internal communications with her team, “It’s not about free speech, it’s about free reach.” While YouTube staff haven’t gone so far as to consistently remove all offensive content, they have elected to bar certain creators from promoting their videos.
Despite YouTube’s recent efforts to fix its stumbles, many parents feel the company has not done enough. Dr. Free N. Hess wrote a blog post in February titled “YouTube Kids. There is still a HUGE problem” (emphasis hers). Hess posted screenshots and full videos she found on YouTube Kids that depict violence, self-harm, sexual innuendo and human trafficking. The uploaders use popular games such as Minecraft, or anime-style cartoons, that appear innocuous to the viewer but hide dangerous and inappropriate content within. Hess, a pediatric emergency medicine physician, has noticed an uptick in suicide attempts among children over the past few years, and some of the patients she treated told her they learned their self-harm techniques from YouTube videos. Some of the videos she found had been flagged for removal as much as eight months earlier, yet she could still view them when she posed as a child on YouTube Kids.
Even videos featuring Mickey Mouse, Queen Elsa of “Frozen” and other Disney characters that appeal to children as young as 2 are not safe. In 2017, videos surfaced of Mickey Mouse lying in a pool of blood while Minnie Mouse looks on, horrified. There is also a disturbing Peppa Pig video in which a dentist abuses Peppa, sadistically pulling out her teeth. Adult viewers can see that these videos are crudely drawn and clearly not sanctioned by the characters’ original creators. Older children may be able to recognize that a video doesn’t seem right and alert their parents, but younger children won’t spot the inappropriate content and will watch their beloved character without a second thought. Even if they don’t fully understand what they are seeing, the messages in these videos can still be damaging.
YouTube acknowledges the uphill battle it faces in controlling uploaded content. While the company is taking proactive steps toward identifying the issues, it is unlikely to resolve them fully for years. Disabling autoplay on YouTube Kids alone would require changes to the site’s infrastructure – a costly and sweeping undertaking.
My husband and I are among the first generation of parents in the social media era. We didn’t grow up with social media and the myriad messaging apps and video sites to which our kids have access. This generation faces a new frontier, and it can be difficult to remain vigilant about every bit of information that bombards our children on these sites. My husband and I allow our daughters to watch YouTube videos, but we do our best to keep one ear open for anything that sounds alarming. (As I type this post, I can hear Peppa Pig and friends in the background.) We also ask our daughters about what they are watching and keep an open dialogue about what they see. We have already identified a few YouTube channels we don’t want them watching, as well as online games we don’t want them playing. We do our best to keep our daughters safe without restricting these websites and services entirely, because we believe they can be excellent learning tools when used correctly.
Until YouTube gets a handle on the situation and makes the changes necessary to keep its Kids platform safer, that is the only option we have.
Posted by Aline Pitney