Meta Unveils New Parental Controls for Instagram, Virtual Reality
Facebook's parent company Meta said Wednesday it's rolling out new tools meant to give parents and guardians more control over how their teens use social media and virtual reality.
Instagram will let parents limit how much time their teens spend on the service.
The social media giant is releasing the new features first on Instagram, a photo-and-video service owned by Meta. Parents will be able to view the amount of time their teens spend on Instagram and set time limits. They'll also be able to get updates about accounts their teens follow and the accounts that follow their teens.
Instagram will have a Family Center where parents can access these tools in one place and view resources such as videos and articles about how to talk to their teens about their social media use. Instagram said the tools will be available in the US on Wednesday. The company plans to roll out the features globally in the coming months.
"This is just one step on a longer path -- our vision for Family Center is to eventually allow parents and guardians to help their teens manage experiences across Meta technologies, all from one central place," said Adam Mosseri, who heads Instagram.
Meta owns messaging apps WhatsApp and Facebook Messenger along with VR headset maker Oculus.
The release of more parental controls underscores how Meta is trying to respond to criticism it's not doing enough to protect the safety of young people on its services. There are a number of issues that parents are worried about on social media, including content about suicide and eating disorders. The blowback against Instagram escalated after Frances Haugen, a former Facebook product manager turned whistleblower, leaked a trove of internal documents last year.
The Wall Street Journal published a story partly based on the documents about how Instagram knows the app is "toxic for teen girls." Research presented in 2019 found that Instagram makes body image issues worse for one in three teen girls. Teens also said Instagram increased rates of anxiety and depression, The Journal reported.
Meta pushed back on the characterization of its research, noting that Instagram also connects teens with their family and friends. The leaked research prompted US lawmakers to call several congressional hearings on the topic.
In December, Mosseri testified before Congress for the first time. At that hearing, lawmakers also expressed concerns about Instagram creating a kids' version for children under 13. Even though Instagram paused the project, Mosseri stopped short of saying the idea is permanently off the table. Instagram says the project is meant to give parents more control over the social media usage of kids between the ages of 10 and 12 who may already be on the app.
Earlier this month, US President Joe Biden also urged Congress in his State of the Union address to "strengthen privacy protections, ban targeted advertising to children, demand tech companies stop collecting personal data on our children."
As Instagram rolls out more tools for supervision, parents will also have to balance a teen's need for privacy. Teens have found ways to circumvent parental controls, and parents don't always use the tools available to them.
For now, teens will need to activate the parental controls on the Instagram app. In June, parents will be able to initiate supervision of their teens' Instagram use through the app and the desktop site. Teens will still need to approve the request from their parents.
Instagram said it will also release more tools in the coming months, such as the ability for parents to set the hours their teens are allowed to browse Instagram. More than one guardian will also be able to supervise a teen's Instagram account.
Meanwhile, Meta has been pushing forward with plans to build the metaverse, virtual worlds where people will be able to work, socialize and shop.
VR, though, has the same problems found on social media, such as harassment and underage users. To use the Oculus VR headset, you're supposed to be at least 13 years old.
Meta said in May it will start rolling out more supervision tools in VR. The company will start to automatically block teens from downloading or purchasing apps in VR that are inappropriate for their age. Parents will have the ability to override the blocks, and teens will be able to request an override as well.
"Different teens have different maturity levels, and parents know their teens best. We also know that customizable controls, teen autonomy and adjustable settings are important to our community," Meta said in a blog post.
The Oculus mobile app will include a dashboard where parents can manage these tools. In April, parents will be able to prevent teens from accessing games that are inappropriate for their age by using an "unlock pattern" to lock access to these apps. Oculus users create this pattern as an extra level of security to prevent others from accessing their devices or saved passwords.
Parents will also be able to view all the apps their teens own, receive notifications when a purchase is made, view who is on their friends list on Oculus and also see how much time their teens spend in VR.
Make YouTube More Kid-Friendly: 5 Things Parents Can Control
There's a ton of content on YouTube, and kids can easily spend hours wading through the ocean of videos online. From Cocomelon to gamer streams like PewDiePie and TikTok compilations, YouTube can be a video-filled playground for a kid, especially when they have access to a tablet.
So, what are parents to do? Rather than let your young kid run amok with uninhibited freedom on YouTube, you can practice restraint by activating restrictions. While there are ways to use safety settings on your Android or Apple phones or TV streaming devices, YouTube has an assortment of features that go beyond the basic set of parental controls. You can curb screen time, monitor content, block videos and more.
Read more: Best Kids' Tablets for 2022
The separate YouTube Kids app is designed to cater to younger children, but the main YouTube app offers myriad other ways to oversee what your child is watching as well, even if they're older. Here are the main tools available to parents, arranged roughly in order of easiest to most involved -- from a single toggle to setting up and using separate apps.
'Dismiss' what you don't want your kid to see
YouTube's algorithm notoriously tracks your life and what you like to watch. Then it uses that information to recommend more related -- or surprisingly random -- videos to grab your attention. Imagine how many videos kids are bombarded with after taking in the latest Marvel trailer, for example. There's a way to shut that down.
YouTube encourages parents to dismiss content they don't want their kids to see by clicking the three dots on a video or channel and choosing "Not Interested." Such content won't pop up in the recommendation feed again. This function is available only on mobile devices, so you'll need to stay alert when your child is checking out something that's way too mature for them.
Unlike some of the other safety options, this one isn't on autopilot and requires you to be an active viewer. It's ideal if you're watching with your kids at the doctor's office, the beach or in an Uber, where you'll learn to be quick with the dismissive clicks when autoplay is rolling.
Restricted Mode
An option that's separate from the "Not Interested" function, YouTube's Restricted Mode is a single setting that lets you hide inappropriate content. It's available on all devices, including smart TVs and tablets. From a phone or tablet, click your account profile, navigate to General and then hit the gear icon for Settings. Toggle Restricted Mode to "On." Unless you're viewing YouTube on a smart TV, you must sign in to your Google account to flip on this setting.
You can use this feature to manage videos for a child of any age, including teens. Restricted Mode is also accessible if you run a supervised account for your child.
Supervise your child's account
Introduced on YouTube in 2021, supervised accounts have been available to parents in beta. They work exactly as they sound. Initially, these accounts were specifically for kids under 13, but the feature now includes older teens.
Content ratings are set by parents. Per YouTube, parents can choose from three content settings -- Explore, Explore More and Most of YouTube -- that correspond with ratings for tweens and teens. These settings apply to the main YouTube app and YouTube Music.
Explore is generally for viewers ages 9 and older. Some of the videos may contain low levels of violence, controlled substances or provocative language. YouTube also notes that educational videos about the human body or mental health are also in this grouping.
Videos under Explore More are geared toward the 13-and-over crowd. In addition to vlogs, music videos, gaming clips and other content, there may be mild profanity, violence, controlled substances and "non-explicit sexual references."
The app's Most of YouTube setting denotes videos that are for older teens, with the exception of those marked for users who are 18+ and a selection of filtered content.
Worried about ads? Supervised accounts prevent personalized ads from being shown, and certain other ads aren't allowed. Video creators are required to notify YouTube if their content includes product placements, however, so you may still occasionally see those.
According to YouTube, updated features for supervised accounts now include the ability to block specific channels, and supervised experiences have expanded to eligible smart TVs and YouTube Music on Android, iOS, the web and Google Assistant on smart displays.
Head to YouTube Kids
Prefer to keep it simple for kiddos under age 13? Just set them up with a YouTube Kids account. The app is entirely separate from the main YouTube app and houses age-appropriate content that's either curated by YouTube or you, the parent. Videos are chosen and filtered by a group of parents, child experts and YouTube engineers before hitting the platform, and you still have the ability to flag or block channels and individual videos.
Additionally, parents and caregivers with Google accounts can sign in and send content directly to their kids using the YouTube Kids app. But it's the parental controls that may give you real peace of mind.
After creating kids' profiles, you're able to choose content based on age or set it up where you must approve each video. With the latter, you can select collections or specific channels for your child's viewing experience, and kids are prevented from searching for anything on the app. Parents can also set timers or track their child's viewing history. If you want to go the extra mile, you can opt to pay for a premium account to lock out any ads.
Manage YouTube with Google's Family Link app
Google isn't just YouTube's parent company; it also owns Family Link, an app that lets users filter content and manage screen time for kids. It isn't required for using YouTube's parental controls, supervised accounts or other safety features, but it may make your life easier.
Family Link lets you connect new or existing Google accounts for your child. For kids under age 13, you can register a new account, and Family Link allows you to use parental controls and supervision on existing accounts for children of any age. However, with the latter, your kid must consent to parental supervision in order for you to activate it. From there, you can install and activate Family Link on their individual devices and manage YouTube from your own account. Should your kids deactivate supervision on their devices, you'll receive a notification, and Android devices will lock.
Family Link is available on the YouTube website, on Chromebooks, on devices running iOS 11 or later and on certain Android devices running version 5.0 or later. Download the app and you'll be walked through the steps to add your child's new or existing Google account. Once the accounts are linked and sign-up is complete, your kids can log in to their devices using Family Link. From there, you can set up restrictions or lock devices for each child by simply tapping their name in Family Link.
With your kid's YouTube app access tethered to yours, you can then set boundaries for screen time, content or bedtime. You'll be able to monitor what videos they watch and how often, whether it's on the regular YouTube app, YouTube Kids or YouTube Music.
For more, check out Netflix's parental control features and how you can set up and use your Wi-Fi router's parental controls. Plus, here are the best apps to download to keep your kids safe online.
TikTok Parents Are Taking Advantage of Their Kids. It Needs to Stop
Rachel Barkman's son started accurately identifying different species of mushroom at the age of 2. Together they'd go out to forage in the mossy woods near her home in Vancouver. When it came to occasionally sharing her son's enthusiasm and skill for picking mushrooms in her TikTok videos, she didn't think twice about it -- they captured a few cute moments, and many of her 350,000-plus followers seemed to like it.
That was until last winter, when a female stranger approached them in the forest, bent down and addressed her son, then 3, by name and asked if he could show her some mushrooms.
"I immediately went cold at the realization that I had equipped complete strangers with knowledge of my son that puts him at risk," Barkman said in an interview this past June.
This incident, combined with research into the dangers of sharing too much, made her reevaluate her son's presence online. Starting at the beginning of this year, she vowed not to feature his face in future content.
"My decision was fueled by a desire to protect my son, but also to protect and respect his identity and privacy, because he has a right to choose the way he is shown to the world," she said.
These kinds of dangers have cropped up alongside the rise in child influencers, such as 10-year-old Ryan Kaji of Ryan's World, who has almost 33 million subscribers, with various estimates putting his net worth in the multiple tens of millions of dollars. Increasingly, brands are looking to use smaller, more niche, micro- and nano-influencers, developing popular accounts on Instagram, TikTok and YouTube to reach their audiences. And amid this influencer gold rush there's a strong incentive for parents, many of whom are sharing photos and videos of their kids online anyway, to get in on the action.
The increase in the number of parents who manage accounts for their kids -- child influencers' parents are often referred to as "sharents" -- opens the door to exploitation or other dangers. With almost no industry guardrails in place, these parents find themselves in an unregulated wild west. They're the only arbiters of how much exposure their children get, how much work their kids do, and what happens to money earned through any content they feature in.
Instagram didn't respond to multiple requests for comment about whether it takes any steps to safeguard child influencers. A representative for TikTok said the company has a zero-tolerance approach to sexual exploitation and pointed to policies to protect accounts of users under the age of 16. But these policies don't apply to parents posting with or on behalf of their children. YouTube didn't immediately respond to a request for comment.
"When parents share about their children online, they act as both the gatekeeper -- the one tasked with protecting a child's personal information -- and as the gate opener," said Stacey Steinberg, a professor of law at the University of Florida and author of the book Growing Up Shared. As the gate opener, "they benefit, gaining both social and possibly financial capital by their online disclosures."
The reality is that some parents neglect the gatekeeping and leave the gate wide open for any internet stranger to walk through unchecked. And walk through they do.
Meet the sharents
Mollie is an aspiring dancer and model with an Instagram following of 122,000 people. Her age is ambiguous, but she appears to be anywhere from 11 to 13, meaning it's unlikely she's old enough to meet the social media platform's minimum age requirement. Her account is managed by her father, Chris, whose own account is linked in her bio, bringing things in line with Instagram's policy. (Chris didn't respond to a request for comment.)
You don't have to travel far on Instagram to discover accounts such as Mollie's, where grown men openly leer at preteen girls. Public-facing, parent-run accounts dedicated to dancers and gymnasts -- who are under the age of 13 and too young to have accounts of their own -- number in the thousands. (To protect privacy, we've chosen not to identify Mollie, which isn't her real name, or any other minors who haven't already appeared in the media.)
Parents use these accounts, which can have tens of thousands or hundreds of thousands of followers, to raise their daughters' profiles by posting photos of them posing and demonstrating their flexibility in bikinis and leotards. The comment sections are often flooded with sexualized remarks. A single, ugly word appeared under one group shot of several young girls in bikinis: "orgy."
Some parents try to contain the damage by limiting comments on posts that attract too much attention. The parent running one dancer account took a break from regular scheduling to post a pastel-hued graphic reminding other parents to review their followers regularly. "After seeing multiple stories and posts from dance photographers we admire about cleaning up followers, I decided to spend time cleaning," read the caption. "I was shocked at how many creeps got through as followers."
But "cleaning up" means engaging in a never-ending game of whack-a-mole to keep unwanted followers at bay, and it ignores the fact that you don't need to be following a public account to view the posts. Photos of children are regularly reposted on fan or aggregator accounts, over which parents have no control, and they can also be served up through hashtags or through Instagram's discovery algorithms.
The simple truth is that publicly posted content is anyone's for the taking. "Once public engagement happens, it is very hard, if not impossible, to really put meaningful boundaries around it," said Leah Plunkett, author of the book Sharenthood and a member of the faculty at Harvard Law School.
This concern is at the heart of the current drama concerning the TikTok account @wren.eleanor. Wren is an adorable blonde 3-year-old girl, and the account, which has 17.3 million followers, is managed by her mother, Jacquelyn, who posts videos almost exclusively of her child.
Concerned onlookers have pointed Jacquelyn toward comments that appear to be predatory, and have warned her that videos in which Wren is in a bathing suit, pretending to insert a tampon, or eating various foodstuffs have more watches, likes and saves than other content. They claim her reluctance to stop posting in spite of their warnings demonstrates she's prioritizing the income from her account over Wren's safety. Jacquelyn didn't respond to several requests for comment.
Last year, the FBI ran a campaign in which it estimated that there were 500,000 predators online every day -- and that's just in the US. Right now, across social platforms, we're seeing the growth of digital marketplaces that hinge on child exploitation, said Plunkett. She doesn't want to tell other parents what to do, she added, but she wants them to be aware that there's "a very real, very pressing threat that even innocent content that they put up about their children is very likely to be repurposed and find its way into those marketplaces."
Naivete vs. exploitation
When parent influencers started out in the world of blogging over a decade ago, the industry wasn't exploitative in the same way it is today, said Crystal Abidin, an academic from Curtin University who specializes in internet cultures. When you trace the child influencer industry back to its roots, what you find is parents, usually mothers, reaching out to one another to connect. "It first came from a place of care among these parent influencers," she said.
Over time, the industry shifted, centering on children more and more as advertising dollars flowed in and new marketplaces formed.
Education about the risks hasn't caught up, which is why people like Sarah Adams, a Vancouver mom who runs the TikTok account @mom.uncharted, have taken it upon themselves to raise the flag on those risks. "My ultimate goal is just have parents pause and reflect on the state of sharenting right now," she said.
But as Mom Uncharted, Adams is also part of a wider unofficial and informal watchdog group of internet moms and child safety experts shedding light on the often disturbing way in which some parents are, sometimes knowingly, exploiting their children online.
The troubling behavior uncovered by Adams and others suggests there's more than naivete at play -- specifically when parents sign up for and advertise services that let people buy "exclusive" or "VIP" access to content featuring their children.
Some parent-run social media accounts that Adams has found linked out to a site called SelectSets, which lets the parents sell photo sets of their children. One account offered sets with titles such as "2 little princesses." SelectSets has described the service as "a classy and professional" option for influencers to monetize content, allowing them to "avoid the stigma often associated with other platforms."
Over the last few weeks, SelectSets has gone offline and no owner could be traced for comment.
In addition to selling photos, many parent-run dancer accounts, Mollie's included, allow strangers to send the dancers swimwear and underwear from the dancers' Amazon wish lists, or money to "sponsor" them to "realize their dream" or support them on their "journeys."
While there's nothing technically illegal about anything these parents are doing, they're placing their children in a gray area that's not explicitly sexual but that many people would consider to be sexualized. The business model of using an Amazon wish list is one commonly embraced by online sugar babies who accept money and gifts from older men.
"Our Conditions of Use and Sale make clear that users of Amazon Services must be 18 or older or accompanied by a parent or guardian," said an Amazon spokesperson in a statement. "In rare cases where we are made aware that an account has been opened by a minor without permission, we close the account."
Adams says it's unlikely to be other 11-year-olds sending their pocket money to these girls so they attend their next bikini modeling shoot. "Who the fuck do you think is tipping these kids?" she said. "It's predators who are liking the way you exploit your child and giving them all the content they need."
Turning points
Plunkett distinguishes between parents who are casually sharing content that features their kids and parents who are sharing for profit, an activity she describes as "commercial sharenting."
"You are taking your child, or in some cases, your broader family's private or intimate moments, and sharing them digitally, in the hope of having some kind of current or future financial benefit," she said.
No matter the parent's hopes or intentions, any time children appear in public-facing social media content, that content has the potential to go viral, and when it does, parents have a choice to either lean in and monetize it or try to rein it in.
During Abidin's research -- in which she follows the changing activities of the same influencers over time -- she's found that many influencer parents reach a turning point. It can be triggered by something as simple as other children at school being aware of their child's celebrity or their child not enjoying it anymore, or as serious as being involved in a car chase while trying to escape fans (an occurrence recounted to Abidin by one of her research subjects).
One influencer, Katy Rose Pritchard, who has almost 92,000 Instagram followers, decided to stop showing her children's faces on social media this year after she discovered they were being used to create role-playing accounts. People had taken photos of her children that she'd posted and used them to create fictional profiles of children for personal gratification, which she said in a post made her feel "violated."
All these examples highlight the different kinds of threats sharents are exposing their children to. Plunkett describes three "buckets" of risk tied to publicly sharing content online. The first and perhaps most obvious are risks involving criminal and/or dangerous behavior, posing a direct threat to the child.
The second are indirect risks, where content posted featuring children can be taken, reused, analyzed or repurposed by people with nefarious motives. Consequences include anything from bullying to harming future job prospects to millions of people having access to children's medical information -- a common trope on YouTube is a video with a melodramatic title and thumbnail involving a child's trip to the hospital, in which influencer parents with sick kids will document their health journeys in blow-by-blow detail.
The third set of risks are probably the least talked about, but they involve potential harm to a child's sense of self. If you're a child influencer, how you see yourself as a person and your ability to develop into an adult is "going to be shaped and in some instances impeded by the fact that your parents are creating this public performance persona for you," said Plunkett.
Often children won't be aware of what this public persona looks like to the audience and how it's being interpreted. They may not even be aware it exists. But at some point, as happened with Barkman, the private world in which content is created and the public world in which it's consumed will inevitably collide. At that point, the child will be thrust into the position of confronting the persona that's been created for them.
"As kids get older, they naturally want to define themselves on their own terms, and if parents have overshared about them in public spaces, that can be difficult, as many will already have notions about who that child is or what that child may like," said Steinberg. "These notions, of course, may be incorrect. And some children may value privacy and wish their life stories were theirs -- not their parents -- to tell."
Savannah and Cole LaBrant have documented nearly everything about their children's lives.
This aspect of having their real-life stories made public is a key factor distinguishing children working in social media from children working in the professional entertainment industry, who usually play fictional roles. Many children who will become teens and adults in the next couple of decades will have to reckon with the fact that their parents put their most vulnerable moments on the internet for the world to see -- their meltdowns, their humiliation, their most personal moments.
One influencer family, the LaBrants, was forced to issue a public apology in 2019 after playing an April Fools' Day joke on their 6-year-old daughter, Everleigh. The family pretended they were giving her dog away, eliciting tears throughout the video. As a result, many viewers felt that her parents, Sav and Cole, had inflicted unnecessary distress on her.
In the past few months, parents who film their children during meltdowns to demonstrate how to calm them down have found themselves the subject of ire on parenting subreddits. Critics argue that it's unfair to post content of children when they're at their most vulnerable, as it shows a lack of respect for a child's right to privacy.
Privacy-centric parenting
Even the staunchest advocates of child privacy know and understand the parental instinct of wanting to share their children's cuteness and talent with the world. "Our kids are the things usually we're the most proud of, the most excited about," said Adams. "It is normal to want to show them off and be proud of them."
When Adams started her account two years ago, she said, her views were seen as more polarizing. But increasingly, people seem to relate to and share her concerns. Most of these are "average parents," naive to the risks they're exposing their kids to, but some are "commercial sharents" too.
Even though they don't always see eye to eye, the private conversations she's had with parents of children (she doesn't publicly call out anyone) with massive social media presences have been civil and productive. "I hope it opens more parents' eyes to the reality of the situation, because frankly this is all just a large social experiment," she said. "And it's being done on our kids. And that just doesn't seem like a good idea."
For Barkman, it's been "surprisingly easy, and hugely beneficial" to stop sharing content about her son. She's more present, and focuses only on capturing memories she wants to keep for herself.
"When motherhood is all consuming, it sometimes feels like that's all you have to offer, so I completely understand how we have slid into oversharing our children," she said. "It's a huge chunk of our identity and our hearts."
But Barkman recognizes the reality of the situation, which is that she doesn't know who's viewing her content and that she can't rely on tech platforms to protect her son. "We are raising a generation of children who have their entire lives broadcast online, and the newness of social media means we don't have much data on the impacts of that reality on children," she said. "I feel better acting with caution and letting my son have his privacy so that he can decide how he wants to be perceived by the world when he's ready and able."
TikTok Parents Are Taking Advantage of Their Kids. It Needs to Stop
Rachel Barkman's son started accurately identifying different species of mushroom at the age of 2. Together they'd go out into the mossy woods near her home in Vancouver and forage. She didn't think twice about occasionally sharing her son's enthusiasm and skill for picking mushrooms in her TikTok videos -- they captured a few cute moments, and many of her 350,000-plus followers seemed to like it.
That was until last winter, when a female stranger approached them in the forest, bent down and addressed her son, then 3, by name and asked if he could show her some mushrooms.
"I immediately went cold at the realization that I had equipped complete strangers with knowledge of my son that puts him at risk," Barkman said in an interview this past June.
This incident, combined with research into the dangers of sharing too much, made her reevaluate her son's presence online. Starting at the beginning of this year, she vowed not to feature his face in future content.
"My decision was fueled by a desire to protect my son, but also to protect and respect his identity and privacy, because he has a right to choose the way he is shown to the world," she said.
These kinds of dangers have cropped up alongside the rise of child influencers, such as 10-year-old Ryan Kaji of Ryan's World, who has almost 33 million subscribers and whose net worth is estimated in the tens of millions of dollars. Increasingly, brands are looking to smaller, more niche micro- and nano-influencers, who build popular accounts on Instagram, TikTok and YouTube, to reach their audiences. And amid this influencer gold rush there's a strong incentive for parents, many of whom are sharing photos and videos of their kids online anyway, to get in on the action.
The increase in the number of parents who manage accounts for their kids -- child influencers' parents are often referred to as "sharents" -- opens the door to exploitation or other dangers. With almost no industry guardrails in place, these parents find themselves in an unregulated wild west. They're the only arbiters of how much exposure their children get, how much work their kids do, and what happens to money earned through any content they feature in.
Instagram didn't respond to multiple requests for comment about whether it takes any steps to safeguard child influencers. A representative for TikTok said the company has a zero-tolerance approach to sexual exploitation and pointed to policies to protect accounts of users under the age of 16. But these policies don't apply to parents posting with or on behalf of their children. YouTube didn't immediately respond to a request for comment.
"When parents share about their children online, they act as both the gatekeeper -- the one tasked with protecting a child's personal information -- and as the gate opener," said Stacey Steinberg, a professor of law at the University of Florida and author of the book Growing Up Shared. As the gate opener, "they benefit, gaining both social and possibly financial capital by their online disclosures."
The reality is that some parents neglect the gatekeeping and leave the gate wide open for any internet stranger to walk through unchecked. And walk through they do.
Meet the sharents
Mollie is an aspiring dancer and model with an Instagram following of 122,000 people. Her age isn't stated, but she appears to be between 11 and 13, meaning it's unlikely she's old enough to meet the platform's minimum age requirement of 13. Her account is managed by her father, Chris, whose own account is linked in her bio, bringing things in line with Instagram's policy. (Chris didn't respond to a request for comment.)
You don't have to travel far on Instagram to discover accounts such as Mollie's, where grown men openly leer at preteen girls. Public-facing, parent-run accounts dedicated to dancers and gymnasts -- who are under the age of 13 and too young to have accounts of their own -- number in the thousands. (To protect privacy, we've chosen not to identify Mollie, which isn't her real name, or any other minors who haven't already appeared in the media.)
Parents use these accounts, which can have tens of thousands or hundreds of thousands of followers, to raise their daughters' profiles by posting photos of them posing and demonstrating their flexibility in bikinis and leotards. The comment sections are often flooded with sexualized remarks. A single, ugly word appeared under one group shot of several young girls in bikinis: "orgy."
Some parents try to contain the damage by limiting comments on posts that attract too much attention. The parent running one dancer account took a break from regular scheduling to post a pastel-hued graphic reminding other parents to review their followers regularly. "After seeing multiple stories and posts from dance photographers we admire about cleaning up followers, I decided to spend time cleaning," read the caption. "I was shocked at how many creeps got through as followers."
But "cleaning up" means engaging in a never-ending game of whack-a-mole to keep unwanted followers at bay, and it ignores the fact that you don't need to be following a public account to view the posts. Photos of children are regularly reposted on fan or aggregator accounts, over which parents have no control, and they can also be served up through hashtags or through Instagram's discovery algorithms.
The simple truth is that publicly posted content is anyone's for the taking. "Once public engagement happens, it is very hard, if not impossible, to really put meaningful boundaries around it," said Leah Plunkett, author of the book Sharenthood and a member of the faculty at Harvard Law School.
This concern is at the heart of the ongoing controversy surrounding the TikTok account @wren.eleanor. Wren is an adorable blonde 3-year-old girl, and the account, which has 17.3 million followers, is managed by her mother, Jacquelyn, who posts videos almost exclusively of her child.
Concerned onlookers have pointed Jacquelyn toward comments that appear to be predatory, and have warned her that videos in which Wren is in a bathing suit, pretending to insert a tampon, or eating various foodstuffs draw more views, likes and saves than her other content. They claim her reluctance to stop posting in spite of these warnings demonstrates she's prioritizing the income from her account over Wren's safety. Jacquelyn didn't respond to several requests for comment.
Last year, the FBI ran a campaign in which it estimated that there were 500,000 predators online every day -- and that's just in the US. Right now, across social platforms, we're seeing the growth of digital marketplaces that hinge on child exploitation, said Plunkett. She doesn't want to tell other parents what to do, she added, but she wants them to be aware that there's "a very real, very pressing threat that even innocent content that they put up about their children is very likely to be repurposed and find its way into those marketplaces."
Naivete vs. exploitation
When parent influencers started out in the world of blogging over a decade ago, the industry wasn't exploitative in the same way it is today, said Crystal Abidin, an academic from Curtin University who specializes in internet cultures. When you trace the child influencer industry back to its roots, what you find is parents, usually mothers, reaching out to one another to connect. "It first came from a place of care among these parent influencers," she said.
Over time, the industry shifted, centering on children more and more as advertising dollars flowed in and new marketplaces formed.
Education about the risks hasn't caught up, which is why people like Sarah Adams, a Vancouver mom who runs the TikTok account @mom.uncharted, have taken it upon themselves to raise the flag on those risks. "My ultimate goal is just have parents pause and reflect on the state of sharenting right now," she said.
But as Mom Uncharted, Adams is also part of a wider unofficial and informal watchdog group of internet moms and child safety experts shedding light on the often disturbing way in which some parents are, sometimes knowingly, exploiting their children online.
The troubling behavior uncovered by Adams and others suggests there's more than naivete at play -- specifically when parents sign up for and advertise services that let people buy "exclusive" or "VIP" access to content featuring their children.
Some parent-run social media accounts that Adams has found linked out to a site called SelectSets, which lets the parents sell photo sets of their children. One account offered sets with titles such as "2 little princesses." SelectSets has described the service as "a classy and professional" option for influencers to monetize content, allowing them to "avoid the stigma often associated with other platforms."
In recent weeks SelectSets has gone offline, and no owner could be traced for comment.
In addition to selling photos, many parent-run dancer accounts, Mollie's included, allow strangers to send the dancers swimwear and underwear from the dancers' Amazon wish lists, or money to "sponsor" them to "realize their dream" or support them on their "journeys."
While there's nothing technically illegal about anything these parents are doing, they're placing their children in a gray area that's not explicitly sexual but that many people would consider to be sexualized. The business model of using an Amazon wish list is one commonly embraced by online sugar babies who accept money and gifts from older men.
"Our Conditions of Use and Sale make clear that users of Amazon Services must be 18 or older or accompanied by a parent or guardian," said an Amazon spokesperson in a statement. "In rare cases where we are made aware that an account has been opened by a minor without permission, we close the account."
Adams says it's unlikely to be other 11-year-olds sending their pocket money to these girls so they attend their next bikini modeling shoot. "Who the fuck do you think is tipping these kids?" she said. "It's predators who are liking the way you exploit your child and giving them all the content they need."
Turning points
Plunkett distinguishes between parents who are casually sharing content that features their kids and parents who are sharing for profit, an activity she describes as "commercial sharenting."
"You are taking your child, or in some cases, your broader family's private or intimate moments, and sharing them digitally, in the hope of having some kind of current or future financial benefit," she said.
No matter the parent's hopes or intentions, any time children appear in public-facing social media content, that content has the potential to go viral, and when it does, parents have a choice to either lean in and monetize it or try to rein it in.
During Abidin's research -- in which she follows the changing activities of the same influencers over time -- she's found that many influencer parents reach a turning point. It can be triggered by something as simple as other children at school being aware of their child's celebrity or their child not enjoying it anymore, or as serious as being involved in a car chase while trying to escape fans (an occurrence recounted to Abidin by one of her research subjects).
One influencer, Katy Rose Pritchard, who has almost 92,000 Instagram followers, decided to stop showing her children's faces on social media this year after she discovered they were being used to create role-playing accounts. People had taken photos of her children that she'd posted and used them to create fictional profiles of children for personal gratification, which she said in a post made her feel "violated."
All these examples highlight the different kinds of threats sharents are exposing their children to. Plunkett describes three "buckets" of risk tied to publicly sharing content online. The first and perhaps most obvious are risks involving criminal and/or dangerous behavior, posing a direct threat to the child.
The second bucket holds indirect risks, where content featuring children can be taken, reused, analyzed or repurposed by people with nefarious motives. Consequences range from bullying to damaged future job prospects to millions of people having access to a child's medical information -- a common trope on YouTube is the video with a melodramatic title and thumbnail about a child's trip to the hospital, in which influencer parents with sick kids document their health journeys in blow-by-blow detail.
The third set of risks is probably the least talked about: potential harm to a child's sense of self. If you're a child influencer, how you see yourself as a person and your ability to develop into an adult is "going to be shaped and in some instances impeded by the fact that your parents are creating this public performance persona for you," said Plunkett.
Often children won't be aware of what this public persona looks like to the audience and how it's being interpreted. They may not even be aware it exists. But at some point, as happened with Barkman, the private world in which content is created and the public world in which it's consumed will inevitably collide. At that point, the child will be thrust into the position of confronting the persona that's been created for them.
"As kids get older, they naturally want to define themselves on their own terms, and if parents have overshared about them in public spaces, that can be difficult, as many will already have notions about who that child is or what that child may like," said Steinberg. "These notions, of course, may be incorrect. And some children may value privacy and wish their life stories were theirs -- not their parents' -- to tell."
Savannah and Cole LaBrant have documented nearly everything about their children's lives.
Jim Spellman/WireImage
This aspect of having their real-life stories made public is a key factor distinguishing children working in social media from children working in the professional entertainment industry, who usually play fictional roles. Many children who will become teens and adults in the next couple of decades will have to reckon with the fact that their parents put their most vulnerable moments on the internet for the world to see -- their meltdowns, their humiliation, their most personal moments.
One influencer family, the LaBrants, were forced to issue a public apology in 2019 after they played an April Fools' Day joke on their 6-year-old daughter Everleigh. The family pretended they were giving her dog away, eliciting tears throughout the video. Many viewers felt that her parents, Sav and Cole, had inflicted unnecessary distress on her.
In the past few months, parents who film their children during meltdowns to demonstrate how to calm them down have found themselves the subject of ire on parenting subreddits. Their critics argue that it's unfair to post content of children when they're at their most vulnerable, as it shows a lack of respect for a child's right to privacy.
Privacy-centric parenting
Even the staunchest advocates of child privacy know and understand the parental instinct of wanting to share their children's cuteness and talent with the world. "Our kids are the things usually we're the most proud of, the most excited about," said Adams. "It is normal to want to show them off and be proud of them."
When Adams started her account two years ago, her views were seen as more polarizing, she said. But increasingly, people seem to relate to and share her concerns. Most of her audience are "average parents," naive to the risks they're exposing their kids to, but some "commercial sharents" follow her too.
Even though they don't always see eye to eye, the private conversations she's had with parents of children with massive social media presences (she doesn't publicly call anyone out) have been civil and productive. "I hope it opens more parents' eyes to the reality of the situation, because frankly this is all just a large social experiment," she said. "And it's being done on our kids. And that just doesn't seem like a good idea."
For Barkman, it's been "surprisingly easy, and hugely beneficial" to stop sharing content about her son. She's more present, and focuses only on capturing memories she wants to keep for herself.
"When motherhood is all consuming, it sometimes feels like that's all you have to offer, so I completely understand how we have slid into oversharing our children," she said. "It's a huge chunk of our identity and our hearts."
But Barkman recognizes the reality of the situation, which is that she doesn't know who's viewing her content and that she can't rely on tech platforms to protect her son. "We are raising a generation of children who have their entire lives broadcast online, and the newness of social media means we don't have much data on the impacts of that reality on children," she said. "I feel better acting with caution and letting my son have his privacy so that he can decide how he wants to be perceived by the world when he's ready and able."
Facebook Portal Go: A battery-powered smart display that doesn't make sense
Facebook, despite its ongoing privacy troubles and Congressional inquiries, is pressing forward with hardware. The second generation of its Portal Plus smart display and the new, battery-powered Portal Go are now available. When I heard the pitch from Facebook representatives for a battery-powered smart display with a camera shutter and a handle, I was optimistic. At least, I thought, Facebook was doing something new -- a portable smart display.
Molly Price/CNET
After a week of testing, it's hard for me to recommend the Portal Go. Limited apps and unintuitive quirks combined with Facebook's overarching privacy issues are enough for me to steer everyone in another direction. If the $200 Portal Go had wowed me in any facet of my evaluation, I would give it its due praise. It just didn't. It's expensive and unnecessary. Still, there are interesting points and features. If you're going to skip out on the latest tech, you should know why. Let's dive into the details.
The obvious question
Like most folks, I wondered "Why not just buy a tablet?" After all, Portal has Alexa built-in but is far from focused on the smart home like the Nest Hub or an Echo Show. It's billed mostly as a chatting/meeting device. I asked Facebook directly and here's its spokesperson's response in full:
The first major difference is that Portal is focused on connecting people and it's built for the home, so we've made design choices that optimize for that. Portal also has a Smart Camera with an ultrawide field of view which is a differentiator for video calling – group calls feel more natural and calls tend to last longer. And with Facebook Assistant and Alexa Built-in, Portal enables hands-free functionality which is increasingly expected in places like the kitchen where your hands are often busy with other tasks.
We spoke a lot about Portal+ as a great work companion device, but Portal Go is also great as a second screen for work – you can free up your laptop so that you can connect with colleagues and be more productive by taking notes or reviewing other materials. And the audio on Portal is better – it's tuned for video calling and a great speaker when you are not on calls.
And, Portal is meant to be used as a communal device that is available for everyone in the home. Tablets are really designed as personal devices and aren't typically used in ways where household members can depend on them being available. Household mode is an example of how we are building on this – a computer for the house that can be shared, and anyone can use to make a call and deliver a more personalized experience through our support for multiple users.
There's a fair argument here for Portals being a better shared device than a tablet. Most parents I know don't want their kid handling their personal tablet, but that's also why so many kid-friendly models are out there. Big rubber cases and kid-friendly apps with parental control options deliver everything a family wants and needs in a portable interface.
The Portal Go includes settings to allow multiple users with customized experiences, so kids won't accidentally Zoom your boss. Not all smart displays include kid profiles in that manner, but many tablets do. Samsung's line of Galaxy Tabs has a Samsung Kids mode. iPads don't deliver as much here, but there is a Guided Access mode that locks out the touchscreen so you can plop on that one episode of Paw Patrol and go about your business.
The Amazon Fire HD 10 Kids Edition is made just for kids.
Amazon
Spec-wise, both the Samsung Galaxy Tab A7 and the Amazon Fire HD 10 have a better screen resolution than the Portal (2,000x1,200 and 1,920x1,200 pixels respectively). The ability to download Amazon and Google smart home apps makes them decent tools for operating a smart home. Camera specs vary across tablets and displays, but the Portal Go isn't better than most of them when it comes to megapixels or field of view.
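Raw resolution only tells part of the story, since these screens differ in size; pixel density (pixels per inch) is the fairer comparison. A quick back-of-the-envelope sketch, using figures quoted in this review's spec comparison (treating resolution and screen size as the only inputs; these are review-quoted numbers, not manufacturer-verified ones):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Resolution and screen-size figures taken from the spec comparison below.
devices = {
    "Portal Go":         (1280, 800, 10.1),
    "Amazon Fire HD 10": (1920, 1200, 10.1),
    "iPad (2021)":       (2160, 1620, 10.2),
}

for name, (w, h, d) in devices.items():
    print(f"{name}: {ppi(w, h, d):.0f} ppi")
```

By this rough measure, the Fire HD 10 and iPad pack roughly half again as many pixels per inch as the Portal Go on a similar-size screen, which is why the resolution gap is visible in practice.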
Design
Here's where Facebook earns a few points. I like the idea of a portable smart display; the company is trying something new. It's not a tablet and it doesn't look like one. A built-in handle on the back of the Portal Go makes it easy to pick up, and at three pounds it's light enough to carry around. The thick base tapers toward the top for a sturdy but still streamlined look.
The back of the Portal Go features a built-in handle.
Molly Price/CNET
Fabric covers the body of the display, which houses two 5-watt full-range speakers and one 20-watt woofer. The Portal Go puts out plenty of sound for casual music and video streaming, and I liked the intuitive physical volume buttons on the top of the device.
It might not beat out a tablet, especially if you buy a folio cover and prop your tablet up for viewing. It doesn't rotate like the Show 10, or even tilt like its larger sibling, the Portal Plus. Still, it's nice to look at, and I enjoyed toting it around the house while I did chores and caught up on my favorite YouTube channels via the Browser bookmark. It freed up my phone, tablet and laptop.
Features
The app selection is the limiting factor for me. The Portal Go just doesn't do as much as any other tablet or smart display on the market. There are so few apps (24) that I have room to list them all here. Imagine if I tried to do that with a tablet, Amazon Echo skills or Google Nest-compatible apps and devices. We'd need a bigger internet. Here's the entire list, in alphabetical order:
Alexa
BlueJeans
Browser
Calendar
CBS News
Deezer
Facebook Live
Food Network
GoToMeeting
iHeartRadio
Newsy
Pandora
Photo Booth
Plex
Red Bull TV
SideChef
Spotify
Story Time
Tidal Music
Watch
Webex Meetings
Workplace
Workplace Live
Zoom
What's really odd here is what's missing. There's no native Facebook, Instagram, Messenger or WhatsApp app you can click on to open the platform. Features from those apps are built into various parts of the Portal experience. You can link your Instagram account to display your feed photos on the Portal Go's ambient screen. Messenger chat rooms can be created in the Portal Contacts app. You can use WhatsApp to set up the Portal instead of Facebook (but it will disable the "Hey, Portal" voice integration).
Facebook representatives told me that's because, "The core use for Portal is video calling and to make it easier for people to connect with their family, friends and colleagues. You can use the web browser on Portal to access your favorite Facebook apps. People have other devices they typically use for browsing Facebook, Instagram, etc."
That makes sense, I suppose. I'm not likely to scroll Instagram on a display that isn't small and handheld, or hunt and peck on a vertical touchscreen to reply to WhatsApp messages. Just be warned if you think this device is a one-stop shop for all your social apps -- it isn't.
Facebook could choose to add more apps with future updates. Other smart display makers have certainly done so in the past. It's possible that this list will expand. However, as it stands today, these are all your options for apps on the Portal Go.
Video chatting
You can video chat through Zoom, Webex, BlueJeans, GoToMeeting and Workplace. Facebook told me that Microsoft Teams support is in the works for later this year. Video calling your contacts happens through the Contacts app, where you'll create a Messenger room to chat. You can add up to seven people to your room, for a total of eight including yourself. The contacts available here are pulled from the WhatsApp or Facebook account you used to set up the Portal. The friends and family you call don't need to have a Portal device.
The Portal Go's AR camera and add-ins like Story Time are the big selling points here. Tapping the Experiences button in your video calls can add fun filters and special effects. There you'll find the same effects and filters from the Photo Booth app, as well as access to Story Time, Photo Casting and the Watch Together feature for streaming Facebook Watch content.
A scene from the Dr. Seuss book There's a Wocket in My Pocket!
Molly Price/CNET
Open Story Time during a call and you'll find dozens of fun stories with filters and special effects that interact with the storyteller on camera. Content includes Dr. Seuss, Pete the Cat, Llama Llama, Todd Parr and Jon Burgerman, as well as classic stories like the Three Little Pigs, Itsy Bitsy Spider and Little Red Riding Hood. It really is fun and interactive, and I wish it were available on better devices.
Alexa on Portal
Alexa is built into the Portal Go. You can access it via voice or from the Alexa app. In the app, a familiar Alexa dashboard displays your devices and has menu options for music, devices and more. Some Easter egg features like Alexa's recent Halloween routine with special effects and sounds didn't work on the Portal Go or Plus. The correct screen appeared, but the audio glitched and skipped most of the content. That's a small thing, but something you wouldn't encounter on an Amazon display.
Like previous models, the Portal Go has both "Hey, Portal" and Alexa capability. It certainly doesn't simplify anything and it sometimes leaves you wondering, "Wait, which assistant am I supposed to ask for this?"
The Alexa dashboard on the Portal Go shows options for using the assistant.
Molly Price/CNET
"Hey, Portal" commands control things like calling, playing music, putting the device to sleep and even some general knowledge queries. That means there's overlap between "Hey, Portal" (you can also say "Hey, Facebook") and Alexa, which is primarily there to answer general questions, set reminders, control smart home devices and handle all the normal Alexa things you'd expect.
The last thing a smart device should do is prompt more thinking or decision making when you're trying to use it. The split personality here doesn't feel intuitive enough to be useful.
Privacy
Adjacent to any Facebook hardware is the ongoing saga of Facebook's data and privacy failings. You can dig deeper into the Portal privacy features here, but these are the big takeaways:
Portal devices are listening for the "Hey, Portal" wake word. If it hears that phrase, Portal sends a short transcript and recording of the voice interaction to Facebook. Stored "Hey, Portal" and "Hey, Facebook" voice interactions are kept on Facebook's servers for up to three years, while "false wakes" are deleted within 90 days of detection.
You can review and delete voice interactions recorded and transcribed from the "Hey, Portal" prompt. You can also choose to disable voice interaction recording altogether, though it's on by default. If you only connect your Portal to WhatsApp and not Facebook, "Hey, Portal" won't work at all, due to WhatsApp's end-to-end encryption.
You can disable the camera and microphone by tapping the mute button on the top of the device or sliding the camera's physical cover. A red light indicates those features are disabled. The Portal Go's camera uses AI technology that is processed locally, not on Facebook servers. The Portal Go's camera does not use facial recognition.
A physical shutter, red light and on-screen notification let you know the camera and microphones are off.
Molly Price/CNET
The better options
There are two ways you could approach Portal alternatives. If you want streaming, a wide selection of apps and social media, but liked the portability of the Portal Go, get a tablet. Nearly any model from Samsung, Amazon or Apple will give you more capability, processing power and options for streaming and social media platforms. Throw in an Apple Pencil, S Pen or generic stylus and you've got even more flexibility.
If you're hoping for an Alexa-focused smart home device, the Echo Show 8 and Echo Show 10 are both fantastic (and often more affordable) options with a broader selection of apps for streaming and video chatting.
The following charts compare popular tablets and smart displays with the Portal Plus and Portal Go.
Portals vs. tablets
Price: Portal Go $200; Portal Plus $350; Galaxy Tab A7 $650; Amazon Fire HD 10 $150; iPad (2021) from $329
Screen size: Portal Go 10.1 inches; Portal Plus 14 inches; Galaxy Tab A7 11 inches; Amazon Fire HD 10 10.1 inches; iPad (2021) 10.2 inches
Resolution: Portal Go 1,280x800; Portal Plus 2,160x1,440; Galaxy Tab A7 2,560x1,600; Amazon Fire HD 10 1,920x1,200; iPad (2021) 2,160x1,620
Camera: Portal Go 12 megapixels; Portal Plus 12 megapixels; Galaxy Tab A7 8 megapixels front, 13 and 5 megapixels rear; Amazon Fire HD 10 2 megapixels front, 5 megapixels rear; iPad (2021) 12 megapixels front, 8 megapixels rear
Physical shutter: Portal Go and Portal Plus yes; Galaxy Tab A7, Amazon Fire HD 10 and iPad (2021) no
Field of view: Portal Go 125 degrees; Portal Plus 131 degrees; Galaxy Tab A7 80 degrees; Amazon Fire HD 10 not listed; iPad (2021) 122 degrees
Portable: all but the Portal Plus
Kid mode: Portal Go, Portal Plus, Galaxy Tab A7 and Amazon Fire HD 10 yes; iPad (2021) limited (Guided Access)
Video apps: Portal Go and Portal Plus Zoom, Webex, BlueJeans, GoToMeeting and Workplace; tablets, all available from their respective app stores
Music apps: Portal Go and Portal Plus Deezer, Pandora, Spotify, Tidal and iHeartRadio; tablets, all available from their respective app stores
Streaming apps: Portal Go and Portal Plus Facebook Watch, Red Bull TV, Food Network, Facebook Live, CBS News and Plex; tablets, all available from their respective app stores
Smart home compatibility: Portal Go and Portal Plus Alexa; Galaxy Tab A7 Alexa app and Google Assistant app; Amazon Fire HD 10 Alexa; iPad (2021) Siri, Alexa app and Google Assistant app
Internet browser: yes on all five
Portals vs. smart displays
Price: Portal Go $200; Portal Plus $350; Nest Hub Max $229; Echo Show 8 $100; Echo Show 10 $250
Screen size: Portal Go 10.1 inches; Portal Plus 14 inches; Nest Hub Max 10 inches; Echo Show 8 8 inches; Echo Show 10 10.1 inches
Resolution: Portal Go 1,280x800; Portal Plus 2,160x1,440; Nest Hub Max, Echo Show 8 and Echo Show 10 1,280x800
Camera: Portal Go 12 megapixels; Portal Plus 12 megapixels; Nest Hub Max 6.5 megapixels; Echo Show 8 13 megapixels; Echo Show 10 13 megapixels
Physical shutter: all but the Nest Hub Max
Field of view: Portal Go 125 degrees; Portal Plus 131 degrees; Nest Hub Max 127 degrees; Echo Show 8 110 degrees; Echo Show 10 110 degrees
Portable: Portal Go only
Kid mode: all but the Nest Hub Max
Video apps: Portal Go and Portal Plus Zoom, Webex, BlueJeans, GoToMeeting and Workplace; Nest Hub Max Zoom and Google Duo; Echo Show 8 and Echo Show 10 Zoom, Skype and Chime
Music apps: Portal Go and Portal Plus Deezer, Pandora, Spotify, Tidal and iHeartRadio; Nest Hub Max Apple Music, Spotify, YouTube Music, Pandora, iHeartRadio and Deezer; Echo Show 8 and Echo Show 10 Apple Music, Spotify, Pandora, Amazon Music, Audible, SiriusXM, Headspace, NPR and more
Streaming apps: Portal Go and Portal Plus Facebook Watch, Red Bull TV, Food Network, Facebook Live, CBS News and Plex; Nest Hub Max Netflix, YouTube, CW, HBO Now, Starz, Red Bull TV and Crackle; Echo Show 8 and Echo Show 10 Netflix, Hulu, Amazon Prime, Food Network, Red Bull TV and Tubi