
Facebook Rules For Groups


Showing posts sorted by relevance for query Facebook Rules For Groups.

Facebook Suspends Rules to Allow Some Calls for Violence Against Russian Invaders

What's happening

Facebook's parent company Meta said it's temporarily allowing some violent content against Russian invaders, making an unusual exemption to its rules against hate speech.

Why it matters

The move is already escalating tensions between Meta and Russia. Roskomnadzor, the country's telecommunications agency, said Friday it's restricting Instagram, a photo-and-video service owned by Meta. Russia's Investigative Committee is opening a criminal investigation against Meta.

What's next

Russia might take further action against Meta as it moves forward with the criminal case against the social media giant. The company also owns messaging app WhatsApp, though no restrictions against that service have been announced.

Facebook parent company Meta is setting aside its rules and allowing some violent speech against Russian invaders, saying it views these remarks as political speech. 

"As a result of the Russian invasion of Ukraine we have temporarily made allowances for forms of political expression that would normally violate our rules like violent speech such as 'death to the Russian invaders.' We still won't allow credible calls for violence against Russian civilians," Meta spokesman Andy Stone said in a tweet Thursday.

The rare exemption to the company's hate speech rules, which bar people from posting content, including violent content, that targets a group of people, shows how the world's largest social network is moderating content about Russia's invasion of Ukraine. The move, though, is already escalating tensions between Meta and the Russian government. 

Russia's Investigative Committee said in a statement Friday that it's opened a criminal case against Meta for allegedly violating the criminal code of the Russian Federation that bars public calls for extremist activities and assistance in terrorist activities. 

"As part of the criminal case, the necessary investigative measures are being carried out to give a legal evaluation to actions of Andy Stone and other employees of the American corporation," the committee, which reports to Russian President Vladimir Putin, said in the statement. 

Facebook has been facing a greater number of calls to crack down more heavily on propaganda and misinformation. Last week, Russia said it was blocking the social network after Facebook started to make content from Russian state-controlled media tougher to find on its platform and tapped third-party fact-checkers to debunk false claims. On Friday, Russia's telecommunications regulator, Roskomnadzor, said in a statement that the Prosecutor General's Office of Russia demanded that the agency also restrict access to Meta-owned photo-and-video service Instagram. Roskomnadzor said the restrictions will take effect March 14 to allow users to transfer their photos and videos to other social networks and notify their followers and contacts. 

Nick Clegg, who leads global affairs at Meta, said in a statement Friday that the company's policies are "focused on protecting people's rights to speech as an expression of self-defense in reaction to a military invasion of their country." He added that Meta is applying the exemption only in Ukraine and that it made the decision because of "extraordinary and unprecedented circumstances."

"We have no quarrel with the Russian people. There is no change at all in our policies on hate speech as far as the Russian people are concerned. We will not tolerate Russophobia or any kind of discrimination, harassment or violence towards Russians on our platform," Clegg said.

The Russian Embassy in the US also responded to Thursday's decision, saying Meta's actions were equivalent to a declaration of information war against Russia, according to a report by Russian state-operated news agency Novosti. In a post on Twitter, the embassy called on US authorities to "stop the extremist activities of Meta."

For years, Facebook has also grappled with criticism that its rules are enforced unevenly. The company created a semi-independent oversight board to weigh in on its toughest content moderation decisions. 

Reuters, which first reported the policy change, said that in certain countries, including Russia, Ukraine and Poland, the social media giant is also allowing some posts that call for death to Russian President Vladimir Putin or Belarusian President Alexander Lukashenko. The changes also apply to Instagram. 

Citing internal emails, Reuters said that calls for death won't be allowed if they contain other targets or include "two indicators of credibility" such as the location or method of death. The posts must also be about the invasion of Ukraine. Calls for violence against Russian soldiers will also be allowed in Armenia, Azerbaijan, Estonia, Georgia, Hungary, Latvia, Lithuania, Poland, Romania, Russia, Slovakia and Ukraine, Reuters reported.
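The conditions Reuters describes combine into a single yes/no decision. The toy predicate below is purely illustrative (it is not Meta's actual implementation, and the parameter names are made up); it only shows how the reported country list, topic requirement, target restriction, and "two indicators of credibility" threshold would interact:

```python
# Countries where, per the Reuters report, the exemption reportedly applied.
ALLOWED_COUNTRIES = {
    "Armenia", "Azerbaijan", "Estonia", "Georgia", "Hungary", "Latvia",
    "Lithuania", "Poland", "Romania", "Russia", "Slovakia", "Ukraine",
}

def exemption_applies(country, about_invasion, names_other_targets,
                      credibility_indicators):
    """Return True if a post falls under the reported exemption:
    posted in a listed country, about the invasion of Ukraine,
    naming no other targets, and carrying fewer than two
    'indicators of credibility' (e.g. location or method)."""
    return (country in ALLOWED_COUNTRIES
            and about_invasion
            and not names_other_targets
            and credibility_indicators < 2)

print(exemption_applies("Poland", True, False, 0))  # True
print(exemption_applies("Poland", True, False, 2))  # False
```

The point of the sketch is that all four reported conditions are conjunctive: failing any one of them (wrong country, off-topic, extra targets, or two credibility indicators) would disqualify the post.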

Also Thursday, Facebook and Twitter removed posts from Russia's embassy in the UK over false claims surrounding the bombing of a maternity hospital in the Ukrainian city of Mariupol on Wednesday.

At least one child and two adults were killed at the hospital and another 17 were injured, Ukrainian officials have said.

Meta didn't immediately answer questions about how long it expects the exemption will be in place or the number of posts that may be impacted. 

Meta hasn't released data about how many Facebook and Instagram users are in Russia. App analytics firm Sensor Tower estimates that since 2014 Instagram has been installed 166 million times from Google Play and the Apple App Store in Russia. Facebook in Russia has an estimated 56.2 million installs. Sensor Tower says that based on that data, Russia is the fifth largest market for Instagram and the 20th largest market for Facebook.


Can Facebook Be Broken Up? What You Need To Know

Facebook CEO Mark Zuckerberg wields so much power that even one of the social network's co-founders thinks it's both "unprecedented" and "un-American."

Chris Hughes, who co-founded Facebook with Zuckerberg while they were students at Harvard, called for the social network to be broken up in an op-ed published Thursday by The New York Times. "I'm angry that his focus on growth led him to sacrifice security and civility for clicks," Hughes wrote, referring to Facebook's boss and major shareholder. "I'm disappointed in myself and the early Facebook team for not thinking more about how the News Feed algorithm could change our culture, influence elections and empower nationalist leaders."

Facebook's rapid growth has been fueled by acquisitions, including Instagram and WhatsApp, a messaging service. Critics and experts say Facebook simply purchased its competition, rather than innovating to meet the challenges they posed.

"Their whole business model is to identify potential threats and then buy them or beat them in some way," said Stephen Diamond, an associate professor of law at Santa Clara University School of Law.

And Facebook has been called out for not doing enough to combat election meddling, misinformation and hate speech. Its enormous power, critics argue, needs to be kept in check. Facebook doesn't want to spin off Instagram and WhatsApp.

Here's what you need to know:

Who wants Facebook broken up? Why?

Calls to break up Facebook aren't new. But it is startling to hear one of the company's co-founders call for such an extreme measure. Hughes argues that Zuckerberg holds so much power that even the company's board of directors can't keep him accountable. Zuckerberg controls around 60 percent of Facebook's voting shares, which means the board technically can't fire him even if he messes up.

Hughes isn't alone. Advocacy groups, including the Electronic Privacy Information Center, Color of Change and Common Sense Media, have previously asked the Federal Trade Commission, the agency that enforces antitrust law, to make Instagram and WhatsApp separate companies. A split would also make it easier for other social media companies to compete with Facebook, the organizations argue.

In addition, a group called Freedom from Facebook has called on the FTC to force Facebook to spin off its Messenger service too.

Sen. Elizabeth Warren, a Democrat from Massachusetts who's also a presidential candidate, is among the lawmakers who want to break up Facebook, as well as other tech giants, including Google and Amazon.

How would Facebook be broken up?

One way to break up Facebook would be for the federal government to file a lawsuit against the company, arguing it stifles competition. That could prompt a negotiation between the parties that could lead to Facebook agreeing to make itself smaller.

Another option would be for Congress to pass a law covering tech monopolies. Warren has proposed such a law, which would require tech platforms that take in $25 billion or more in sales to "structurally separate" their products. Amazon, for example, would have to spin off its house brand Amazon Basics. Warren said that if she won the presidential election her administration would also appoint regulators to unwind the mergers of Instagram and WhatsApp from Facebook.

What does Facebook think about the idea?

Facebook has pushed back, arguing that breaking up the company wouldn't hold the social network more accountable for its actions. Instead, Facebook has called for more internet regulation around harmful content, election integrity, privacy and data portability.

"Accountability of tech companies can only be achieved through the painstaking introduction of new rules for the internet," Nick Clegg, Facebook's vice president of global affairs and communications, said in a statement Thursday. The social network also said that having Instagram and WhatsApp under Facebook helps them fight spam, election meddling and crime. Facebook says it has plenty of competition, pointing to YouTube, Snapchat, iMessage and WeChat, among others.

Clegg touched on all those points in a Saturday editorial in The New York Times.

Have tech companies been broken up in the past?

Yes, but it's unusual. In 1974, the US Department of Justice filed an antitrust lawsuit against AT&T, but the matter wasn't settled until eight years later. The telephone company was required to spin off two-thirds of its assets into separate companies, according to a 1982 article from The Washington Post. The government has also tried to break up Microsoft: in 2000, a US federal judge ordered the tech giant to split into two companies. Microsoft appealed, and the decision was reversed.

What would this mean for users of Facebook?

Facebook is trying to integrate its messaging services so users of Facebook, Instagram and WhatsApp can send messages to one another without switching apps. Splitting up the companies might prevent that from happening.

Those who want the government to break Facebook up argue the move would fuel more competition among social media companies, which could mean more options for consumers. About 2.7 billion people use Facebook, Instagram, WhatsApp or Messenger every month.

Diamond said that breaking up Facebook could also lead to the company stepping up its privacy efforts to match its social media competitors.

What are the chances this happens?

The FTC declined to comment on whether it's looking to break up Facebook. But if history is any indication, it would be a rare move.

"I doubt there is sufficient political momentum to break up Facebook," Diamond said. "I'm skeptical, even though I think there might be good reasons to do it."

Originally published May 10, 5:40 a.m. PT.
Update, May 11: Adds mention of Facebook VP Nick Clegg's Saturday editorial in the Times.


Facebook Removed More Than 20 Million Posts For COVID-19 Misinformation

Facebook and its photo-service Instagram took down more than 20 million pieces of content containing COVID-19 misinformation between the start of the pandemic and June but couldn't say how prevalent these types of false claims are on the platforms.

The social network measures the prevalence of other types of content such as hate speech and adult nudity because it gives the company a sense of what offensive posts Facebook missed. Providing this metric for COVID-19 misinformation, the company said, is more complex. 

"When it comes to COVID, though, things are evolving even more quickly so it does make prevalence even more difficult to define and measure," said Guy Rosen, Facebook's vice president of integrity, during a press conference on Wednesday.

The action came about a month after the White House singled out Facebook, saying that about a dozen people were responsible for creating 65% of the vaccine misinformation on social media platforms -- and all of them remained active on the social networking giant's platforms.

Despite the action against the "disinformation dozen," the White House continued to criticize Facebook's response to misinformation.

"In the middle of a pandemic, being honest and transparent about the work that needs to be done to protect public health is absolutely vital, but Facebook still refuses to be straightforward about how much misinformation is circulating -- and being actively promoted -- on their platform," a White House spokesperson told CNN Business on Wednesday.

Facebook didn't immediately respond to a request for comment on the spokesperson's remarks.

Politicians, including US President Joe Biden, and advocacy groups have criticized social networks for failing to effectively combat the spread of COVID-19 and vaccine misinformation. Facebook partners with fact-checkers, directs people to authoritative information and labels misinformation. But researchers have questioned how effective those measures are in curbing the spread of false claims online.

"There will always be examples of things we missed and, with a scale of our enforcement, there will be examples of things that we take down by mistake," Rosen said. "There is no perfect here." 

Facebook said it has more than 65 criteria for false claims about COVID-19 and vaccines that would prompt it to remove posts from its platforms. The company has added to this list, including false claims that COVID-19 vaccines cause Alzheimer's and that being around vaccinated people could cause secondary side effects to others.

The social network said it removed more than 3,000 accounts, pages and groups for violating its rules against COVID-19 and vaccine misinformation. It has also displayed warnings on more than 190 million pieces of COVID-related content on Facebook that fact-checkers rated, and it displays these posts lower in people's News Feeds.

Facebook, which partnered with Carnegie Mellon University and the University of Maryland on a COVID-19 survey, said that vaccine hesitancy among US Facebook users has declined by 50%. Vaccine acceptance increased by 35% in France, 25% in Indonesia and 20% in Nigeria, the social network said.

The company also shared new data including what domains, links, pages and posts were the most widely viewed in the US on Facebook between April and June. Facebook counts a view when content appears on the News Feed, so the metric differs from engagement. The social network owns data analytics tool CrowdTangle, but executives have reportedly raised concerns about data that shows high engagement with right-wing sites. 

"The narrative that has emerged is quite simply wrong," Rosen said, noting that CrowdTangle includes data about interactions from a limited set of certain pages, groups and accounts.

Facebook said the most viewed domain was YouTube. The most viewed link was the Player Alumni Resources, and the top page was from Unicef. The most viewed post was an image from a motivational speaker that asked people about the first words they see in a block of letters.



Spotify's Joe Rogan Problem: Turns Out His Deal Might Be Worth $200 Million

Joe Rogan and his podcast, The Joe Rogan Experience, are at the center of growing concerns over COVID-19 misinformation and the host's use of racial slurs in dozens of episodes. This has put pressure on Spotify, the music streaming service that signed the comedian to an exclusivity deal in 2020.

In January, rock legend Neil Young pulled his music from Spotify over objections to false claims about COVID-19 vaccines on Rogan's popular podcast. Some other artists joined the boycott, but the backlash grew soon after when a compilation video of Rogan using a racial slur on numerous past episodes began circulating on social media.

Spotify CEO Daniel Ek confirmed that Rogan chose to remove multiple episodes of his popular podcast from the streaming service after the company's leadership discussed his use of "racially insensitive language," according to a memo sent to employees. 

Spotify continues to grapple with a dilemma that many internet giants like Facebook and YouTube face: balancing freedom of expression and effective moderation of objectionable content on their platforms. It views Rogan as a key component to its growth as an audio platform, and the comedian has said being able to express himself is one of the reasons he moved his podcast to the streaming service. The company paid the comedian a reported $200 million, double the amount previously thought, according to a report from The New York Times Thursday. 

Rogan posted an apology to Instagram on Feb. 5, saying he "wasn't trying to be racist" and agreeing that he shouldn't use such slurs, regardless of the context. Rogan said the backlash was a "political hit job" in an episode of his podcast posted on Feb. 8 but added that it was a "relief" to address comments he regrets making. 

Here's what you need to know about the backlash against Joe Rogan and Spotify. 

Why were episodes of Rogan's podcast removed? 

Videos of Rogan using racial slurs on past episodes went viral on social media at the end of January. This was layered on top of a growing musician boycott over concerns that Rogan's podcast serves as a platform for COVID misinformation. The hashtags #DeleteSpotify and #CancelSpotify began trending on Twitter as some people called for the removal of Rogan's podcast. A consumer poll from Feb. 1 found that 19% of Spotify subscribers said they had canceled or would cancel their service, according to a report from Variety. 

On Feb. 4, a fan-made website found that more than 100 episodes of Rogan's podcast were no longer available on Spotify. The website, JREMissing, uses Spotify's API to compare available episodes to a database of all episodes recorded. A total of 113 episodes of Rogan's podcast were shown to be removed: 42 of those removals happened last year, when Rogan moved his show to Spotify. The other 71 were deleted on Feb. 4 without explanation at the time.
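The comparison JREMissing performs boils down to a set difference between a previously saved catalog of episode IDs and the list the service currently returns. The sketch below illustrates just that core step; the episode IDs are made up, and in practice the "currently available" list would come from a live query to Spotify's Web API rather than a hard-coded list:

```python
def find_missing_episodes(known_catalog, currently_available):
    """Return IDs present in the saved catalog but absent from the
    currently available list, sorted for stable output."""
    return sorted(set(known_catalog) - set(currently_available))

# Hypothetical episode IDs: the first list stands in for a saved dump of
# every episode ever recorded, the second for what the API returns today.
catalog = ["ep1770", "ep1771", "ep1772", "ep1773"]
available = ["ep1771", "ep1773"]

print(find_missing_episodes(catalog, available))  # ['ep1770', 'ep1772']
```

Because the comparison runs against a snapshot taken before the removals, it can surface deletions even when the platform gives no notice, which is how the site spotted the 71 episodes that disappeared on Feb. 4.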

Ek sent a memo to Spotify employees about the development on Feb. 6. He confirmed that Rogan chose to remove multiple episodes of his podcast from the streaming service. This came after Spotify's leadership spoke to the comedian about his use of "racially insensitive language."

CNET couldn't confirm a link between the circulating videos and the episodes that were removed from Spotify.

"Some of Joe Rogan's comments [are] incredibly hurtful -- I want to make clear that they do not represent the values of this company," Ek wrote in the memo, which was provided to CNET by a company spokeswoman. "While I strongly condemn what Joe has said and I agree with his decision to remove past episodes from our platform, I realize some will want more. And I want to make one point very clear -- I do not believe that silencing Joe is the answer."

Ek went on to say the company would invest $100 million -- the earlier reported amount it paid to Rogan for exclusivity rights -- for the "licensing, development, and marketing of music (artists and songwriters) and audio content from historically marginalized groups. This will dramatically increase our efforts in these areas." 

Spotify didn't respond to a request for comment on whether it will increase the investment to $200 million to match the newly reported amount of Rogan's deal. 

What has Rogan said about this?

Rogan uploaded a video to his Instagram account on Feb. 5, the day after the podcast episodes were removed, in which he talked about his use of racial slurs and apologized for his actions. 

"I certainly wasn't trying to be racist," he said, "and I certainly would never want to offend someone for entertainment with something as stupid as racism." Rogan agreed he shouldn't use such slurs, regardless of the context.

In episode #1773 of his podcast, Rogan had comedian Akaash Singh on and started the show talking about the blowup, saying it was a "relief."

"This is a political hit job," he said on his podcast. "They're taking all this stuff I've ever said that's wrong and smooshing it all together. It's good because it makes me address some s*** that I really wish wasn't out there." 

How did this all get started?

In December, Rogan had two guests on his show who have been at the forefront of COVID misinformation. Dr. Peter McCullough, a cardiologist, and Dr. Robert Malone, who has described himself as the inventor of the mRNA vaccine, have used their credentials to try to give credibility to false conspiracy theories regarding the pandemic and vaccines. 

COVID-19 vaccines are highly effective at reducing hospitalizations and deaths, and other public health measures like masking and social distancing have helped slow the spread of the virus. The dangers of the illness are clear. To date, there have been more than 419 million cases of COVID-19 around the world and more than 5.8 million deaths, according to the coronavirus resource center at Johns Hopkins University.

On Jan. 12, 250 doctors, professors and researchers signed an open letter to Spotify calling out the streaming service for platforming COVID misinformation, in particular on Rogan's podcast. Since then, more than 1,000 additional medical professionals have signed the letter. 

After coming across the letter, singer-songwriter Young, who rose to fame in the 1960s and '70s, issued an ultimatum to Spotify on Jan. 24: either Rogan goes or his music goes. He removed his music Jan. 27, but some songs featuring Young with other artists are still on the platform.  

Other musicians joined Young in boycotting the service.

The controversy escalated when Grammy-winning singer India Arie joined the boycott, saying she found Rogan problematic not just for his interviews around COVID but also for his language around race. 

Is Spotify doing anything about COVID misinformation on its platform?

Following the musicians' protest over COVID misinformation, Ek responded in a blog post Jan. 30, saying his company doesn't want to be a "content censor" but will make sure that its rules are easy to find and that there are consequences for spreading misinformation. He acknowledged that Spotify hasn't been transparent about them, which led to questions about their application to serious issues including COVID-19.

"Based on the feedback over the last several weeks, it's become clear to me that we have an obligation to do more to provide balance and access to widely accepted information from the medical and scientific communities guiding us through this unprecedented time," Ek said.

Included in the post was a link to Spotify's platform rules detailing what content isn't allowed on the service. Regarding COVID misinformation, the rules specifically prohibit saying that COVID-19 isn't real, encouraging the consumption of bleach to cure diseases, saying vaccines lead to death and suggesting people get infected to build immunity. 

Ek also said the company is working on a content advisory for any podcast episode that talks about COVID. The advisory will guide listeners to the service's COVID-19 hub.

In a Feb. 2 company town hall, Ek told Spotify employees that Rogan's podcast was key to the future of Spotify, according to audio obtained by The Verge. 

"If we want even a shot at achieving our bold ambitions, it will mean having content on Spotify that many of us may not be proud to be associated with," Ek said during the town hall. "Not anything goes, but there will be opinions, ideas and beliefs that we disagree with strongly and even makes us angry or sad."

Spotify employees were reportedly disappointed by his remarks. Members of the company's board of directors were also reportedly not happy with the response, according to The New York Times. 

In an Instagram post Jan. 30, Rogan defended his choice to bring on guests like Malone but said he was happy for Spotify to add disclaimers to podcasts on what he called "controversial" topics. He added that if he could do anything differently, it would be to get experts with differing opinions on directly after "controversial ones." 

Who else had something to say about this? 

The White House chimed in on Spotify's move to add misinformation warnings to podcast episodes. In a Feb. 1 press briefing, press secretary Jen Psaki was asked if tech companies should go further than these disclaimers. 

"Our hope is that all major tech platforms, and all major news sources for that matter, be responsible and be vigilant to ensure the American people have access to accurate information on something as significant as COVID-19. That certainly includes Spotify," Psaki said. "So this disclaimer, it's a positive step, but we want every platform to continue doing more to call out misinformation and disinformation while also uplifting accurate information." 

Psaki also referred to Surgeon General Dr. Vivek Murthy's warning from July about the dangers of misinformation, calling it an "urgent threat."

The CEO of Rumble, a video streaming service known for being a hub of misinformation and conspiracy theories, said Feb. 7 that he'd offer Rogan $100 million over the course of four years if he brought his podcast to the company. 

"This is our chance to save the world," Chris Pavlovski said in a letter to Rogan posted to Twitter. "And yes, this is totally legit." 

During a question-and-answer portion of a recent comedy show, Rogan told a crowd he plans to stick with Spotify, according to a Feb. 8 report from The Hollywood Reporter.

Former President Donald Trump on Feb. 7 posted a message on his site saying Rogan shouldn't apologize for what he said. "How many ways can you say you're sorry," the former president wrote. 

