Facebook removes 3.2 billion fake accounts, millions of child abuse posts

(Reuters) – Facebook Inc <FB.O> removed 3.2 billion fake accounts between April and September this year, along with millions of posts depicting child abuse and suicide, according to its latest content moderation report released on Wednesday.

That more than doubles the number of fake accounts taken down during the same period last year, when 1.55 billion accounts were removed, according to the report.

The world’s biggest social network also disclosed for the first time how many posts it removed from popular photo-sharing app Instagram, which disinformation researchers have identified as a growing area of concern for fake news.

Proactive detection of violating content was lower across all categories on Instagram than on Facebook’s flagship app, where the company initially implemented many of its detection tools, the company said in its fourth content moderation report.

For example, the company said it proactively detected content affiliated with terrorist organizations 98.5% of the time on Facebook and 92.2% of the time on Instagram.

It removed more than 11.6 million pieces of content depicting child nudity and sexual exploitation of children on Facebook and 754,000 pieces on Instagram during the third quarter.

Facebook also added data on actions it took around content involving self-harm for the first time in the report. It said it had removed about 2.5 million posts in the third quarter that depicted or encouraged suicide or self-injury.

The company also removed about 4.4 million pieces involving drug sales during the quarter, it said in a blog post.

(Reporting by Akanksha Rana in Bengaluru and Katie Paul in San Francisco; Editing by Maju Samuel and Lisa Shumaker)

Twitter, Facebook accuse China of using fake accounts to undermine Hong Kong protests

FILE PHOTO: A 3-D printed Facebook logo is seen in front of displayed binary code in this illustration picture, June 18, 2019. REUTERS/Dado Ruvic/Illustration/File Photo

By Katie Paul and Elizabeth Culliford

(Reuters) – Twitter Inc and Facebook Inc said on Monday they had dismantled a state-backed information operation originating in mainland China that sought to undermine protests in Hong Kong.

Twitter said it suspended 936 accounts and the operations appeared to be a coordinated state-backed effort originating in China. It said these accounts were just the most active portions of this campaign and that a “larger, spammy network” of approximately 200,000 accounts had been proactively suspended before they were substantially active.

Facebook said it had removed accounts and pages from a small network after a tip from Twitter. It said that its investigation found links to individuals associated with the Chinese government.

Social media companies are under pressure to stem illicit political influence campaigns online ahead of the U.S. election in November 2020. A 22-month U.S. investigation concluded Russia interfered in a “sweeping and systematic fashion” in the 2016 U.S. election to help Donald Trump win the presidency.

The Chinese embassy in Washington and the U.S. State Department were not immediately available to comment.

The Hong Kong protests, which have presented one of the biggest challenges for Chinese President Xi Jinping since he came to power in 2012, began in June as opposition to a now-suspended bill that would allow suspects to be extradited to mainland China for trial in Communist Party-controlled courts. They have since swelled into wider calls for democracy.

Twitter in a blog post said the accounts undermined the legitimacy and political positions of the protest movement in Hong Kong.

Examples of posts provided by Twitter included a tweet from a user with photos of protesters storming Hong Kong’s Legislative Council building, which asked: “Are these people who smashed the Legco crazy or taking benefits from the bad guys? It’s a complete violent behavior, we don’t want you radical people in Hong Kong. Just get out of here!”

In examples provided by Facebook, one post called the protesters “Hong Kong cockroaches” and claimed that they “refused to show their faces.”

In a separate statement, Twitter said it was updating its advertising policy and would not accept advertising from state-controlled news media entities going forward.

Alphabet Inc’s YouTube video service told Reuters in June that state-owned media companies maintained the same privileges as any other user, including the ability to run ads in accordance with its rules. YouTube did not immediately respond to a request for comment on Monday on whether it had detected inauthentic content related to protests in Hong Kong.

(Reporting by Katie Paul in Aspen, Colorado, and Elizabeth Culliford in San Francisco; Additional reporting by Sayanti Chakraborty in Bengaluru; Editing by Lisa Shumaker)

Facebook boots 115 accounts on eve of U.S. election after tip

A voter fills out her ballot at an early voting polling station in Milwaukee, Wisconsin, U.S. November 4, 2018. REUTERS/Nick Oxford

By Paresh Dave and Philip George

SAN FRANCISCO (Reuters) – Facebook Inc blocked about 115 user accounts after U.S. authorities tipped it off to suspicious behavior that may be linked to a foreign entity, the company said in a blog post on Monday, hours before U.S. voters head to the polls.

The social network said it needed to do further analysis to decide if the accounts are linked to Russia’s Internet Research Agency or another group. The United States has accused the Russian government body of meddling in U.S. politics with social media posts meant to spread misinformation and sow discord.

Eighty-five of the removed accounts were posting in English on Facebook’s Instagram service, and 30 more were on Facebook and associated with pages in French and Russian, the post said.

Some accounts “were focused on celebrities” and others on “political debate,” it added.

The tip came from U.S. law enforcement on Sunday night, Nathaniel Gleicher, Facebook’s head of cybersecurity policy, wrote in the post.

The company announced its actions earlier in its investigation than typical “given that we are only one day away from important elections in the U.S.,” he added.

This year’s contest has been portrayed as crucial by both Republicans and Democrats because both chambers of Congress, and the accompanying ability to pass or reject President Donald Trump’s agenda, are up for grabs.

“Americans should be aware that foreign actors, and Russia in particular, continue to try to influence public sentiment and voter perceptions through actions intended to sow discord,” including through social media, federal authorities said in a statement on Monday.

Social media companies say they are now more vigilant against foreign and other potential election interference after finding themselves unprepared to tackle such activity in the U.S. presidential election two years ago.

(This story corrects headline, paragraph 5 to show tip came from U.S. law enforcement, not FBI)

(Reporting by Philip George in Bengaluru and Paresh Dave in San Francisco; Editing by Gopakumar Warrier and Clarence Fernandez)

Facebook removes fake accounts tied to Iran that lured over 1 million followers

FILE PHOTO: A woman looks at the Facebook logo on an iPad in this photo illustration taken June 3, 2018. REUTERS/Regis Duvignau/Illustration/File Photo

By Christopher Bing and Munsif Vengattil

WASHINGTON (Reuters) – Facebook Inc said on Friday it had deleted accounts originating in Iran that attracted more than 1 million U.S. and British followers, its latest effort to combat disinformation activity on its platform.

Social media companies are struggling to stop attempts by people inside and outside the United States to spread false information on their platforms with goals ranging from destabilizing elections by stoking hardline positions to supporting propaganda campaigns.

The fake Facebook accounts originating in Iran mostly targeted American liberals, according to the Atlantic Council’s Digital Forensic Research Lab, a think tank that works with Facebook to study propaganda online.

Facebook said it removed 82 pages, groups and accounts on Facebook and Instagram that represented themselves as being American or British citizens, then posted on “politically charged” topics such as race relations, opposition to U.S. President Donald Trump and immigration, Facebook’s head of cybersecurity policy, Nathaniel Gleicher, said in a blog post.

In total, the removed accounts attracted more than 1 million followers. The Iran-linked posts were amplified through less than $100 in advertising on Facebook and Instagram, Facebook said.

While the accounts originated in Iran, it was unclear if they were linked to the Tehran government, according to Facebook, which shared the information with researchers, other technology companies and the British and U.S. governments.

The Iranian U.N. mission did not immediately respond to a request for comment.

The action follows takedowns in August by Facebook, Twitter Inc and Alphabet Inc of hundreds of accounts linked to Iranian propaganda.

The latest operation was more sophisticated in some instances, making it difficult to identify, Gleicher said during a press conference phone call on Friday.

Although most of the accounts and pages had existed only since earlier this year, they attracted more followers than the accounts removed in August, some of which dated back to 2013. The previously suspended Iranian accounts and pages garnered roughly 983,000 followers before being removed.

“It looks like the intention was to embed in highly active and engaged communities by posting inflammatory content, and then insert messaging on Saudi and Israel which amplified the Iranian government’s narrative,” said Ben Nimmo, an information defense fellow with the Digital Forensic Research Lab.

“Most of the posts concerned divisive issues in the U.S., and posted a liberal or progressive viewpoint, especially on race relations and police violence,” Nimmo said.

Social media companies have increasingly targeted foreign interference on their platforms following criticism that they did not do enough to detect, halt and disclose Russian efforts to use their platforms to influence the outcome of the 2016 U.S. presidential race.

Iran and Russia have denied allegations that they have used social media platforms to launch disinformation campaigns.

(Reporting by Chris Bing in Washington and Munsif Vengattil in Bengaluru, additional reporting by Jack Stubbs in London and Michelle Nichols in New York; Editing by Steve Orlofsky, Bernadette Baum and Susan Thomas)

Facebook says it uncovers new meddling before 2018 U.S. elections

FILE PHOTO: Facebook logo is seen at a start-up companies gathering at Paris' Station F in Paris, France on January 17, 2017. REUTERS/Philippe Wojazer/File Photo

By Joseph Menn and Paresh Dave

(Reuters) – Facebook Inc has identified a new coordinated political influence campaign to mislead users and organize rallies ahead of November’s U.S. congressional elections, taking down dozens of fake accounts on its site, the company said on Tuesday.

A Russian propaganda arm sought to interfere in the 2016 U.S. election by posting and buying ads on Facebook, according to the company and U.S. intelligence agencies. Moscow has denied involvement.

Facebook on Tuesday said it had removed 32 pages and accounts from Facebook and Instagram, part of an effort to combat foreign meddling in U.S. elections, attempts that lawmakers have called dangerous for democracy.

The company said it was still in the early stages of its investigation and did not yet know who may be behind the influence campaign for the 2018 elections that will determine whether the Republican Party keeps control of Congress.

Chief Operating Officer Sheryl Sandberg said on a call with reporters that the attempts to manipulate public opinion would likely become more sophisticated to evade Facebook’s scrutiny, calling it an “arms race.”

“This kind of behavior is not allowed on Facebook because we don’t want people or organizations creating networks of accounts to mislead others about who they are, or what they’re doing,” the company said in a blogpost.

More than 290,000 accounts followed at least one of the pages, and about $11,000 had been spent on about 150 ads, Facebook said. The pages had created about 30 events since May 2017.

Facebook for months has been on the defensive about influence activity on its site and concerns over user privacy tied to longstanding agreements with developers that allowed them access to private user data.

DIVISIVE ISSUES

Facebook identified influence activity around at least two issues, including a counter-protest to a “Unite the Right II” rally set next week in Washington. The other was the #AbolishICE social media campaign aimed at the U.S. Immigration and Customs Enforcement agency.

In the blog post, Facebook said it was revealing the influence effort now in part because of the rally. A previous event last year in Charlottesville, Virginia, led to violence by white supremacists.

Facebook said it would tell users who had expressed interest in the counter-protest what action it had taken and why.

Facebook officials on a call with reporters said that one known account from Russia’s Internet Research Agency was a co-administrator of one of the fake pages for seven minutes, but the company did not believe that was enough evidence to attribute the campaign to the Russian government.

The company previously had said 126 million Americans may have seen Russian-backed political content on Facebook over a two-year period, and that 16 million may have been exposed to Russian information on Instagram.

Adam Schiff, the top Democrat on the U.S. House of Representatives Intelligence Committee, in a statement urged Facebook to move against foreign groups trying to sway American voters and to warn legitimate users that such activity, as seen in 2016, is recurring this year.

“Today’s announcement from Facebook demonstrates what we’ve long feared: that malicious foreign actors bearing the hallmarks of previously-identified Russian influence campaigns continue to abuse and weaponize social media platforms to influence the U.S. electorate,” Schiff said.

Senate Intelligence Committee Chairman Richard Burr, a Republican, said in a statement there would be a hearing on Wednesday about the online threat to U.S. election security.

Burr said the goal of influence operations “is to sow discord, distrust, and division in an attempt to undermine public faith in our institutions and our political system. The Russians want a weak America.”

Washington imposed punitive sanctions on Russia following U.S. intelligence agency conclusions that Moscow interfered to undermine the 2016 U.S. elections, one of the reasons U.S.-Russian relations are at a post-Cold War low. Both U.S. President Donald Trump and Russian President Vladimir Putin have said, however, that they want to improve ties between the two nuclear powers.

Facebook disclosed in September that Russians under fake names had used the social network to try to influence U.S. voters in the months before and after the 2016 election, writing about divisive issues, setting up events and buying ads.

U.S. intelligence agencies said Russian state operators ran the campaign combining fake social media posts and hacking into Democratic Party networks, eventually becoming an effort to help Republican candidate Trump, who scored a surprise victory over Democrat Hillary Clinton.

Over the past several months, the company has taken steps meant to reassure U.S. and European lawmakers that further regulation is unnecessary. Chief Executive Officer Mark Zuckerberg says the company has 20,000 people working to police and protect the site.

Costs associated with that effort are part of the reason Facebook said last week that it expects its profit margins to decline, a warning that sent shares tumbling about 25 percent, the biggest one-day loss of market cap in U.S. stock market history. Shares of Facebook were up about 1.5 percent in Tuesday’s midafternoon trading, part of a broader tech rebound.

(Joseph Menn and Paresh Dave in San Francisco; additional reporting by Munsif Vengattil in Bengaluru, Kevin Drawbaugh in Washington; writing by Peter Henderson; Editing by Saumyadeb Chakrabarty and Grant McCool)