Friday, September 17, 2021

Facebook cracks down on real-user networks over harmful activities

Facebook is taking a more aggressive approach to shutting down coordinated groups of real-user accounts engaged in harmful activities on its platform, using the same strategies its security teams deploy against campaigns built on fake accounts, the company told Reuters.

The new approach, reported here for the first time, uses the tactics Facebook's security teams have applied to shut down networks engaged in influence operations that use fake accounts to manipulate public opinion, such as Russian troll farms.

The change could have significant implications for how Facebook handles coordinated political and other movements that violate its rules. It comes at a time when Facebook's handling of abuses on its platforms is under intense scrutiny from global lawmakers and civil rights groups.

Facebook said it will now apply the same network-level approach to groups of real accounts that systematically violate its rules, whether through mass reporting, in which many users falsely report a target's account or content to get it shut down, or through brigading, a form of online harassment in which users coordinate to target an individual with mass comments or posts.

Facebook announced on Thursday that it would take the same approach to campaigns by real users that cause "coordinated harm" on its platforms, and said it had taken down the German Querdenken movement, which opposes COVID restrictions.

A Facebook spokeswoman said the expansion is still in its early stages. It means the company's security teams can identify the core movements driving such behavior and take more sweeping action than removing individual posts or accounts.

Facebook's security experts, who are separate from its content moderators, handle threats from adversaries and began cracking down on influence operations using fake accounts in 2017, after the 2016 US election, in which US intelligence officials concluded that Russia had used Facebook as part of a cyber campaign. Moscow has denied the claim.

Facebook calls this prohibited activity by groups of fake accounts "coordinated inauthentic behavior" (CIB), and its security teams began announcing sweeping takedowns in monthly reports. The security teams also handle certain specific threats, including fraud or cyber-espionage networks and overt influence operations such as some state media campaigns.

Sources said the company's teams had been debating how to intervene at the network level against large movements of real-user accounts that violate its rules.

In July, Reuters reported on the Vietnamese army's online information warfare unit, whose members engaged in mass reporting of accounts to Facebook while often using their real names. Facebook removed some accounts over those mass reporting attempts.

Facebook is under increasing pressure from global regulators, lawmakers, and its own employees to stop wide-ranging abuses of its services. Others have criticized the company over censorship, anti-conservative bias, and inconsistent enforcement.

Expanding Facebook's network-disruption model to target authentic accounts raises further questions about how the change will affect public discourse, online movements, and campaign tactics across the political spectrum.

Evelyn Douek, a Harvard Law lecturer who studies platform governance, said that problematic behavior often looks very similar to social movements, and that the approach will hinge on the definition of harm, even though people's definitions of harm can be highly subjective and ambiguous.

There were many high-profile examples of coordinated activity around last year's US election, including teens and K-pop fans using TikTok to sabotage a rally in Tulsa, Oklahoma for former President Donald Trump, and political campaigns paying online meme-makers.
