The deplatforming of President Trump

OSTN Staff

After years of mild admonishments, the tech world came out in force against President Trump this past week following the violent assault on the U.S. Capitol building in Washington, D.C. on Wednesday. From Twitter to PayPal, more than a dozen companies have placed unprecedented restrictions on the current occupant of the White House or banned him outright from using their services, and in some cases have done the same to his associates and supporters.

The news has been voluminous and continuous over the past few days, so here’s a recap of who took action when, and what might happen next.

Twitter: a permanent ban and a real-time attempt to shut down all possible account alternatives

Twitter has played a paramount role in the debate over how to moderate President Trump’s communications, given the president’s penchant for the platform and the nearly 90 million followers on his @realDonaldTrump account. In the past, Twitter has repeatedly warned the president, added labels related to election integrity and misinformation, and outright blocked the occasional tweet.

This week, however, Twitter’s patience seemed to be exhausted. Shortly after the riots at the Capitol on Wednesday, Twitter placed a prominent warning banner on the president’s tweet about the events and blocked retweets of that specific message. A few hours later, the company instituted a 12-hour ban on the president’s personal account.

At first, it looked like the situation would return to normal, with Twitter saying Thursday morning that it would reinstate the president’s account once he removed tweets the company considered violations of its policies against inciting violence. The president posted a tweet later on Thursday with a video attachment that was notably calmer than his recent fiery rhetoric, a video in which he also accepted the country’s election results for the first time.

Enormous external pressure on the platform, as well as internal demands from employees, kept the policy changing rapidly, though. Late Friday night, the company announced that it had decided to permanently ban the president, shutting down @realDonaldTrump. The company then played a game of whack-a-mole, blocking the president’s access to affiliated handles like @TeamTrump (his official campaign account) and the official presidential account @POTUS, and deleting individual tweets from the president. Twitter’s policies state that a banned user may not attempt to evade the ban by using a different account.

Twitter has also taken action against some of the president’s affiliates and broader audience, blocking Michael Flynn, a number of other Trump supporters, and a variety of QAnon figures.

With a new president on the horizon, the official @POTUS account will be handed to the incoming Biden administration, although Twitter reportedly intends to reset the account’s followers to zero, unlike its handover of the account from Obama to Trump in 2017.

As for Trump himself, a permanent ban from his most prominent platform raises the question: where will he take his braggadocio and invective next? So far, we haven’t seen the president move his activities to any alternative social network, but after the past few years (and, on Twitter, the past decade), it seems hard to believe the president will merely return to his golf course and quietly ride off into the sunset.

Snap: a quick lock after dampening the president’s audience for months

Snap locked the president’s account late Wednesday following the events on Capitol Hill, making it one of the quickest tech companies to react to the events unfolding in D.C. The lock prevents the president from posting new snaps to his followers on the platform, who currently number approximately two million. As far as TechCrunch knows, that lock remains in place, although the president’s official profile is still visible to users.

Following the death of George Floyd in Minneapolis and the concomitant Black Lives Matter protests, the company had announced back in June that it would remove the president’s account from its curated “Discover” tab, limiting its distribution and discoverability.

The president has never really effectively used the Snap platform, and with an indefinite ban in place, it looks unlikely he will find a home there in the future.

Facebook / Instagram: a short-to-medium ban with open questions about how long “indefinite” will last

Facebook, like Twitter, is one of the president’s most popular destinations for reaching his supporters, and the platform is also a locus for many of the political right’s most popular personalities. Its moderation actions have been heavily scrutinized by the press over the past few years, but the company had mostly avoided taking direct action against the president until this week.

On Wednesday, as rioters roamed the halls of Congress, Facebook pulled down a video from President Trump that it said promoted violence. Later Wednesday evening, that action extended into a 24-hour ban of the president’s account, which currently has 33 million likes, or followers. The company argued that the president had violated its policies multiple times, automatically triggering the one-day suspension. At the same time, Facebook (and Instagram) blocked a popular trending hashtag related to the Capitol riots.

On Thursday morning, Mark Zuckerberg, in a personal post on his own platform, announced an “indefinite” suspension for the president, with a minimum duration of two weeks. That timing would neatly extend the suspension through the inauguration of president-elect Biden, who is to assume the presidency at noon on January 20th.

What will happen after the inauguration? Right now, we don’t know. The president’s account is suspended but not deactivated, which means he cannot post new material to his page, but the page remains visible to Facebook users. The company could lift the suspension once the transition of power is complete, or it could extend the ban longer-term. Given the president’s prominence on the platform and the social network’s heavy popularity among his supporters, Facebook is in a more intense bind than most, caught between banning content it deems offensive and retaining users important to its bottom line.

Shopify / PayPal: ecommerce platforms won’t sell official Trump merchandise for the time being

It’s not just social networks that are cutting the president off from his audience; ecommerce companies are also moderating their platforms against him. On Thursday, Shopify announced that it was removing the storefronts for both the Trump campaign and Trump’s personal brand.

That’s an evolution in policy for the company, which years ago said that it would not moderate its platform but in recent years has removed some controversial stores, such as several right-wing shops in 2018.

PayPal, meanwhile, has been deactivating the accounts of some groups of Trump supporters this week who were using the money-transfer service to coordinate payments underwriting the rioters’ actions on Capitol Hill. PayPal has increasingly banned political accounts in recent years, cutting off a far-right activist in 2019 and a spate of far-right organizations in the wake of the violent protests in Charlottesville in 2017. As far as TechCrunch can tell, these bans have not yet extended directly to the president himself.

Given the president’s well-known personal brand and his penchant for product tie-ins before taking office, it’s a major open question how these two platforms and others in ecommerce will respond to Trump once he leaves office in two weeks. Will the president go back to shilling steaks, water and cologne? And will he need an ecommerce venue to sell his wares online? Much will depend on Trump’s next goals and whether he stays focused on politics or returns to his more commercial pursuits.

Google removes Parler from the Google Play Store, while Apple mulls a removal as well

For supporters of Trump and others concerned about the moderation actions of Facebook and other platforms, Parler has taken the lead as an alternative social network for this audience. Right now, the app is number one in the App Store in the United States, ahead of encrypted and secure messaging app Signal, which is at number four and got a massive endorsement from Elon Musk this week.

Parler’s opportunistic growth around the riots on Capitol Hill, though, has run into a very real barrier: the two tech companies that run the dominant mobile app stores in the United States.

Google announced Friday evening that it would remove the Parler app from its store, citing the social network’s lack of moderation and content-filtering capabilities. The app’s page remained down as this article went to press. The ban means that new users won’t be able to install the app from the Play Store; existing users who already have Parler installed, however, will be able to continue using it.

Meanwhile, BuzzFeed reports that Apple has sent a 24-hour takedown notice to Parler’s developers, saying that it would mirror Google’s action if the app didn’t immediately filter content that endangers safety. As of now, Parler remains available in the App Store, but if that timeline holds, the app could be taken down later on Saturday.

Given the complexities of content moderation, including the need to hire content moderators en masse, it seems highly unlikely that Parler could respond to these requests in any short period of time. What happens next to the app and the president’s supporters long-term is, right now, anyone’s guess.

Discord / Twitch / YouTube / Reddit / TikTok: the rest of the socials don’t want to be social with President Trump anymore

Finally, let’s head over to the rest of the social networking world, where Trump is just as unpopular as he is at Facebook and Twitter HQ these days. Companies widely blocked the president from accessing their sites, and they also took action against affiliated groups.

Google-owned YouTube announced Thursday that it would start handing out “strikes” against channels — including President Trump’s — that post election misinformation. In the past, videos with election misinformation would have a warning label attached, but the channel itself didn’t face any consequences. In December, the company changed that policy to include the outright removal of videos purveying election misinformation.

This week’s policy change is an escalation of the company’s previous approach and will result in progressively longer temporary suspensions for each additional strike a channel receives. Those strikes could eventually result in a permanent ban for a YouTube channel if they accumulate within a set period of time. That’s precisely what happened to Steve Bannon’s channel, which was permanently banned late Friday afternoon for repeated violations of YouTube’s policies. Meanwhile, President Trump’s official channel, which has fewer than 3 million subscribers, is currently still available for viewing on the platform.

Outside YouTube, Twitch followed a policy similar to Facebook’s, announcing Thursday morning that it would ban the president “indefinitely,” at least through the inauguration on January 20th. The president has a limited audience of just about 151,000 followers on the popular streaming platform, making it among the least important of his social media accounts.

The president’s supporters are also seeing their groups removed from popular tech platforms. On Friday, Reddit announced that it would ban the subreddit r/DonaldTrump, which had become one of a number of unofficial communities on the platform where the president’s most ardent supporters hung out. The social network had previously removed the controversial subreddit r/The_Donald back in June. Discord on Friday shut down a server related to that banned subreddit, citing the server’s “overt connection to an online forum used to incite violence.”

Lastly, TikTok announced on Thursday that it was limiting the spread of some information related to the Capitol riots, including redirecting hashtags and removing violent content as well as the president’s own video message to supporters. The president does not have a TikTok account, and therefore, most of the company’s actions are focused on his supporters and broader content surrounding the situation on Capitol Hill this week.

 
