Instagram has removed posts and blocked hashtags about one of Islam’s holiest mosques because its content moderation systems mistakenly associated the site with a designation reserved for terrorist organizations. The mistake is the latest content moderation failure by Instagram and its parent company, Facebook, which users around the world have accused of censoring content related to Israeli violence against Palestinians.
The error, which upset employees flagged internally on Tuesday, caused Instagram to remove or block posts carrying hashtags referencing the Al-Aqsa Mosque, the third-holiest site in Islam. Since Friday, the mosque has been the scene of clashes between Israeli police and Palestinians, many of whom had come to pray during the last days of Ramadan.
In an effort to draw attention to the violence, Instagram users posted videos tagged with the hashtag #Alaqsa or its Arabic equivalents #Aqsa and #الأقصى, only to have their posts removed or hidden from search results. Some notices told users their posts had been removed because they were associated with “violent or dangerous organizations.” When employees of Facebook, which owns Instagram, learned about the removals and the reasoning behind them, some filed internal complaints.
In one case, an employee noticed that Instagram had removed an infographic describing the situation at Al-Aqsa on the grounds that it involved a “violent or terrorist organization.” After the employee filed a complaint, they were told in an internal post that the image had been taken down because of a reference to “Al-Aqsa,” a designated entity under Facebook’s “dangerous individuals and organizations” policy. The content was eventually restored.
“These two mistakes, and many others, are completely unacceptable,” a Facebook employee wrote on an internal communication platform on Tuesday. “Al-Aqsa is among the holiest places in Islam and a central aspect of the faith of about 1.1 billion people.”
The censorship of posts about Al-Aqsa comes at a time of extreme tension and violence in the region. Since the fighting began last week, more than five dozen Palestinians have been killed, including more than a dozen children, as well as six Israelis, and more than 30,000 people have been injured. As people have used Instagram and Facebook to spread information from the ground, from the forced eviction of Palestinians from neighborhoods in East Jerusalem to the violence at Al-Aqsa, some have seen their posts blocked or removed.
For critics, and even some employees, Facebook’s recent content moderation failures are evidence of the American company’s lack of understanding of and resources for the region, and show how careless errors can have an outsize impact when its products are used by more than 3 billion people worldwide.
Facebook had previously told the Middle Eastern news outlet The National that posts with the Al-Aqsa hashtag had been “restricted in error,” but an internal post obtained by BuzzFeed News on Wednesday added that the content had been removed because “Al-Aqsa” also appears in the name of a US-designated organization.
A Facebook spokesperson declined to comment beyond what was in Wednesday’s internal post.
Last week, Palestinian Instagram users also complained that their stories, the platform’s ephemeral videos and photos that disappear after 24 hours, were vanishing when they concerned the clashes. On Friday, the company blamed the problem on a bug in the social network that affected users sharing stories around the world.
Those mistakes have weighed on some Facebook workers. In a post over the weekend, one employee wrote to an internal group that “the external perception is that FB is silencing political speech and apologizing later.”
“Some of these cases are human review errors and others are automated, and I’m not sure which is more common, but why can’t decision-makers consult local [Middle East and North Africa] expertise before deciding to remove things like public accounts, sensitive hashtags, or political content,” they wrote, before sharing screenshots from various users who complained that their Instagram posts had been censored. They also noted that Instagram users around the world had launched a campaign to leave negative ratings for the Instagram app on the Google Play Store.
In response, Guy Rosen, Facebook’s vice president of integrity, wrote a day later that the company’s teams were triaging issues and lifting blocks as soon as problems came to light.
That effort, however, did not stop the company from continuing to remove content about the Al-Aqsa Mosque, where clashes began last Friday when Israeli police attacked Palestinians who had gathered to observe the final Friday of the Muslim holy month of Ramadan. Complaints about the censorship of content with Al-Aqsa hashtags continued into Tuesday, when a concerned employee flagged the incorrect removal of a post.
While there is an armed Palestinian coalition in the West Bank known as the Al-Aqsa Martyrs Brigade, which the United States and the European Union consider a terrorist entity, as well as other designated organizations such as the Al-Aqsa Foundation, which the US government regards as part of a terrorist support network, a critical Facebook employee said that was no excuse for censoring hashtags referring to the Al-Aqsa Mosque.
“If there was a select group of troublemakers in Washington and posts that merely mentioned the word Washington were taken down, it would be completely unacceptable,” the employee wrote. “I really want to emphasize that this part of our user base already feels marginalized and censored. At the end of the day, whether the cause is technical or product-based, our users will not give us the benefit of the doubt.”
On Wednesday, an employee on the company’s dangerous individuals and organizations policy team wrote in an internal post that the term Al-Aqsa (الأقصى) “does not and should not violate our policy.”
“As many of you have rightly noted, merely sharing a name with a designated organization does not make a place and that organization the same,” they wrote. “Our policies do not call for the removal of any person, place, or thing that only shares a name with a designated entity, so any removal based solely on the name of the mosque is an enforcement error and should never occur under our policy.”
Others were less convinced by Facebook’s internal explanation. Ashraf Zeitoon, who served as Facebook’s head of policy for the Middle East and North Africa region from 2011 to mid-2017, noted that the company had hired some of the world’s top terrorism experts, who could certainly distinguish the mosque from the Al-Aqsa Martyrs Brigade.
“Confusing the mosque with a terrorist organization because the two share a name is a lame excuse for them,” he said, noting that he had been involved in drafting how the company designates terrorist groups and handles their content. “They’re more qualified and more capable than that.”
Zeitoon cited an internal fear at Facebook of antagonizing Israeli interests, as well as over-reporting of Al-Aqsa content, as possible reasons the videos and photos were removed.
In response, a Facebook spokesperson told BuzzFeed News that Al-Aqsa content was restricted due to human error, not because of any government request.
The removal and blocking of some Palestinian content has prompted employees of the social network to speak up internally. Ahead of a regular companywide meeting on Thursday, led by CEO Mark Zuckerberg, some workers began upvoting a question that asked, “Our integrity systems are failing marginalized groups (see: Palestine, BLM, Indigenous Women). What can we do about it?”
The question ranked low on the list of top questions, behind at least three different questions about Facebook’s work-from-home policy and one asking whether Mark Zuckerberg would ever host Saturday Night Live, the variety show on which Tesla CEO Elon Musk appeared this past weekend.
In another question, an employee asked whether Facebook would move its regional office from Tel Aviv, which some Palestinian-American employees cannot visit because of Israeli restrictions. Noting that Human Rights Watch has accused Israel of committing apartheid, they asked whether Facebook would ever reconsider its presence in Israeli cities.
A Facebook spokesperson declined to comment.