Sulli Deals and Bulli Bai | How much responsibility should GitHub take?

Panic, anger, disbelief and resignation. Then, just constant anger.

Hana Mohsin Khan, a commercial pilot, felt all of this after learning she had been “auctioned” in July 2021 on the Sulli Deals app, hosted on the US-based software collaboration and hosting platform GitHub. “I used to be a really happy person. It all happened because I am a Muslim and an opinionated woman,” she told Moneycontrol.

After six months of agony, just when she was ready to leave it all behind and start anew in 2022, another app, Bulli Bai, surfaced, and on January 1 more than 100 women were put up for “auction”. “I was not a part of Bulli Bai, but it brings back memories and the feeling that it will never stop. There is no progress, no hope,” she said.

What really bothers her, and many other women and digital rights activists, is the total lack of accountability from GitHub, where the two apps were hosted. “Platforms like GitHub need to take responsibility. We need more security and scrutiny, and now is the time to do something about it.”

The recent incident has put the spotlight on moderation, or the lack of it, on Microsoft-owned GitHub, and on the need for transparency in how such issues are addressed.

GitHub

Founded in 2008 in the US, GitHub has been dubbed a social network for developers. Millions of developers and hundreds of organizations use the platform to host software projects.

There are around 73 million developers on GitHub, of whom 16 million joined in 2021 alone. The platform has around 5.8 million users in India, one of its fastest-growing markets.

Hundreds of developers upload applications every day, editing and collaborating with peers around the world. As a Bengaluru-based security researcher points out, most of the content on the platform is code written in Java and other programming languages, not pictures or text. The source code behind many of these applications is public, and this makes the platform vulnerable to moderation issues, not only in India but globally.

For example, web developer Sami detailed on Twitter how the Bulli Bai app used the same source code as Sulli Deals. “Whoever created the GitHub page ‘SullyDeals’ is the same person who has now created the GitHub page ‘Bullibai’. It looks like they have rewritten the text on the page, but it is the same code; there is even a function named ‘Sulli’ in the code.”
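The kind of reuse Sami describes is easy to check mechanically. As a rough illustration only (the snippets below are invented placeholders, not the apps’ actual code, and GitHub has not disclosed whether it runs any such comparison), Python’s standard-library `difflib` can score how similar two source files are:

```python
# Hypothetical sketch: scoring the similarity of two source files.
# The file contents are invented stand-ins, not the real apps' code.
from difflib import SequenceMatcher

repo_a = """
function sulli() { loadProfiles(); renderPage(); }
"""

repo_b = """
// page text rewritten, but the logic is unchanged
function sulli() { loadProfiles(); renderPage(); }
"""

# ratio() returns a value between 0 and 1; near-identical files score close to 1.
similarity = SequenceMatcher(None, repo_a, repo_b).ratio()
print(f"similarity: {similarity:.2f}")
```

A score close to 1 for two independently submitted projects is exactly the kind of signal that could flag a resurfaced app for human review.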

Raj Pagaria, a technology lawyer and partner at The Cyber Blog India, says there is a big question over the platform’s efforts to ensure that incidents that violate its community guidelines are not repeated, as in the case of Sulli Deals and Bulli Bai.

“When it happens once, it can be understood. But when it happens twice, there is a problem. The fact is that the platform did not put in enough effort to ensure that there was no repetition,” he said.

Not once, not twice

In July 2021, around 80 women were auctioned on Sulli Deals, hosted on GitHub.

In the following days, FIRs were registered in Uttar Pradesh and New Delhi. But so far, no progress has been made in the investigation. Khan, who filed her complaint in Noida, said that even after repeated follow-ups no action was taken, and many victims gave up.

Getting support from GitHub in criminal proceedings is also a challenge.

Anoushka Jain, associate counsel for surveillance and transparency at the Internet Freedom Foundation, explained that any information sought from GitHub for criminal proceedings has to be obtained through the MLAT between India and the US. A Mutual Legal Assistance Treaty (MLAT) is an agreement between two or more countries to collect and exchange information in order to enforce public or criminal laws.

Just as things were calming down, Bulli Bai, another app hosted on GitHub, surfaced on January 1, with photos of 100 women being put up for “auction”. The app was immediately taken down.

However, unlike last time, the response has been huge. The police acted swiftly, politicians stepped in, and an FIR was registered in Mumbai. So far, four arrests have been made, all of them students: an engineering student from Bengaluru, two from Uttarakhand and a man from Assam.

While prompt action by law enforcement has been encouraging, many pointed out that it does not address the core issue, i.e., how GitHub addresses such issues.

A question of moderation

“If you look at platforms like GitHub, they are quite large in size and for any large platform, content moderation is a struggle,” Pagaria said.

“But even at that size, one would expect the platform to do something to prevent a Sulli Deals from happening a second time. But it didn’t; only six months later, it came up again,” he said.

Padmini Ray Murray, founder of tech and design collective Design Beku, told Moneycontrol that when two apps use the same code, there should be a check on the platform so that another such app does not get created. “But they haven’t done anything, or if they did, we don’t know what it is. More transparency is needed,” she said.

Akanksha S Srivastava, founder of Akancha Against Harassment, which works with law enforcement on cyberbullying, said: “For the platform, simply blocking them is not enough. They have a responsibility and they should take preventive action. Also, the response from GitHub needs to be better.”

According to experts, the time has come for platforms to have better moderation tools, and be more transparent about how they address these concerns.

Content moderation

Content moderation in general is a slippery slope. And unlike social media platforms, which are required to comply with Indian rules and regulations, GitHub falls into a thornier area.

Under the new IT rules, all significant social media intermediaries must appoint a grievance officer, a chief compliance officer and a nodal contact person in India. From Facebook to local social media apps, they all now have officers in the country to cooperate with the government and for users to approach with complaints.

But GitHub has no such officers in India, and victims have to take the legal route to get information from the platform.

Recently, law student Amar Banka sent a legal notice to GitHub on the issue and posted the response he received on Twitter: “Foreign law enforcement officials who wish to request information from GitHub should contact the Office of International Affairs of the United States Department of Justice, Criminal Division. GitHub will respond promptly to requests issued through a U.S. court under a Mutual Legal Assistance Treaty (MLAT) or letter rogatory.”

A digital rights activist who spoke on condition of anonymity said that the MLAT is just for show and doesn’t work in most cases.

Moneycontrol sent GitHub detailed questions on moderation, compliance with local laws, its grievance and compliance officers in India, and the level of cooperation it has extended to the government.

GitHub did not respond to the specific questions but shared a statement: “GitHub has longstanding policies against content and conduct involving harassment, discrimination, and incitement to violence. We suspended user accounts after investigating reports of such activity, all of which violated our policies.”

GitHub’s policies prohibit unlawful, defamatory and abusive content targeting any individual or group, as was the case with Sulli Deals and Bulli Bai. The platform takes down content when it violates community guidelines and is reported.

This brings to the fore the question of whether proactively moderating code is the way to go, given that this is not the first time the company has run into trouble over moderation.

History of moderation issues

In 2014, India blocked 32 sites, including GitHub, for hosting ISIS-related content. Globally, the platform has come under scrutiny for hosting code that allows people to create deepfakes, which can be used to make non-consensual pornographic videos. The platform has also been censored in other countries, including China and Russia.

While victims have called for better moderation, this raises the question of how far a platform should go to moderate content.

The security researcher cited earlier, who works for a Bengaluru-based unicorn, explained that actively moderating code hosted by millions of developers would be a challenge, since most of the content on the platform is in Java and other programming languages. “Therefore, in order to moderate it, one has to go behind each project to find out whether it violates policies. When there are hundreds of thousands of repositories, it is not possible to moderate each one,” said the researcher.
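The researcher’s point about scale also explains why crude automation is no substitute for human review. A minimal sketch (the repository names, file contents and blocklist below are all invented for illustration; this is not how GitHub actually moderates) shows how naive keyword scanning over repositories misfires:

```python
# Hypothetical sketch of naive keyword-based repo flagging, illustrating why
# automated code moderation is error-prone: the same string can appear in
# abusive code and in perfectly legitimate code. All repo data is invented.
BLOCKLIST = {"auction", "deepfake"}

repos = {
    "harassment-app": "def auction_profiles(images): ...",
    "charity-site":   "def auction_donated_items(lots): ...",  # legitimate use
    "game-engine":    "def render_frame(scene): ...",
}

def flag(repos, blocklist):
    # Flag any repository whose source contains a blocklisted term.
    return sorted(
        name for name, src in repos.items()
        if any(term in src for term in blocklist)
    )

print(flag(repos, BLOCKLIST))  # flags the charity site along with the abusive app
```

The false positive on the charity repository is the crux: a term-matching filter cannot tell intent from code alone, which is why platforms lean on user reports and manual takedowns instead.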
