For the past two years, we have been building a platform that lets people own and share their data. Our team is dissatisfied with the current state of the internet, where data lives in silos and user activity is locked to a single platform. We are building an alternative, where core concepts such as a user's social graph or content feed can follow them from platform to platform, and where any developer can build new infrastructure that makes use of all the existing data.
While the journey has been a success overall, we have hit some unexpected roadblocks when it comes to managing our platform's content. Censorship happens at every layer of the internet, including layers most people don't even know exist. Our website has been censored by platforms like Twitter and search engines like Google. Even the Chrome browser has censored us.
Today, my goal is to walk you through all the layers of web censorship. Most people don't realize how many actors are involved in keeping a website available to the public, or how many of those actors can independently decide that a website should be removed.
Gone with a Bang
Our story begins in May 2020, when our website suddenly disappeared.
The first round of debugging turned up nothing. Cloudflare was configured correctly, community members on several continents confirmed the outage, the server was responding, and all services reported as normal. Crucially, the outage was not a 504 error or a "server is busy" message. It was as if nobody had ever registered the domain at all.
Namecheap informed us that they had deregistered our domain after receiving an accusation of phishing fraud. Upon further examination, we discovered that one of our own partners had filed the abuse complaint that led to our removal.
We had been working with Uniswap to host an authentic clone of their application on our site. Uniswap's anti-phishing team had discovered the clone, mistaken it for a common phishing scheme, and asked Namecheap to take our website down.
We contacted Namecheap and Uniswap, worked out the problem, and brought our website back online. Although it was a highly visible incident, our first encounter with censorship turned out to be an honest mistake. We set up better abuse-reporting channels, wrote a few spicy Twitter threads, and went back to business as usual, confident that with improved communication with our domain registrar, this sort of censorship was unlikely to happen again.
About a month later, we received yet another round of reports that our website had gone down. This time, instead of the website being down entirely, users were getting a standard "server not found" message, and only in our US-East region. We tried to log into the US-East servers, but they were simply gone.
Our US hosting provider, IONOS, had frozen our servers and flagged our account for abuse. Once again, the complaint was phishing. The offending website, served through our platform, was phishing users for their... IONOS credentials. IONOS, understandably, was not amused. Unlike Namecheap, they did not give our servers back, even after we blocked the offenders.
We stabilized the situation by routing US-East traffic to Europe. Although this came with a latency penalty, it kept our US users online. After finding a new US hosting provider, we migrated all of our data over. While the overall impact on our users was minimal, the impact on our team was much more significant.
We began to question what we had gotten ourselves into. After running our website for only four months, we were already facing abuse complaints and deplatforming from multiple directions. And unlike our first brush with censorship, this abuse was legitimate. It looked like things were going to get worse before they got better.
Freedom of Speech
Our website is merely infrastructure. We run a peer-to-peer network that allows users to share files and data. And by "files and data" I don't just mean Linux ISOs or photo albums. I mean data in the big-tech sense: a user's social graph, the comment feeds on their videos, the algorithms that build their content feed. We are trying to rebuild the internet's foundation.
This data is accessed through our website. When you load an application such as Uniswap, our website isn't hosting the application code; the Sia network is delivering it. When we "block" something on our website, we aren't deleting anything. We are simply refusing to fetch that data from the Sia network.
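To make the "refusing to fetch" model concrete, here is a minimal sketch of what a portal-side blocklist check might look like. All names here are illustrative assumptions, not our production code; the core idea is simply a lookup against hashed identifiers before any data is fetched from the network, so the portal never needs to store or delete the content itself.

```python
import hashlib

# In production this would be a persistent store shared across servers;
# a set of hashed identifiers is enough to illustrate the idea.
BLOCKLIST = {hashlib.sha256(b"known-bad-skylink").hexdigest()}

def should_serve(skylink: str) -> bool:
    """Return True if the portal is willing to fetch this skylink.

    The content itself is never touched: blocking is purely a refusal
    to retrieve the data from the network on the user's behalf.
    """
    digest = hashlib.sha256(skylink.encode()).hexdigest()
    return digest not in BLOCKLIST

print(should_serve("known-bad-skylink"))   # False: refuse to fetch
print(should_serve("some-other-skylink"))  # True: fetch from the network
```

Hashing the identifiers means the blocklist itself never needs to contain links to abusive content, which matters when the list is shared between servers.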
This makes content moderation much harder for us than for platforms like Imgur or GitHub. A user can request any file or application from our servers. We don't know in advance whether the data is malicious, and there's a good chance none of our servers has ever seen it before. And because our website can load entire applications, there is no built-in mechanism for reporting abusive content: we have no control over the UI presented to our users.
We believe this is a good thing for freedom of speech. Anyone should be able to publish a website on the internet, and anyone should be able to share files with their friends, without first having to register with some corporation.
After the IONOS incident, we decided that we would only block content on our website (sometimes called a portal) in response to abuse reports.
I doubt we even made it a month before this policy blew up in our faces. This time the problem was not phishing but malware, and the catalyst was not an infrastructure provider but Google Chrome.
A single update cut our website off from two billion people around the world. When we managed to reach Google, they explained that our website was serving malware. They presented a list of infected links, said it was only a small sample, and told us they would not unblock our website until all of the malware had been removed.
Viewed through the lens of freedom of speech and censorship, this action is more concerning than any of our previous encounters. Namecheap and IONOS presented challenges, but they were providers we had formally engaged: we had signed their Terms of Service, and we had alternative providers to switch to in the event of a conflict.
We never entered into any Terms of Service with Chrome. We have no business relationship with them at all. Put simply: Google can decide it doesn't like you, and you will cease to exist for 65% of the planet. There is no alternative.
This is the scary part. Google is not an elected entity. It has made itself an unelected regulator of the internet, accountable only to its share price. Today the blocking applies to malware, but it is entirely possible that over time it will expand to any content Google doesn't consider favorable. Google can remove websites from the internet at large just as easily and as quickly as it removes creators from YouTube.
We spent over a year wrestling with this problem. What finally removed the warning was installing a malware scanner on our servers. The scanner watches each user's download request and runs a scan once the download has completed. If the scanner (we use ClamAV) detects malware, all future downloads of that file are banned. In practice, this has been sufficient to appease Google.
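As a rough sketch of this scan-after-download flow, assuming `clamscan` is installed and on the PATH. The function and variable names are illustrative, not our actual code; the key property is that the scan happens after the file has been served, so serving stays fast, and only subsequent downloads of a flagged file are refused.

```python
import subprocess

banned: set[str] = set()  # identifiers of files that failed a scan

def scan_after_download(identifier: str, path: str) -> None:
    """Run ClamAV over the file that was just served; ban it if infected.

    clamscan exits 0 for clean files and 1 when a virus is found,
    so we only need the return code, not the report text.
    """
    result = subprocess.run(
        ["clamscan", "--no-summary", path],
        capture_output=True, text=True,
    )
    if result.returncode == 1:
        banned.add(identifier)

def can_download(identifier: str) -> bool:
    """Checked on every request, before fetching anything."""
    return identifier not in banned
```

The trade-off of scanning after the first download is that one user may receive the file before it gets banned; scanning before serving would close that gap at the cost of latency on every request.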
It works, but the fix sits in the wrong place. We are a startup of 16 people with $3 million in VC funding. Google is a $1.5 trillion company with more than 100,000 employees. By appointing itself an unelected regulator, Google fixed a malware problem for its own users while driving up the cost of running a website for everyone else (including anyone hoping to compete with Google), reducing competition from smaller businesses in the process.
I Have No Mouth, and I Must Scream
Around mid-2021, we noticed a sudden increase in infrastructure providers deplatforming us. Entities that had previously been quite chill turned hostile and began pulling our servers offline.
We went through our email, our ban lists, and our abuse-management tools. Everything indicated that we were still processing abuse reports on time and keeping the website clean. If anything, the number of abuse reports we received had decreased.
Yet our hosting providers acted as though we were ignoring abuse complaints. After some frustrating and disjointed communication, we eventually identified the problem.
Emails containing URLs pointing to our website were being silently dropped. The sender's side reported that the emails had been delivered successfully; the receiver's side showed no trace that they had ever existed. They weren't going to spam; they were being banished into the void.
Chrome had been bad enough, but this was worse: we could no longer receive abuse reports, which meant we could not take abusive content down. And email isn't even user-facing software we can route around; it's core infrastructure!
We moved our abuse email to a new domain and asked our hosting providers to avoid including our primary domain name in the bodies of the abuse reports they forwarded. Most of our providers were happy to cooperate; we had to drop a few who were unwilling or unable to comply.
We also learned another important lesson about censorship. There are powers that can ban you from email, and if they do, your business will likely fail. It is not clear who these powers are or how they are held accountable. And if they turn against your business by mistake, it is not clear how you appeal.
The Need for Neutrality and Interconnectivity
Modern businesses depend on many independent service providers. In the current political climate, every provider feels pressure to ensure its customers are acting in a morally upright manner. That becomes a problem when your business depends on services based in many jurisdictions, with hundreds of different moral perspectives.
Our economy and our services grow more intertwined every year. As a result, a growing number of players have the power to deplatform businesses and users, and their demands can conflict with one another. It only takes one particularly opinionated, quick-to-deplatform service provider somewhere in your stack to bring everything down, so everyone is forced to appease all of them.
This is not sustainable. It leads to a monoculture too afraid to take risks or break the status quo. Nobody can afford to upset even one of the hundreds of services they depend on. Our culture ends up defined not by users or creators, but by the likes of Facebook and Google, because only they are big enough to bully others into changing.
This is why insisting on neutral infrastructure is so important. Given today's globalized internet and economy, infrastructure that is not neutral will invariably turn against its users and force them to conform to arbitrary and inconsistent moral standards.
Up to this point, we had dealt with three types of abuse. Phishing was policed at the DNS and hosting layers, malware at the browser and email layers, and we occasionally heard from law enforcement about terrorist propaganda.
Then one day we began to see CSAM (child sexual abuse material). While other forms of abusive content are often policed only selectively, CSAM is policed aggressively and by everyone. Understandably so. Almost immediately after CSAM appeared, our infrastructure problems got dramatically worse.
Hetzner, our favorite hosting provider, told us that any CSAM abuse complaint had to be handled within one hour instead of the usual 24, or they would take our servers offline. They initially gave us 30 days to get our response times in order. Seven days later, they changed their minds, terminated our service immediately, and gave us 48 hours to migrate everything to another provider. Did I mention Hetzner hosted almost half of our fleet at that point? Did I mention those "48 hours" fell on December 24th and 25th? Merry Christmas to you too.
Mevspace didn't even bother sending us an abuse report. When they learned that CSAM could be reached through our website, they simply pulled our servers. We never got our data back.
We lost around 80% of our servers between November and January. Then, we were hit by this beauty.
ISPs around the globe (mostly in Europe) had flagged our website as a child-abuse website and were refusing to let their customers visit it. Instead, customers were shown a message that roughly translated to: "STOP! This is a child porn site. This hotline is available to help you!"
We have millions of users who are not pedophiles, and many of them were being shown a message that implicitly accused them of pedophilia and encouraged them to seek treatment. This is far worse than being offline for a few hours. And now we could add "local ISPs" to the list of infrastructure providers who had taken it upon themselves to make sure we don't exist.
The climax of the whole encounter was a call from my landlord: "Hey uh... the child porn police visited and left you a business card. Is there anything you need to tell me...?" That was followed by a conversation with a lawyer that ended with "You should stay in Denver for now, since we don't yet know whether they'll arrest you at the airport in Boston." I hope your Valentine's Day was less stressful than mine.
It was just too much. We decided we couldn't keep the website alive, and we took it down.
Building Back Better
The most popular suggestion for our CSAM problem was to integrate our website with Facebook: send them every file our users upload, let Facebook scan each file, and let Facebook decide whether it is allowed on our website. Microsoft offers a similar service, as do Apple and Google. Since they're all free, it doesn't sound like a bad deal.
It is a terrible deal. These services are only free until everyone on the internet agrees they should be compulsory in order to protect the children. At that point the big four have a monopoly and can set any price they wish for the privilege of running a website on the internet.
There is also no guarantee these services will stay restricted to CSAM moderation. They are run by large corporations with profit motives. If they determine that you can't live without their CSAM filtering, and then decide that you must also remove copyright-infringing files, your website will be forced into whatever copyright-monitoring program they have implemented.
Even more egregious things could happen. What if China offered Facebook an $800 billion deal to restrict political content? I don't know, and I don't want to be in a situation where we find out.
All of this overlooks the basic privacy problem of sending every file to Facebook. Our users love us precisely because we aren't Facebook. And if we send literally all of our data to Facebook, we also put our company at a competitive disadvantage: as Amazon has shown, big corporations will use any unfair advantage they can to take over your business.
So the Facebook solution is out. In the interest of protecting the children, we will not be sending our users' files or data to third-party services.
We have now spent over six months building alternatives. We have increased the number of ways someone can report abuse. We have built forms that let reporters submit bans in bulk. We have significantly improved our automated processing of incoming email. And we have increased the frequency with which we state: "This content is not, and never was, stored on our servers."
We have integrated directly with the API of NCMEC, the National Center for Missing and Exploited Children. CSAM reports are now immediately forwarded to NCMEC along with metadata such as IP addresses, timestamps, and (where available) credit card information.
We have also improved our blocking procedures. We have banned more VPNs and IP ranges than ever before, and we are more consistent about keeping Tor blocked. Completely blocking Tor is not easy; when a subnet accumulates too many offenders, we block the whole subnet.
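The subnet-escalation rule can be sketched as follows. The /24 grouping and the threshold of five offenders are illustrative assumptions, not our actual policy parameters.

```python
import ipaddress
from collections import Counter

OFFENDER_THRESHOLD = 5  # assumed cutoff; the real value is a policy choice

def subnets_to_block(offender_ips, prefix: int = 24):
    """Group offending IPv4 addresses by subnet and flag heavy subnets.

    Returns the set of networks whose offender count meets the threshold;
    everything in those networks would then be blocked wholesale.
    """
    counts = Counter(
        ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
        for ip in offender_ips
    )
    return {net for net, n in counts.items() if n >= OFFENDER_THRESHOLD}
```

The obvious cost of this heuristic is collateral damage: blocking a /24 because of five bad actors also blocks every innocent user behind the same range, which is exactly the trade-off Tor exit nodes force.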
Our website is back online. The police are off our backs, the FBI is off our backs, and Chrome is off our backs. Emails containing links to our website seem to be delivered successfully again. We also run two alternative websites in case the main one goes down. The first requires signup, which has proven effective at filtering out abusive users, who prefer not to sign up. The second requires a monthly credit card payment.
Twitter and Medium continue to censor us. In fact, I won't link our main website here, for fear that Medium and/or Google might retaliate. It's an excellent example of the chilling effect that vague and arbitrary censorship creates.
There is still fallout to deal with. We don't know where to begin repairing relationships with ISPs that are calling our users pedophiles. My landlord and neighbors no longer trust me, and there isn't much I can do about that. But we are alive, and we are growing again.
The Way Forward
In the coming years, I believe things will get worse before they get better. Based on current comments and conversations coming out of the Biden administration, I expect something like Facebook's child-abuse scanning API to become a requirement for any website operating in America. It's something I believe we should actively fight, both politically and technologically.
In my opinion, honoring every abuse report we receive is only a temporary solution. We've seen where this leads with YouTube's copyright system: without adequate controls over who can file abuse claims, malicious actors end up using them to deplatform legitimate content. Many of our users have already had legitimate content blocked because of incorrect abuse complaints.
We must be proactive and careful about what rights we give away when choosing content-moderation solutions. When CSAM is involved, people tend to lose their minds and want to burn everything down instantly. Precisely because people get so upset, CSAM is very useful for pushing political and corporate agendas. Be cautious whenever someone demands a concession or an infringement of rights "to protect the children."
There is a lot of work to do. Content moderation matters not only for malware, phishing, and CSAM, but also for mundane problems such as spam. If we don't address the problem ourselves, centralized corporations like Facebook and Google will solve it for us by taking ever greater control over our lives.
There are many things I think we can do to create a future that both protects users' freedoms and blocks content such as spam and CSAM. Many of these ideas are still in the early stages and beyond the scope of this post. If you'd like to learn more or get involved yourself, check out our company website at https://skynetlabs.com or join our Discord at https://discord.gg/skynetlabs.