On January 9th, AWS notified Parler that its account would be suspended because calls for violence across the platform violated Amazon's terms of service. The subsequent responses on social media complained of First Amendment violations and the loss of free expression in today's environment. But it's important to point out, as many have, that the First Amendment only constrains the government, not private enterprises like Amazon. That said, most people who bring up the First Amendment are really alluding to free expression more broadly. Instead of a legal argument, most people are actually debating whether we as a society should support the behavior of companies like Apple, Google, or Twitter when they deplatform a site.
Some folks have put forward a slippery slope argument: that the deplatforming of Parler from AWS will lead to the deplatforming of other voices from the broader internet. We've seen examples of fairly aggressive and broad deplatforming of Alex Jones, Gab, Stormfront, and others. On its face there don't seem to be any limits on the ability to remove access to the internet. This raises interesting questions about the relationship between free enterprise and whether we as a society should consider access to the internet a right in and of itself. While I find the listed examples ideologically reprehensible, I am sympathetic to the idea that we may want to protect access to publishing on the internet for alternative voices. I would, for instance, be sympathetic to protecting access to DNS, IP, and TCP for all legal behavior.
Another set of discussions has revolved around changes to the so-called "Section 230" protections, which provide limited liability to "interactive computer services" (i.e., websites) for user-generated content. The line of reasoning is that laws already exist to punish individuals who incite violence, so removing an entire platform is unnecessary. Instead, they argue, we should let law enforcement do its job and police the site. Some of these discussions also promote adjustments to Section 230 such that a site that chooses to use targeted moderation would lose its Section 230 protections. By and large I find these arguments disingenuous. It's clear that Parler had no intention of aggressively self-moderating to remove illegal content, and that law enforcement lacks the tools, the will, and the capability to police large social services.