
Opinion: Criminal liability for digital marketplaces is a no-brainer. Congress should regulate.

Digital marketplaces are revolutionary for their ability to matchmake buyers and sellers online. Through a set of algorithms enabled by likes, shares, and scrolls, these marketplaces often facilitate the distribution, exchange, and consumption of goods and services at breakneck speeds. But while many of the day-to-day exchanges of antique boots, Xbox Ones, and used cars are normal, not all of these exchanges are as innocent.

Criminal activity online is nothing new. In 1995, two young hackers penetrated a US Air Force base’s networks, copied war game simulations, and installed software that tracked the passwords of unsuspecting personnel. In 2006, a team of hackers began a series of attacks on businesses’ protected credit and debit card data. Crimes like these are generally covered under laws like the Computer Fraud and Abuse Act, which criminalizes behavior ranging from “Negligently Causing Damage and Loss by Intentional Access” to “Obtaining National Security Information.” But criminal liability for digital marketplaces is seemingly absent.

There are two notable examples of government enforcement against criminal activity in an online marketplace. The Silk Road was a marketplace located on the dark web that facilitated the sale of guns, drugs, child pornography, and even assassins-for-hire. In its years of operation between 2011 and 2013, the site reportedly generated nearly $183 million in sales from 146,946 unique buyers. The FBI shut down Silk Road in October 2013 and arrested Ross Ulbricht, the website’s founder, on charges including distributing narcotics by means of the Internet and conspiring to commit computer hacking.

The second example is the infamous website Backpage.com. Backpage began as an online classifieds forum, similar to Craigslist, that facilitated exchanges of cars, home rentals, and jobs for hire. It was best known, however, for its facilitation of prostitution and sex trafficking. In April 2018, the site was shut down pursuant to a federal investigation into allegations that the company’s executives had facilitated prostitution, child sex trafficking, and money laundering, among other offenses. Like Silk Road, Backpage was a haven for criminal activity, and its reputation and user base further facilitated that activity.

However, digital platforms like eBay, Pinterest, and Amazon do not appear to face the same liability. In 2019, eBay was in the spotlight for facilitating the sale of firearm parts and conversion kits that would violate California law. Though the company has policies to remove this content, its methods are often not quick enough to stop a potential exchange. During the height of Congressional inquiries into the opioid epidemic, the Senate Judiciary Committee wrote to Pinterest and other companies in an effort to have them step up enforcement against the sale of Oxycodone, Adderall, and other regulated pharmaceuticals on their websites. Amazon routinely faces issues with counterfeit products in its own marketplace, drawing ire from customers, companies, and state and federal regulators over the reported 10 billion counterfeit listings on its site. Yet no criminal charges are pending against these companies for the issues mentioned above.

There is a major difference between platforms like Silk Road or Backpage and ones like eBay and Amazon. While the reputations of Silk Road and Backpage rested predominately on facilitating criminal activity, eBay, Amazon, and their peers generally serve more mainstream purposes. That is to say, one assumes that a user on a mainstream platform like eBay or Amazon is less likely to engage in criminal activity. Perhaps that is true, but documented instances of criminal activity on such a large scale warrant significant attention. Ultimately, this assumption leaves lawmakers blind to the real dangers of not regulating this activity more acutely.

When thinking about criminal liability for what appears on digital platforms, most people reference Section 230 of the Communications Decency Act (CDA 230). The statute reads: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” However, this law only shields platforms from civil liability; it does not extend to activities like obscenity, child pornography, and human trafficking. Those criminal acts are covered under separate statutes and can make digital platforms liable if the content is not removed in a timely manner. Other forms of online illegality, like those committed on the more mainstream digital platforms, do not carry the same thrust of criminal liability. On the one hand, that makes sense - digital marketplaces should not necessarily be held liable for an individual’s conduct on their platform. Indeed, that was the original spirit of CDA 230. On the other hand, when criminal behavior is known to platforms, why are they not under the same pressure to report and remove that content promptly?

A large part of why platforms are not held as accountable for these issues is the work they already do to stem illegal content. Online Trust and Safety is an emerging function within companies that deal primarily with user-generated content and digital marketplaces. These departments are tasked with mitigating the risks associated with users’ ability to produce, advertise, and exchange content on their sites.

Trust and Safety teams use a framework of policies, procedures, and algorithms to detect and remove content that violates their terms of service. Some of these frameworks cover abuses like bullying or misinformation - content issues that make the experience on a platform less enjoyable. Others focus on the sale of illegal or otherwise regulated goods in digital marketplaces - issues that more tangibly lead to real-world harm (e.g., sales of drugs, counterfeit goods, or human organs). Platforms vary these policies depending on their user base and their product. For instance, TikTok outright bans the display and/or promotion of firearms, while Snapchat is more permissive about their brandishing. These policies represent platforms’ attempts to manage the challenge of policing the activity of hundreds of millions of people accessing their sites each day for commercial purposes. One might think it only natural that a few bad actors slip through the cracks.
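To make the mechanics concrete, here is a minimal, hypothetical sketch of how a rule-based layer of such a system might screen a marketplace listing before it goes live. The policy categories, keywords, and function names are illustrative assumptions rather than any platform’s actual system; real Trust and Safety pipelines layer machine-learning classifiers, human review, and appeals on top of simple rules like these.

```python
from dataclasses import dataclass

# Hypothetical policy rules: category -> terms that trigger review (illustrative only).
PROHIBITED_TERMS = {
    "regulated_pharmaceuticals": {"oxycodone", "adderall"},
    "firearm_parts": {"conversion kit", "auto sear"},
    "counterfeit_goods": {"replica", "aaa quality"},
}

@dataclass
class Listing:
    listing_id: str
    title: str
    description: str

def screen_listing(listing: Listing) -> list[str]:
    """Return the policy categories a listing appears to violate."""
    text = f"{listing.title} {listing.description}".lower()
    return [
        category
        for category, terms in PROHIBITED_TERMS.items()
        if any(term in text for term in terms)
    ]

def enforce(listing: Listing) -> str:
    violations = screen_listing(listing)
    if violations:
        # In practice a flagged listing would be queued for human review,
        # not removed automatically on a keyword match alone.
        return f"flagged for review: {', '.join(violations)}"
    return "published"

if __name__ == "__main__":
    print(enforce(Listing("123", "Rifle conversion kit", "ships fast, no questions asked")))
```

A sketch like this also shows why enforcement lags behind bad actors: keyword lists are trivial to evade with misspellings or coded language, which is part of why platforms lean on statistical classifiers and user reports, and why some violative listings inevitably slip through.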

Still, the efforts of these teams do not fully answer the question of why Congress has yet to place the same criminal liability on platforms that facilitate the sale of firearms and other illegal or regulated goods as it has on platforms that facilitate obscenity, child pornography, and fraud. This is particularly perplexing because the real-world harm the content poses is so well documented.

To observe the dangers of an unregulated digital marketplace, one need look no further than Facebook’s (now Meta) Marketplace. The platform relies on users representing their authentic selves in order to better facilitate transactions, but that inherent trust is often abused by malicious actors. According to ProPublica, fraudulent Marketplace listings have facilitated the sale of vaccine cards, firearms, and illegal drugs. The marketplace has also been linked to at least 13 deaths - including one in which a woman was murdered while meeting a person who had advertised an inexpensive refrigerator for sale.

More disturbing still is that Facebook has been unable to stem the problem. The company has 400 contractors and an undisclosed number of full-time employees working to mitigate the risks of the marketplace. Artificial intelligence is its big bet for curbing the harm from these listings, yet according to experts in the space, “these detection services frequently fail to ban obvious scams and listings that violate Facebook’s commerce policies.” ProPublica’s recent reporting highlights not only the demonstrated real-world harm the platform enables, but also its ineffectiveness at implementing a risk mitigation strategy. That report speaks volumes about why state and federal lawmakers should begin to think about what, if any, criminal liability platforms should bear when hosting content that promotes or facilitates the sale of illegal and regulated goods.

The time for regulating digital marketplaces has come. The industry generates hundreds of billions of dollars in revenue each year, yet the harm inherent to its design goes unabated. But before thinking about regulation, Congress should take the time to more fully understand how digital platforms operate. Questions about how Facebook makes money, or whether it will ban “Finstas,” completely miss the scope of the problem and show that the technology learning curve is steep for lawmakers and regulators alike. To that end, Congress should consider the following ahead of regulation as a way to enhance its oversight responsibilities:

Call for Increased Transparency and Mandatory Disclosure: In order to begin assigning liability, mechanisms for increased transparency and accountability are key. Companies like Facebook, YouTube, and others have provided increased access to the data they have on removals of harmful content or content that otherwise violates their policies. While this information is certainly helpful, some of it is curiously absent. The figures the companies present never speak to the total number of violative cases that appear on their platforms, only to what they were able to find through proactive and reactive detection (i.e., AI and user reports). In Facebook’s case, its transparency center has no specific breakdown for Facebook Marketplace - leaving much uncertainty about the overall enforcement efforts taken to ensure safety on that product. Congress should take steps to inform its legislative prerogatives and increase accountability, such as the following (a rough sketch of what standardized reporting might capture appears after the list):

  1. Formalize transparent data reporting,
  2. Require that platforms disclose to the public when marketplace transactions result in injuries or death, and
  3. Mandate a plan of action from platforms to remediate ongoing harm.
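As one illustration of what formalized data reporting could standardize, the sketch below shows a possible shape for a per-platform marketplace disclosure covering the three items above. The field names, structure, and figures are assumptions made for discussion, not a proposed statutory format or any platform’s real data.

```python
from dataclasses import dataclass

@dataclass
class MarketplaceTransparencyReport:
    """Hypothetical quarterly disclosure a platform might be required to file."""
    platform: str
    reporting_period: str              # e.g., "2021-Q3"
    listings_reviewed: int             # total marketplace listings screened
    removed_proactively: int           # removals via automated detection
    removed_reactively: int            # removals via user reports
    estimated_violations_missed: int   # estimate of violative listings not caught
    injuries_or_deaths_reported: int   # real-world harm tied to marketplace transactions
    remediation_plan: str              # narrative plan to address ongoing harm

# Illustrative filing with placeholder numbers.
example_report = MarketplaceTransparencyReport(
    platform="ExampleMarketplace",
    reporting_period="2021-Q3",
    listings_reviewed=1_000_000,
    removed_proactively=42_000,
    removed_reactively=8_500,
    estimated_violations_missed=3_000,
    injuries_or_deaths_reported=0,
    remediation_plan="Expand human review for high-risk listing categories.",
)
```

The notable addition over today’s voluntary reports is the estimate of violations missed: disclosing only what detection systems caught says nothing about the total volume of violative listings, which is precisely the gap described above.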

Establish a Task Force to Develop Regulatory Standards: It is clear that self-regulation of digital marketplaces does not benefit users. Yet Congress lacks the expertise to ask the questions necessary to ensure safety and security for users on, and eventually off, these platforms. A task force can help Congress focus on the key issues at stake and the harm that can come from unregulated digital marketplaces. It should comprise not just experts from the technology side, but also academics, regulators, and community organizations with the capacity and knowledge to contribute meaningfully to policymaking in this space. This group, whatever its form, should focus on the following:

  • Enumerating the challenges that digital marketplaces pose to government oversight,

  • Capturing the privacy and safety concerns of consumers on those platforms and what the government’s role should be in risk mitigation, and

  • Understanding the ways in which criminal behavior takes shape on platforms where anonymity and integrity are often two sides of the same coin.

While these steps are not entirely sufficient, they are certainly necessary if Congress hopes to set a path for harm reduction, platform accountability, and transparency.

Digital marketplaces are a boon for the social web given their ability to meet consumers’ needs at an unprecedented scale. But while these platforms send signals about prioritizing user safety and integrity, they are also business entities with conflicts of interest. Worse, those conflicts either impede or obviate the will to do what is necessary to mitigate the real-world harm posed by criminal activity in their marketplaces. Given the renewed scrutiny of big tech, Congress should take the opportunity to improve government oversight and identify effective controls to hold platforms liable for their part in facilitating criminal actions. Taking these steps will not only lead to better outcomes for consumers but will also enable a freer and safer Internet.