Charting a path forward for trust and safety in domain name registries and registrars.
In recent years, policymakers across the globe have laid down rules and regulations for content providers, seeking to establish a framework that balances fundamental rights with users’ ability to interact in an online environment that is safe and free of toxicity.
However, the way trust and safety plays out in the Domain Name System (DNS) has received little attention, even though it is central to how users experience the internet. The DNS is essentially the internet’s phonebook: it is the system users rely on to reach information and services through names such as www.nytimes.com. Approaching trust and safety issues at the DNS level therefore requires careful consideration of the impact any intervention may have on users.
Both domain name registries and registrars have attempted to create a more layered governance structure for registering domain names. For example, every domain name is registered pursuant to a registration agreement between a registrant and a sponsoring registrar, but this governance structure does not, most of the time, account for novel issues that bear more directly on the content associated with a domain name. Recent attempts to address trust and safety issues or content abuse in the DNS are partly due to pressure from stakeholders, such as the copyright industry, which has long used domain names to press claims of intellectual property infringement. In response, registries and registrars have tried to come up with self-regulatory frameworks, either through their work with trusted notifiers or by creating their own processes for content abuse issues.
Early efforts such as the Healthy Domains Initiative—which was meant to tackle online security abuse, such as malware and phishing, as well as content abuse—represented the first signs of self-governance on trust and safety. Despite some progress, however, these efforts have been largely unsuccessful and have been criticized for both ineffectiveness and overreach.
The aim of creating best practices to minimize website content abuse, or the registration of a domain name for malicious or illegal purposes, remains critical. More recent efforts—such as the NetBeacon Institute (formerly the DNS Abuse Institute), which works toward simplifying and enhancing DNS abuse reporting, and eco’s topDNS Initiative, a multistakeholder initiative that seeks to educate policymakers and promote existing efforts to fight abuse online—address some trust and safety issues and could be considered self-regulation solutions.
These initiatives, however, focus on the technical aspects of DNS abuse, such as malware, phishing, and botnets, and address them on an individual basis. Registries and registrars have also engaged with various initiatives, such as the Internet & Jurisdiction Policy Network, or have come up with their own lightweight frameworks, to explain how they combat illegal and abusive content in the registration of domain names.
Notwithstanding these efforts, a more consistent, transparent, and consolidated effort by registries and registrars is needed. Considering the increasing relevance of registries and registrars in dealing with content abuse, it is important to consider what role trust and safety should play in domain name governance. How responsive should the domain name industry be overall? If the DNS is such an important part of the internet, the answer must point toward a coordinated approach among the actors participating in the DNS ecosystem to ensure consistency, predictability, and trust.
Domain Name Registries and Registrars
The internet is an infrastructure that encompasses many different and complex applications. Internet users are exposed to various interfaces that correspond to a single name, such as “Google” or “Netflix.” But it is what goes on behind the scenes of these interfaces that ties all of these applications to the global internet: Internet Protocol (IP) addresses. According to the nonprofit Internet Society, “Every bit of data flowing between a user’s computer and the applications being used is in an IP packet, and every single IP has an address that says where it is going. These IP addresses allow any two systems on the Internet to find each other without ambiguity.” As identifiers, IP addresses are closely related to domain names, which are supported through the DNS. Put simply, the DNS functions as a map between IP addresses and names and ensures predictability and consistency.
To this end, the main function of the DNS is to translate memorable names such as netflix.com or google.com into the numbers that enable users to access digital services on the internet. At its core, the DNS allows people to connect with websites. The DNS, however, goes beyond acting as a place where internet identifiers exist; it also enables the delivery of digital services on the internet, including online streaming, service connectivity, and more.
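To make this translation concrete, here is a minimal sketch, using only Python’s standard library, of the lookup a browser performs before connecting to a website; the hostname is just an example, and any registered domain name would work.

```python
import socket

def resolve(hostname: str) -> list[str]:
    """Return the IP addresses that the DNS maps to a hostname."""
    # getaddrinfo queries the system resolver, which walks the DNS
    # hierarchy (root -> top-level domain -> authoritative name server).
    results = socket.getaddrinfo(hostname, None)
    # Each result is (family, type, proto, canonname, sockaddr);
    # the address is the first element of sockaddr.
    return sorted({info[4][0] for info in results})

if __name__ == "__main__":
    for address in resolve("www.nytimes.com"):
        print(address)
```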
What’s more, the DNS is also the place where a multitude of actors meet to ensure the internet’s resilience and integrity. Despite the unique role each actor plays in the DNS ecosystem, the focus of this article is domain name registries and registrars, which constitute the connection points between the internet and individual users. For a digital service—such as a social media network, an online market intermediary, or a personal website about cats—to have an online presence, its creators or providers need to register a domain name.
To register a domain name, for example, www.trustandsafety.com, the user most often has to go to a registrar, such as GoDaddy or Tucows, or approach a domain name reseller. Domain name registrars are accredited by the Internet Corporation for Assigned Names and Numbers (ICANN) through a contract. ICANN is the sole coordinator for the allocation of domain names and is also responsible for setting the policy rules that underlie the registration of domain names. Resellers, in turn, have a contractual relationship with domain name registrars and sell domain names on their behalf. Additionally, domain name registries sit at the top of the DNS hierarchy, as they are responsible for managing the top-level domains they operate—the endings such as .com or .org that appear in URLs.
The Trust and Safety Governance Framework
Registries and registrars rarely get involved directly with content and conduct governance. Making decisions about content and conduct through registries and registrars is, generally, a blunt approach because of the impact their actions can have on the operation of websites and on the users of the online service (hampering or restricting their access to the service).
Additionally, any sort of content governance at the registry and registrar level can grant these actors considerable power to decide what stays up on the internet and what should be removed. For example, by threatening to deregister the domain names of web services—thereby placing a website’s online presence in jeopardy—registries and registrars can pressure smaller platforms, digital services, and individual registrants to take specific content or a web page down. What’s more, enforcement at this level is where real power lies: For social media platforms, trust and safety policies deal with either the removal of specific content or the presence of a user on the platform. But when it comes to the DNS, the effect is more chilling: Registrars and registries can determine whether entire websites are or are not visible on the internet.
A domain name registry or registrar will likely deal with issues of trust and safety in the following manner: When an illegal or “abusive” website gets reported (by law enforcement, an affected user, a trusted notifier, or others), the registrar will proceed to notify the web service provider. If the registry is informed about the potential violation, and the website violates its terms and conditions, the registry informs the respective registrar to take action; normally, such action would involve asking the registrant to take the specific content down or removing the domain name altogether. If the registrar does not take action within a specified period, the registry may feel compelled to take action itself.
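As a rough illustration of that escalation chain—not any registry’s or registrar’s actual system—the sketch below uses hypothetical names (AbuseReport, handle_report) to show how a report might move from registrant to registrar to registry.

```python
from dataclasses import dataclass

@dataclass
class AbuseReport:
    domain: str
    reporter: str          # e.g., law enforcement, an affected user, a trusted notifier
    violates_terms: bool   # does the reported behavior breach the terms and conditions?

def handle_report(report: AbuseReport,
                  registrant_remediated: bool,
                  registrar_acted_in_time: bool) -> str:
    """Illustrative escalation chain for a content abuse report (hypothetical, not a real API)."""
    if not report.violates_terms:
        return "no action: the report does not describe a policy violation"
    # Step 1: the registrar notifies the registrant, who may take the content down.
    if registrant_remediated:
        return "resolved: registrant removed the content"
    # Step 2: the registrar may act on the domain name itself (e.g., suspend it).
    if registrar_acted_in_time:
        return "resolved: registrar acted on the domain name"
    # Step 3: if the registrar does nothing within the agreed window,
    # the registry may feel compelled to act on the domain name itself.
    return "escalated: registry takes action on the domain name"

# Example: a trusted-notifier report where neither the registrant nor the registrar acts.
print(handle_report(AbuseReport("example.org", "trusted notifier", True), False, False))
```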
Traditionally, regulation of content has focused on services that are visible to users, such as Facebook or X, thus maintaining liability exemptions for providers of internet infrastructure. This is for various reasons, not least the idea that, at the infrastructure level, the actors that facilitate the flow of data from point A to point B are meant to be agnostic about the content that data carries, and that content moderation at the infrastructure level can be disproportionate and affect third parties’ access to services and content.
As regulation increases, however, so does confusion about whether registries and registrars should be recognized as online service providers in the context of content moderation. An online service provider can, for example, include actors like an internet service provider, an email provider, an entertainment provider (music, movies), a search engine, a social media site, and many others. Examples include companies such as Facebook, Bing, Netflix, or Amazon. Registries and registrars, in turn, provide the registration of the domain name—and their job ends there. The type of service or content associated with the domain name is not part of the registries’ and registrars’ mandate.
This does not mean that providers of domain name services do not—or should not—engage in trust and safety practices. As more domain names have been used for illegal activity or have become compromised, a number of registries and registrars have begun to adopt trust and safety processes to deal with the increasing number of complaints they receive. (It’s important to acknowledge, however, that these practices are not yet framed as “trust and safety” by registries and registrars, which usually call them “content abuse.” See, for example, Public Interest Registry’s anti-abuse policy.)
Further, there is an effort to distinguish between technical DNS abuse and content abuse. Trust and safety processes have so far been limited to two main efforts: (a) regulating smaller web services that are set up for the sole purpose of spreading and distributing illegal content and (b) dealing with services that do not have trust and safety teams or are unresponsive. For example, Public Interest Registry, which is responsible for registering domain names under the .org generic top-level domain, has set out a policy that allows intervention only for sites that engage in the distribution of child sexual abuse materials (CSAM) and other sites dedicated to nonconsensual sexual imagery, the distribution of illegal opioids and narcotics, patently illegal or patently fraudulent activity, credible and specific incitements to violence, and credible threats to human health or safety.
Most registries and registrars have toyed with different trust and safety frameworks, but most, if not all, of these frameworks are still in their infancy. The largest registry, Verisign, for instance, does not have any concrete trust and safety policy in place, and much of its involvement revolves around notices brought by trusted flaggers and domain name registrants’ contractual obligations. This might be because Verisign, “as a top-level-domain registry operator, has contractual commitments with the US government to operate the .COM infrastructure in a ‘content-neutral’ manner.” Verisign does, however, mention a few initiatives it has taken on to advance trust and safety. These programs include:
- A “trusted notifier” pilot program with the U.S. National Telecommunications and Information Administration and the U.S. Food and Drug Administration to curb access to illegal online opioid sales.
- A similar “trusted notifier” relationship with the Internet Watch Foundation (IWF), under which Verisign is “committed to taking action” against every .com and .net domain name reported by IWF as being used to host CSAM-related content.
- Ongoing work with the FBI and other law enforcement agencies to fight scam websites related to the coronavirus pandemic.
- A commitment, as an electronic service provider registered with the National Center for Missing and Exploited Children (NCMEC), to bring to NCMEC’s attention instances of the online exploitation of children.
Drawing Analogies From Current Trust and Safety Practices
Given the existing inconsistency and ambiguity regarding trust and safety in domain name services, the Digital Trust and Safety Partnership (DTSP) framework can be used as a point of reference for potential policy frameworks. The DTSP is a collaborative effort among leading technology companies to address some of the most challenging and complex trust and safety issues. While not all of the DTSP’s best practices—such as the ability to identify, evaluate, and make any necessary adjustments to content or the ability to conduct risk assessments during product development—might be applicable to domain names, they can help advance a set of operational norms, such as the need for explainable governance processes and for continuous assessment and improvement of whatever governance processes are in place.
The DTSP’s safe framework applies to the governance of user conduct and content and is divided into five commitments tech companies are required to make when joining the partnership: development, transparency, governance, enforcement, and improvement. Each of these commitments is tied to a set of best practices meant to ensure the trust and safety of users. Below is an attempt to extrapolate some of these principles as they manifest in content moderation and apply them to domain name registration services. Transparency, governance, and enforcement are notions that should exist within any trust and safety framework. This is especially true in domain name registration, where these issues still tend to be addressed in a rather opaque way that often lacks basic accountability.
Abuse Pattern
Identifying the abuse pattern is one of the most common procedures that registries and registrars deploy to combat DNS abuse. For example, domain names used for phishing are associated with abnormal DNS patterns in the period immediately following registration. Usually, when a domain name is registered for legitimate purposes, there is a gradual increase in the number of queries; by contrast, when the registration is abusive, there is often a rather sharp traffic peak. Still, this is not a hard rule, which makes it difficult to build a process around the pattern and delineate its scope.
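As a rough illustration of that traffic heuristic, the sketch below flags a domain whose daily query counts spike sharply after registration instead of growing gradually; the threshold and function name are arbitrary choices made for illustration, and real detection systems are far more sophisticated and, as noted, prone to false positives.

```python
def looks_like_abusive_registration(daily_queries: list[int],
                                    spike_ratio: float = 10.0) -> bool:
    """Flag a newly registered domain whose query volume jumps abruptly.

    daily_queries: DNS query counts for the first days after registration.
    spike_ratio:   how many times the running average a single day must exceed
                   to count as a spike (an arbitrary, illustrative threshold).
    """
    for day, count in enumerate(daily_queries[1:], start=1):
        baseline = sum(daily_queries[:day]) / day
        if baseline > 0 and count > spike_ratio * baseline:
            return True  # sharp peak: consistent with phishing-style abuse
    return False         # gradual growth: consistent with legitimate use

# A legitimate domain ramps up slowly; an abusive one spikes on day 3.
print(looks_like_abusive_registration([5, 8, 12, 18, 25]))     # False
print(looks_like_abusive_registration([2, 3, 400, 350, 10]))   # True
```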
It is important that registries and registrars be clear in their content abuse policies about whether they use the “abuse pattern” method and, if so, how they do it and what the remedies are for tackling false positives and mistakes. Identifying the domain name registration abuse pattern may lead to finding more websites that violate a registrant’s user agreement, and the registry or the registrar can take action if they identify abusive registration. This practice, if used, should be undertaken with care as it can disproportionately affect registrants because of the high number of false positives and mistakes.
Trust and Safety Consultations
Registries and registrars do not usually hold specific trust and safety consultations, but they are part of different initiatives that allow them to exchange information valuable for their approach to trust and safety, such as the Internet & Jurisdiction Policy Network. Most of these initiatives can benefit from civil society participation, particularly from organizations that focus on digital rights and accessibility, such as Article 19 and the Electronic Frontier Foundation, and from global majority organizations such as Research ICT Africa. Such participation ensures a more equitable approach to the way trust and safety policies are interpreted. With some notable exceptions—such as Public Interest Registry, which has actively sought feedback on its policies, or Verisign’s blog on community work to tackle “security threats”—the majority of registries and registrars do not seem to have specific trust and safety processes in place for consulting external actors and receiving feedback. Like many other issues related to the internet, trust and safety considerations require a bottom-up, collaborative approach because of their complexity and multifaceted nature. Many people and actors can be affected by registries’ and registrars’ content abuse decisions; therefore, they should be consulted during policymaking and enforcement processes. Fora such as the Internet & Jurisdiction Policy Network also provide a possible collaborative space for such consultations. These fora (though not without shortcomings) ultimately bring stakeholders together to work on processes for tackling domain name and content abuse issues.
An Accountable Trust and Safety Team
Designating a trust and safety team that is responsible and accountable for managing risks—as well as identifying those responsible for managing the product or responding to specific incidents—is crucial. Generally speaking, many content incidents are not resolved at the registry and registrar levels; therefore, these entities most often use their public policy, general counsel, and compliance teams to resolve issues related to content abuse. They also use third-party services such as CleanDNS, an online harm mitigation solution that, among other things, simplifies abuse management and removes bad actors. These practices, however, are not enough, as they cannot compensate for in-house expertise that is more responsive to the exact needs of the registry or the registrar. An in-house trust and safety team not only possesses the required knowledge of the company but also can respond faster and more effectively to different trust and safety issues.
Risk Assessment
Risk assessment is a process that tech companies use to assess the risks their digital products pose to users. Until recently, companies that undertook risk assessments did so on a voluntary basis through initiatives such as the Digital Trust and Safety Partnership. Over the past few years, however, legislation around the world, including the EU’s Digital Services Act and the U.K.’s Online Safety Act, has mandated risk assessments. These new requirements come alongside a wide range of frameworks, assessment methodologies, and operational guidelines that have been helping with the management of risks. For example, the United Nations Guiding Principles on Business and Human Rights sets out a human rights due diligence process that includes assessing current and potential human rights impacts.
Registries and registrars may engage in risk assessments in an effort to assess the impact of their policies on registrants and users. For instance, they can measure and report on the number of domain name takedown requests they receive from governments, government agencies, or other trusted notifiers. Thus far, and based on their transparency reports, registries and registrars appear to be at the very early stages of assessing the risks of their domain name registration policies on internet trust and safety and fundamental rights. This might be because it is not clear whether the Digital Services Act or other trust and safety regulations apply to registries and registrars and, if they do, whether a full-scale risk assessment of their products is needed.
With this in mind, for trust and safety governance to be effective according to the DTSP safe framework, the following should be observed: 1) “acceptable use policy” mechanisms should incorporate user community input, 2) registries and registrars should work with civil society groups and experts for input on policies, and 3) they should document their policies, investigations, research, and analysis. The major registries, such as Public Interest Registry and Verisign, and registrars, such as Tucows and GoDaddy, all have domain name registration and abuse policies, but not all of them engage in wider consultations with stakeholders—especially civil society organizations. Human rights impact assessments, conducted in cooperation with various civil society organizations, should become standard practice.
Trust and Safety Policy Enforcement in Registries and Registrars
A critical issue in all trust and safety frameworks is the existence of adequate enforcement mechanisms that strike the appropriate balance between existing policies and human rights considerations. Enforcement best practices include the creation of a team for enforcement, the development of infrastructure for sorting through reports of violations and escalation paths for more complex issues, formalization of trainings, investment in the wellness and resilience of teams dealing with sensitive materials, and, where appropriate, the identification of areas where early detection and intervention are warranted.
Registries and registrars have also been working toward streamlining the reporting of content abuse and providing appeals mechanisms for registrants. For example, Public Interest Registry convened the DNS Abuse Institute (now the NetBeacon Institute), which works with CleanDNS to streamline reporting of registration abuse to the registries and registrars that sign up for the service. These efforts have also focused on evidence-based reporting, so that registries and registrars can make informed decisions rather than having to trust whoever reports the abuse. Additionally, they have set up timelines for responding to various reports and have prioritized certain content abuse cases, such as CSAM.
In the end, however, trusted notifiers—which are designated entities that are responsible for informing registries about illegal activity, content, and/or DNS abuse associated with a domain name—constitute the core of how enforcement is exercised. Various actors in the domain name space have relied on “trusted notifiers to address both DNS Abuse Issues as well as website content abuse questions that fall within their respective policies.”
Considering the evolving nature of trusted notifiers, registries and registrars have also come up with a process for how a notifier can become trusted:
Trusted Notifiers earn the registries’ and registrars’ trust with a recognized subject matter expertise, an established reputation for accuracy, and a documented relationship with and defined process for notifying the registries and registrars of alleged abuse. While it is ultimately the responsibility of the registries and registrars to take action on verified forms of abuse, Trusted Notifiers can serve as a crucial resource to enhance the abuse monitoring and disruption procedures of registries and registrars.
Whether trusted notifiers are suitable for all forms of content abuse is disputed. Academic research suggests that trusted notifiers, at least in the context of copyright infringement, can turn registries and registrars into regulators of content and encroach on freedom of speech. In short, the system is far from perfect, and overreliance on trusted flaggers can result in capture and abuse.
Regarding transparency, trust and safety best practices call for a commitment to publishing and reporting on relevant policies such as enforcement practices, providing notice to users whose content or conduct is the subject of an enforcement action, logging incoming complaints, creating processes to support academic and other researchers, and, where appropriate, creating in-product indicators for actions taken. To this end, registries and registrars have recently started publishing quarterly transparency reports, which include the number of domain names that have been taken down due to incitement to violence, coronavirus misinformation, or the sale of stolen credit card information. The transparency report by Identity Digital, for example, provides a timeline for how long it takes a registrar to take action, how many cases are remediated by third parties without escalating, the number of cases that registrants remediated, and how many cases the registry ultimately decides to take on. According to Identity Digital’s latest transparency report, there were no content incidents, but the report mentioned the cases the registry decided to address: one involving CSAM, one involving the illegal online distribution of opioids, one related to human trafficking, and one related to specific and credible incitements to violence.
***
Trust and safety practices in the domain name ecosystem are taking shape and evolving. Registries and registrars have begun using various trust and safety approaches to provide minimum structures for domain name governance, such as transparency reports, policy processes and enforcement mechanisms, and some attempts to engage with other communities to receive feedback and undertake public consultation. In applying current trust and safety frameworks to registries and registrars, policymakers must be cautious, since content moderation at the domain name level can be disproportionate and oftentimes ineffective. While registries and registrars are not so unique that they need a completely different framework for analysis, scholars and practitioners should clarify which current practices and frameworks are not applicable to the domain name system, and why.
A path forward should incorporate three lessons. First, trust and safety practice shows that registries and registrars should continue working toward an industry-wide best practice framework; collaboration with other stakeholders and the inclusion of research and user feedback will be key to addressing trust and safety and ensuring an equitable DNS environment. Second, if registries and registrars do not act, governments will act on their behalf. As trust and safety legislation sweeps through jurisdictions around the world, registries and registrars must be proactive in setting effective policies that can become standards for regulation. And third, for trust and safety to work effectively in domain name registration, it is important that registries and registrars continue to focus on trust and safety practices, adopt terminologies and approaches that provide transparency and consistency, and implement those practices with fundamental rights in mind.
– Farzaneh Badiei, Konstantinos Komaitis, Published courtesy of Lawfare.