After/Lives: Rethinking Moment of Death Content on Social Media Platforms
Neda Agha-Soltan was protesting Iran’s disputed 2009 elections when police fatally shot her. A bystander recorded the final, brutal moments of her life and uploaded them to YouTube, where company leadership decided to preserve the video on the platform. Though Soltan was often referred to as the YouTube martyr, the reality is that countless “martyrs” exist on social media. This moment of death (MOD) content presents some of the thorniest questions for platforms—how to balance the need to preserve potential evidence of abuses of power against the desire to shield the darkest aspects of humanity from the world.
Proponents of leaving this footage online cite the benefits for individuals who specialize in open-source investigations; the need to accumulate evidence of war crimes; the goal of holding abusers of power accountable; and concerns that general moderation efforts may sweep up critical content. MOD videos implicate specific free speech rationales that proponents often point to, whether directly or indirectly. A desire to protect such content may also stem from a wariness of governmental intrusion or governmental attempts to suppress evidence of violence inflicted upon its citizenry. On the other hand, critics argue that MOD content has adverse health effects on viewers and moderators alike, and some point to the fact that these posts have yet to concretely or single-handedly achieve the ostensibly noble goals of pursuing social change or holding abusers accountable.
This Piece seeks to complicate the takedown/leave-up binary that surfaces in discussions around this content by putting forth a normative claim that lands somewhere on the margins of both. Though MOD content can encompass a variety of situations, the Piece will home in on MOD involving state-imposed violence. I argue that MOD content has no place in “ordinary” consumption but that this content must be preserved for specific, limited audiences who can draw purpose from at-times seemingly senseless acts of violence. The Piece first discusses the benefits and disadvantages of this normative stance before using state laws around viewing death penalty executions to illustrate certain principles that emerge in regulating the consumption of death. Finally, the Piece concludes by examining how these principles might inform several potential solutions that balance the enforceability of removing this content with its preservation for future needs.
Part I: Back to (Normative) Basics—A Closer Look
There are several key reasons behind the normative stance proposed in the introduction, as well as legitimate counterpoints and criticisms that deserve mention. This Part will first lay out what I believe are the most important reasons for blocking MOD footage online: dignity for the deceased and racial equity, particularly recognizing the trauma communities may experience from this content. Afterward, I will identify, and engage with, the most salient criticisms of restricting or removing MOD videos or posts.
Privacy for the Deceased
One benefit of removing MOD content from social media platforms is to restore the dignity and privacy that should accompany death. An individual’s passing should be a private matter for the individual and for those closest to them—an idea that many jurisdictions have gravitated towards in both common law and statutory provisions in situations where images of the deceased were made public (or attempts to obtain such images were denied). The notion of preserving the unrepeatability of a “temporally sacred moment” has emerged in common law jurisprudence, which American case law can illustrate. The United States Supreme Court engaged with this “privacy-of-death” notion in National Archives & Records Administration v. Favish. The case centered on an attorney for President Clinton who died by suicide in a public park; police took photographs of the scene after his death. A private citizen, Allan Favish, requested the death-scene photographs by filing a Freedom of Information Act (FOIA) request with the National Park Service. After Favish was denied, he brought suit against the National Archives and Records Administration.
The District Court dismissed his claim, and the Court of Appeals for the District of Columbia affirmed the lower court’s decision. In its unanimous decision, the Supreme Court rejected Favish’s claim that the person “who is the subject of the information is the only one with a privacy interest.” It noted that family members have a right “to direct and control disposition of the body of the deceased and limit attempts to exploit pictures of the deceased family member’s remains for public purposes.” The opinion goes on to say that the family of the deceased has “a personal stake in honoring and mourning their dead and objecting to unwarranted public exploitation that, by intruding upon their own grief, tends to degrade the rights and respect they seek to accord to the deceased who was once their own.” In an analysis of cases that cite Favish, Clay Calvert discusses a case in which the New York Times sued the New York City Fire Department to gain access to 911 calls from the day of the September 11th terrorist attacks. The state court acknowledged the public’s right to know whether there were problems in the city’s emergency response that day but stressed that the family members’ privacy interest “trumped the public’s rights to know what the deceased said.”
American case law provides common ground for those who may be tepid about the normative stance that all death should be private. There are important factors that prompt courts to reject Favish-style arguments when images of the dead are released. In Showler v. Harper’s Magazine, the plaintiffs sued Harper’s after it published photos of a dead soldier in a casket, taken by a photographer who attended the soldier’s funeral service and viewing. Citing Supreme Court jurisprudence, the Tenth Circuit wrote that the Court has “recognized a family’s right to control the death images of the deceased” when those images are gruesome or do not involve a public funeral. The photographs were not of the soldier’s “death-scene” and were taken in a public setting where the community could attend and view the corpse in the casket; thus, the Tenth Circuit was unwilling to entertain the plaintiffs’ suit against Harper’s Magazine. Though American courts have not necessarily established a right to one’s image as other countries—for example, Spain—have, there is a growing body of common law affirming the notion that the family of the deceased should have control over the deceased’s images. The principles developed by courts, as well as the limits they place on when a family “loses” this right, can serve as helpful guideposts for companies.
Racial Equity and Community Trauma
While companies are quick to boast about how citizen journalists use their platforms or to pay lip service to human rights, some voices on the internet are growing increasingly critical of platform attitudes toward violence in the Global South compared to the Global North, as well as in Black and brown communities in America. Jay Aronson notes that such footage “may ultimately desensitize the public to what is being portrayed. At a certain point, many people will learn to tune out the reality of what they see because it is too graphic.” In some ways, the decision to leave up MOD content—often justified by platforms in human rights language—can be counterproductive or harmful to brown and Black communities, exposing them to re-traumatization from this documentation.
Company executives—many of whom are white or rich (or both), and far removed from violence—are often the ones making these sensitive decisions, and they routinely weigh human rights and documentation considerations in high-profile escalation decisions, though it is unclear to what extent such footage spurs global calls for action. For example, Kate Klonick quotes Nicole Wong (a former lawyer for Google) on YouTube’s decision to leave up a video of Saddam Hussein’s hanging “because we [YouTube] felt from a historical perspective it had real value.” Zuckerberg himself pointed to users uploading videos of Philando Castile’s murder as a reason a “connected world” is critical.
However, activists, writers, and scholars have criticized the impulse to document and broadcast violence in the name of human rights. Some go so far as to assert that platforms only do so because they devalue brown and Black bodies worldwide. In an open letter to Zuckerberg, activists from Ethiopia, Myanmar, Syria, Bangladesh, and the Philippines write that Facebook has actually failed to “‘provide a duty-of-care for users in sometimes repressive regimes’” and go on to chastise the company and demand that Facebook protect users in all of its markets equally.
Others note that such footage can actually disadvantage Black and brown individuals. For example, some noted how footage of blatant police violence against Rodney King harmed efforts for accountability. News outlets chose to freeze stills from the recording that showed King in “aggressive” positions, with Patricia Williams noting that the freeze-frame turned King’s body “‘into a gun . . . a bullet of a body always aimed, poised, and about to fire itself into deadly action which served to justify the ‘reasonable’ nature of the officer’s actions.’” Coupled with racist language, these freeze-framed depictions illustrate what Ryan Watson terms the “ballistic black man.”
In addition to the harms that Black people and Black communities experience when witnessing these ugly, racist perspectives in public discourse, the documentation itself can also impose specific trauma on community members who experienced violence. Scholars, in a study of a community that not only experienced a grisly act of violence but also extensive publication of the act, detailed how the community consequently experienced alienation, anger at how the community was constructed in public, intrusive attention on the community, and “renewed feelings of loss and grief.” Graphic content carries the potential to re-traumatize individuals who knew a victim closely and also threatens to harm a community’s ability to heal after a violent event. For members of a minority community, such as people of color in the United States, gruesome videos “combined with lived experiences of racism, can create severe psychological problems reminiscent of post-traumatic stress syndrome.”
Confronting and Critiquing Counterpoints
Some may disagree vehemently with the justifications I presented for removing MOD content online. A portion may root their disagreement in the idea that free speech is critical to disseminating information. Zeynep Tüfekçi herself has made these types of arguments, writing:
The massive censorship of reality and images of this reality by mainstream news organizations from their inception has been incredibly damaging. It has severed this link of common humanity between people “audiences” in one part of the world and victims in another. This censorship has effectively relegated the status of other humans to that of livestock, whose deaths we also do not encounter except in an unrecognizable format in the supermarket.
Another camp may also adopt a similar but ultimately distinct argument rooted in social change: if these deaths are captured, then abusers of power can be held accountable and charged with the crimes they have committed.
Both camps on the “leave up” side of the debate raise critical points regarding MOD content. However, it is worth interrogating these justifications to ask whether MOD footage genuinely achieves the goals they invoke. First and foremost, many have challenged these points by raising concerns about desensitization, dehumanization, and inefficacy. Susan Sontag once noted that images of violence lead to “the bemused awareness, continually restocked by photographic information, that terrible things happen.” If one considers the cumulative effects of violent images and videos from over a hundred years, the idea of MOD content as “clichéd signifiers” of atrocity becomes a more vital point to consider. In comparing videos of Neda Agha-Soltan and Oscar Grant, Jennifer Malkowski notes that with each viewing, users are invited to “reviv[e] the dead,” and commentators, scholars, and critics have discussed these points in the context of MOD footage of Black men, describing the harm associated with “Black Death” captured on video.
Black deaths that are captured on video “have not had the same legitimacy as other forms of records generated by the State, and are therefore often considered suspect by legal, state sanctioned decision makers.” Instead, the frequent exposure of bodies of color as dead ones not only threatens to desensitize viewers with each new video but also creates an insidious association that the value of a person of color (or member of another minority group) comes not from life but from death. While some may point to the explosion of rage following the recording of George Floyd’s murder as an example of MOD footage that was highly effective in mobilizing public sentiment and driving reform efforts, the reality is that the majority of these recordings do not effect widespread change. Ethan Zuckerman powerfully and succinctly described this point, saying that “[i]magery may matter as far as getting people out into the streets, but it does not matter as far as preventing police from using violence in the first place.” In fact, The New York Times published a disturbing fact during the Derek Chauvin trial: from the start of the trial on March 29th to April 17th, at least 64 people (an average of more than three a day) died because of police violence. Though Floyd’s name has been seared into the public consciousness, there are countless others whose names we do not know who continue to be murdered at the hands of the state.
Lest we forget, the undercurrent beneath many MOD videos is an obsessive business metric for companies: views. Whether a live stream of police violence, an upload of a murder, or footage of other injustices, this content is primed for viral spread. Even if companies have policies in place that limit the monetization of such content, the reality is that it attracts viewers to the site more broadly, and this “political economy” of Black death helps “corporations and their sponsors benefit financially from the repetitious viewing of these records as a result of increased viewership.”
This Section has sought to directly address counterarguments that people point to as reasons we need MOD content—accountability, social engagement, and change—and has complicated these arguments by showing the dehumanization, desensitization, and potential inefficacy of recording MOD material. This is not to say that there is no role whatsoever for these videos and posts. On the contrary, some parties should consume this content—a topic that will be illustrated and discussed in more detail below. The following Part seeks to explain what this normative stance (of severely restricting who can watch and access MOD footage) might look like in practice, using a non-digital example: states determining witness procedures in death penalty executions.
Part II: Judge, Jury, Executioner, and Gatekeeper? – Exploring Statutes and Cases Around Death Penalty Viewings
Thus far, the Piece has presented a series of normative claims around the need to restrict MOD content. Until now, however, I have not presented a practical framework for how the restrictions might work. This Part presents American states’ efforts to limit viewings of executions to certain parties as a potential roadmap. There are currently 27 states that still carry out the death penalty in America. Some of these states have not executed prisoners in over a decade, while others continue to engage in the practice with some regularity. No matter the frequency with which states use this brutal practice, key similarities emerge across the state statutes. First, states impose limitations on the number of people permitted to attend. Second, there is a nearly universal consensus on the need for media representation in the audience. Third, the statutes permit families to be in attendance. Finally, a number of statutes actually mandate that there be a form of public representation in the audience witnessing the execution. The final Part will propose a range of options through which platforms can integrate these principles into their content moderation efforts.
Before diving into the discussion, it is essential to clarify the limits of an analogy to state laws on death penalty executions. The analogy, though helpful and, hopefully, provocative, is imprecise: companies are not imposing death on their users the way a state does when it executes a prisoner; community guidelines are far more malleable than statutes and lack the sort of protections that ostensibly come from judicial review. Deaths from state-imposed execution are also announced publicly and would be made known even if, hypothetically, no one were present. Governments attempting to restrict viewing access must also contend with First Amendment challenges that a private company could defeat with ease.
Nonetheless, some similarities should be highlighted to justify using state laws on witnessing executions as a worthwhile illustration for this Part. First and foremost, when it comes to death penalty provisions, states rely on statutes just as a company relies on its own “statutes,” or community guidelines. Second, as will be highlighted in greater detail, many of these laws have drawn similar normative lines in that only specific individuals are allowed to watch the moment of death, and there is a general understanding of the value journalists and watchdogs provide by being part of the event.
Until the mid-nineteenth century, executions were public affairs in the United States, rooted in a commonly held belief that watching such violence served as a check on potential abuse at the state’s hands. As concerns grew that people who attended these public spectacles had “animal feelings,” states slowly began introducing private execution laws, with state legislatures taking control of the execution process in the twentieth century.
Though there were challenges to moving executions out of public settings and into the confines of prisons, executions never became fully private. While states have, over the years, limited who can attend, courts have been quite insistent that there is a “public” right to watch. In a much later case, the Ninth Circuit, while respecting the Supreme Court’s stance in Holden, stated: “we reach the question and conclude that the public does indeed enjoy a First Amendment right of access to view executions from the moment the condemned is escorted into the execution chamber. However, . . . the public’s right of access may be reasonably limited.” The Ninth Circuit later stressed that “[i]ndependent public scrutiny—made possible by the public and media witnesses to an execution—plays a significant role in the proper functioning of capital punishment.”
In addition to recognizing the right to witness an execution and the permissibility of restricting it, litigation has also illuminated other questions that the legal system has needed to resolve regarding witnessing executions. For example, courts have said that limiting the ability to watch the entire administration of the execution can be unconstitutional. Similarly, restricting the ability to hear the execution in full has recently been found to be impermissible conduct on the part of prison officials. At the same time, courts have upheld prison regulations that forbid witnesses from using cell phones during lethal injections, as well as denials of death-row inmates’ requests to have legal counsel present at the time of execution. In short, courts have been quite protective of the right to witness an execution in full (i.e., auditorily and visually, from start to finish) while upholding some restrictions.
While litigation has helped clarify specific ideas around full access to the event—for example, limited public viewing and proportional restrictions—it is worthwhile, before concluding this Part, to briefly discuss similarities across state statutes delineating witness procedures. All state statutes require a physician to be present and limit the number of people able to attend an execution, though some states codify a total limit while others determine the limit on a case-by-case basis. Nearly all state statutes examined in this research allow the prisoner to choose friends and family to invite and extend the ability to invite a religious or spiritual leader. States also permit the victim’s family members to attend, though some states, like Wyoming, place limits on the number of family members and friends permitted to be present. All statutes impose an age limit, though the minimum age differs across states, ranging from 18 to 21 years. Nearly all states permit media presence; Wyoming is the only state that does not.
The last item worth discussing is the public participation component. A few jurisdictions—such as Pennsylvania, Florida, Arkansas, and Virginia—require that a certain number of members of the public be present in order for the execution to take place, and some states have at times struggled to find enough members of the public to participate. The idea behind this requirement, as the head of the Death Penalty Information Center once put it, is that these people “are considered public eyewitnesses, and go to executions standing in the place of the general public” and that “[i]t’s a recognition that these proceedings need to take place in public view.”
Part III: Implementing Principles into Content Moderation Policies
Part II laid out key themes in state statutes governing who can witness executions. Most states require a limited audience comprised of relatives of the prisoner or the victim, while some also mandate a public participation requirement supplementing, or in rare circumstances replacing, family and friends. States also overwhelmingly permit media presence. The cases examined above scrutinize restrictions that curtail the witnessing process, such as limiting sound or sight, while maintaining that there is a right to view executions.
The principles that have emerged in states’ regulation of witness access to executions may offer a roadmap that companies can adopt in their content policies around violence and MOD footage. This Part lays out several approaches, spanning platforms’ policies, product features, and enforcement strategies, that incorporate these principles to restrict the number of viewers of MOD content and the reasons for viewing it. Though the Piece has grappled with MOD at the hands of state actors, these solutions could be applied to other contentious MOD material as well. I propose a spectrum of options for companies to consider, arranged from most to least restrictive in terms of enforcement. While the most restrictive option of all, simply removing all MOD imagery, is operationally feasible for companies, it is normatively undesirable, as discussed earlier in the Piece, because of the value that some individuals may find in watching this content. As such, it is not presented among the sample of choices below.
An Independent Archive? A Digital Graveyard?
The most restrictive option presented here would be to remove all MOD material and place it in an archive where specific audiences would be granted access. Some have theorized about such a model using the German “Giftschränke,” or “poison cabinets.” Giftschränke had their origins as “cordoned-off sections of libraries” whose goal was to “house materials deemed unfit for widespread circulation.” John Bowers and Jonathan Zittrain have written about Giftschränke as a model to capture critical content along with the record of “moderation actions along with metadata capturing the specifics of those actions”; an independent archive would fulfill the ethos of the Giftschränke—to preserve critical content while restricting access to it—but with less emphasis on creating a record of enforcement actions. Instead, this repository would aim to prevent the loss of critical human rights evidence to automated or human enforcement, even when the enforcement was not “erroneous” under a company’s community standards. In other words, a digital archive can accommodate a platform’s need to remove content under its policies or because of brand risk while ensuring that this material lives somewhere online.
Human rights organizations have also explored this topic within the context of compiling evidence of violations of international crimes. In a recent Human Rights Watch report discussing the erasure of content, often under the auspices of “terrorist content,” the organization called for “an independent mechanism to take on the role of liaising with social media platforms and preserving publicly posted content.” The report goes on to recommend that this mechanism “should then be responsible for sorting and granting access to the content for archival and investigative purposes in a manner that respects privacy and security concerns.”
Some may argue that platforms should build their own archives on their sites, but a shared model inspired by the Global Internet Forum to Counter Terrorism (GIFCT) database is necessary and can, from a technical perspective at least, be developed quickly for this purpose. In 2017, Facebook, Microsoft, Twitter, and YouTube helped establish the database, which consists primarily of images and videos belonging to the Islamic State and al-Qaeda. “Hashes,” or unique digital fingerprints of this content, are added to the database, and other companies can use this repository to find such images or videos on their platforms. GIFCT is now an independent institution and continues to maintain this database and oversee contributions from member companies.
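The hash-sharing mechanism described above can be illustrated with a minimal sketch. This is only a conceptual example: it uses a cryptographic hash for simplicity, whereas production systems rely on perceptual hashes (e.g., PDQ for images) that can match near-duplicates rather than only byte-identical files; the class and function names here are hypothetical, not GIFCT’s actual interface.

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """Return a hex digest serving as the content's 'hash' (fingerprint)."""
    return hashlib.sha256(content).hexdigest()

class SharedHashDatabase:
    """A toy shared repository: member platforms contribute hashes of
    flagged material, and any member can check uploads against it."""

    def __init__(self):
        self._entries = {}  # hash -> label, e.g., "MOD: restricted archive"

    def contribute(self, content: bytes, label: str) -> str:
        """Hash contributed content and store it with a descriptive label."""
        h = fingerprint(content)
        self._entries[h] = label
        return h

    def match(self, content: bytes):
        """Check an upload against the database; None means no match."""
        return self._entries.get(fingerprint(content))

# One platform contributes a hash; another platform's upload check matches it.
db = SharedHashDatabase()
db.contribute(b"<video bytes>", "MOD: archived, restricted access")
```

Because only hashes circulate, member platforms can detect re-uploads of archived material without redistributing the footage itself, which is what makes the model compatible with the restricted-access archive proposed here.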
To replicate the level of company cooperation and government support embodied in the GIFCT initiative, I propose establishing an independent institution and database dedicated to human rights content, including MOD imagery. With this approach, this critical content can be stored safely, without fear of accidental erasure by algorithms or content moderators and with strict oversight over who may access such footage; ultimately, the archive would serve a secondary purpose as a “digital graveyard,” out of respect for the families, friends, and communities of the person(s) whose last moments were preserved online in perpetuity. Content would be identified by a participating company, hashed, and then submitted to this independent database for labelling and archiving. Individuals could also submit material directly, an idea that has been championed in the Berkeley Protocol for open-source investigations. A shared database, with technical and financial aid from companies, could use hash-matching technology to collect MOD content across a variety of instances, from car accidents to suicides and from police brutality to war crimes. As an independent institution, it can also be shielded from corporate interference and develop admission criteria for the people and organizations permitted to review this material. This model can even incorporate certain principles illustrated in Part II, such as a public participation component to viewing MOD, with more generous admissions “criteria”; alternatively, if this database wants to be even more privacy-respecting of death, then entrance can be gated to institutions and individuals that can demonstrate a compelling reason for access.
Another model that could be implemented would involve family input. This option sits in the middle of the spectrum because it assumes that families will vary in their responses: some may believe that, despite the pain, it is critical for the broader public to watch this content, while others may prefer to refrain from public sharing. For example, after 13-year-old Adam Toledo’s murder, his family had the opportunity to view the footage of Toledo’s encounter with the police. In fact, the Civilian Office of Police Accountability (COPA) in Chicago “would not immediately release the video publicly at the request of the Toledo family.” According to a statement by COPA, although the family expressed “their desire to avoid public release of materials related to Adam’s tragic death,” the organization eventually had to release the video as part of city policy.
The idea of incorporating the family of the deceased into policy and enforcement decisions would center the needs of the family, even when other parties may want to view such footage. YouTube, for example, already has an intake form for videos that show someone dying or being critically injured, and this could be repurposed fairly easily for family members to provide information, including the enforcement approach (e.g., leave up, age-restrict, or remove) they would prefer companies to take. Yet this option, while normatively highly desirable, may be the most difficult to operationalize, for several reasons. First, it will require extensive communication to ensure that claims are not fraudulent. Second, not everyone may know of this feature or how to contact the company, which ultimately places a burden on the family at a time of grief. Third, family members may find themselves subject to pressure from multiple sides of a dispute, perhaps facing immense pressure to remove content or to leave it up. Finally, by centering family input above all else, companies may also find themselves enforcing inconsistently, a criticism that many of them have faced repeatedly over the years.
The least restrictive approach would be for companies to continue their enforcement approach as is, but with several key twists. First, MOD content would be reviewed under a platform’s community standards. Content that shows MOD but would otherwise not violate policies would be tagged by reviewers and re-enqueued for review after a set period of time (e.g., one week, two weeks, or two months from the initial flag). By remaining on the platform, the content gives the public, and other interested third parties, the chance to “participate” by watching the video. MOD footage that triggers public outcry or goes viral can then be protected from removal by moderators or company employees for an extended duration to continue to fuel productive conversation. Companies could then follow up with local teams, civil society organizations, researchers, or other third parties to help decide when it is time to pull viral MOD material.
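The tag-and-re-enqueue workflow described above can be sketched in a few lines. This is a hypothetical illustration under stated assumptions—the item structure, the two-week default interval, and the “protection” hold for viral content are all invented for the example—and not a description of any platform’s actual moderation pipeline.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class FlaggedItem:
    """Non-violating MOD content tagged for periodic re-review."""
    content_id: str
    next_review: datetime
    protected_until: Optional[datetime] = None  # hold for viral/newsworthy footage

def enqueue_for_rereview(content_id: str, now: datetime,
                         interval: timedelta = timedelta(weeks=2)) -> FlaggedItem:
    """Tag MOD content and schedule its next human review."""
    return FlaggedItem(content_id=content_id, next_review=now + interval)

def due_for_review(queue, now: datetime):
    """Return items whose review window has elapsed and whose protection
    period (if any) has expired; protected items stay up until the hold ends."""
    return [item for item in queue
            if item.next_review <= now
            and (item.protected_until is None or item.protected_until <= now)]
```

The key design point is that the protection hold only delays removal review rather than exempting the content: once the hold lapses, the item re-enters the ordinary queue where moderators, ideally with input from local teams and civil society, decide whether its window of usefulness has passed.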
This approach would continue to allow companies to make their own determinations about the newsworthiness of content but would offer some peace of mind to families and afflicted communities by assuring them that the footage will no longer be visible at some point in time. The theory is that the visual of the death serves a purpose only for a brief window of time: after a certain point, its galvanizing potential for legislative or political advocacy is either actualized or no meaningful change emerges. For example, Chauvin’s trial for George Floyd’s murder did not take place in April 2021 because the footage of Floyd’s murder was still public on YouTube; the video may have helped spark protests in the summer of 2020, but any organizing taking place now flows from collective memory and deep-rooted frustration with the state, the police, and the judicial system. After the MOD content is removed, it can be hashed and not only given, for example, to an independent archive but also used to block re-uploads of the material at a later date.
The Piece has laid out a normative claim for how MOD footage should be treated, drawing from an institution that has developed similar approaches for death penalty cases, before concluding with how social media companies can develop solutions for themselves. The painful truth is that while some may fear sanitizing the internet, pointing to MOD aggregation sites like LiveLeak or to the need for such material to prompt change, we have come to make a spectacle of death. And more importantly: how exactly do these videos help?
The trial of Derek Chauvin, the police officer charged with murdering George Floyd, captured the country’s attention. At the same time, Black communities and communities of color were forced to relive the graphic details of Floyd’s murder, all the while listening to lawyers attempt to (yet again) sow doubt about this recorded footage of Black death. Shortly before Chauvin’s guilty verdict was announced in Minnesota, Ma’Khia Bryant, a 16-year-old Black girl, was shot and killed by police, a painful reminder that Chauvin’s trial was at best the beginning of change and not the end. An individual might be held accountable through bystander or body camera footage on a case-by-case basis, but when we look at the mountain of work, and the reimagining of society, needed to bring this violence to an end, I cannot help but wonder: how many more YouTube martyrs do we need?
*Juris Doctor Candidate, Harvard Law School; previously Policy and Enforcement Manager at YouTube covering political extremism and graphic violence. Many thanks to the brilliant individuals for their encouragement, feedback, and comments throughout the development of this Piece: Evelyn Douek, Nama Khalil, Jeff Lazarus, Maroussia Lévesque, Nicolás Parra-Herrera, Juan Rivera, and Susannah Barton Tobin. Special thanks as well to the Harvard Human Rights Journal for providing editorial support. The ideas presented in this piece grew from my time at YouTube witnessing the final, intimate moments of “YouTube martyrs,” whose deaths are often at the hands of state violence. May they rest in peace and power. All errors are strictly my own.
 See, e.g., Sachin Kalbag, YouTube Martyr is Now Voice of Iran, India Today (June 24, 2009), https://www.indiatoday.in/latest-headlines/story/youtube-martyr-is-now-voice-of-iran-50694-2009-06-24; Jolyon Mitchell, Making and Mistaking Martyrs, Oxford Univ. Press Blog (Dec. 14, 2012), https://blog.oup.com/2012/12/making-and-mistaking-martyrs/.
 And while Soltan’s example might suggest that MOD content arises only in circumstances of state-imposed violence against citizens, that is most certainly not the case. MOD can also refer to suicide, violence by non-state actors, interpersonal conflict, accidents, and racially or gender-motivated violence, among many other (grisly) circumstances.
 See generally Anna Banchik, Disappearing Acts: Content Moderation and Emergent Practices to Preserve At-Risk Human Rights-Related Content, 18 New Media & Soc’y 410 (2020) (interviewing human rights organizations to understand how they conceptualize, and rely on, graphic violence footage for their work).
 See, e.g., Nigeria: Gruesome Footage Implicates Military in War Crimes, Amnesty Int’l (Aug. 5, 2014), https://www.amnesty.org/en/latest/news/2014/08/nigeria-gruesome-footage-implicates-military-war-crimes/.
 In an American context, this topic came up repeatedly during the racial justice protests in the summer of 2020. See, e.g., Karen Hao, How to Turn Filming the Police into the End of Police Brutality, MIT Tech. Rev. (June 10, 2020), https://www.technologyreview.com/2020/06/10/1002913/how-to-end-police-brutality-filming-witnessing-legislation/.
 See, e.g., Abdul Rahman Al Jaloud et al., Caught in the Net: The Impact of “Extremist” Speech Regulations on Human Rights Content 8 (2019), https://www.eff.org/files/2019/05/30/caught_in_the_net_whitepaper_2019.pdf.
 See Eric Barendt, Freedom of Speech 193 (2d ed. 2007).
 See, e.g., Alison Holman et al., Media’s Role in Broadcasting Acute Stress Following the Boston Marathon Bombings, 111 Proc. Nat’l Acad. Sci. 93 (2014) (finding that individuals who viewed six or more hours of footage from the Boston Marathon bombings experienced higher levels of stress than people who had “direct exposure” from being at the bombings themselves).
 See, e.g., Facebook and YouTube Moderators Sign PTSD Disclosures, BBC News (Jan. 25, 2020), https://www.bbc.com/news/technology-51245616.
 See supra note 2.
 See André Bazin, Death Every Afternoon, in Rites of Realism: Essays on Corporeal Cinema 28 (Ivone Margulies ed., 2003).
 Clay Calvert, The Privacy of Death: An Emergent Jurisprudence and Legal Rebuke to Media Exploitation and a Voyeuristic Culture, 26 Loy. L.A. Ent. L. Rev. 133, 135 (2005).
 541 U.S. 157 (2004).
 Id. at 160.
 Id. at 167; see also Calvert, supra note 12, at 135.
 Nat’l Archives & Records Admin., 541 U.S. at 167.
 Calvert, supra note 12, at 139.
 See 222 Fed. Appx. 755 (2007).
 Id. at 762.
 See id.
 The Spanish Constitution says the “right to honour, to personal and family privacy and to the own image is guaranteed.” Constitución Española (C.E.) (Constitution), B.O.E. n. 18.1, Oct. 31, 1978, as amended, Sept. 27, 2011. In addition, the Organic Law 1/1982, of May 5 also establishes civil protection of the right to honor, personal and family privacy and reputation. See Ramón Herrera de las Heras, The Rights of Freedom of Speech and Information Against the Right to Honor in Spain in Light of the Judgement of the European Court of Human Rights of October 10, 2013 in the Case Delfi v. Estonia, 12 Revista Internacional de Doctrina y Jurisprudencia 1, 7 (2016) (Spain).
 Jay Aronson, User-Generated Content in Human Rights Investigations, in New Technologies for Human Rights Law and Practice 129, 147 (Molly Land and Jay Aronson, eds., 2018).
 See id.
 Kate Klonick, The New Governors: The People, Rules, and Processes Governing Online Speech, 131 Harv. L. Rev. 1598, 1619 (2018).
 Id. at 1600.
 Hannah Kozlowska, #DearMark Letters Underline that Facebook Cares Only About The Crises it Creates in Wealthy Countries, QZ (May 23, 2018), https://qz.com/1284128/dearmark-letters-underline-that-facebook-cares-only-about-the-crises-it-creates-in-wealthy-countries/.
 Ryan Watson, In the Wakes of Rodney King: Militant Evidence and Media Activism in the Age of Viral Black Death, 4 The Velvet Light Trap 34, 42 (2019).
 Linda Kay et al., Help or Harm? Symbolic Violence, Secondary Trauma, and the Impact of Press Coverage on a Community, 4 Journalism Practice 421, 421 (2010).
 Id. at 431.
 Kenya Downs, When Black Death Goes Viral, It Can Trigger PTSD-Like Trauma, PBS News Hour (July 22, 2016), https://www.pbs.org/newshour/nation/black-pain-gone-viral-racism-graphic-videos-can-create-ptsd-like-trauma.
 Zeynep Tüfekçi, Twitter and the Anti-Playstation Effect on War Coverage, Technosociology (Mar. 28, 2011), http://technosociology.org/?p=393.
 See, e.g., Kimberly Fain, Viral Black Death: Why We Must Watch Citizen Videos of Police Violence, JSTOR Daily (Sept. 1, 2016), https://daily.jstor.org/why-we-must-watch-citizen-videos-of-police-violence/.
 Susan Sontag, Regarding the Pain of Others 13 (2003).
 Jennifer Malkowski, Streaming Death: The Politics of Dying on YouTube, Jump Cut: A Review of Contemporary Media, http://www.ejumpcut.org/archive/jc54.2012/malkowskiYoutubeDeaths/2.html.
 See Armond Towns, That Camera Won’t Save You! The Spectacular Consumption of Police Violence, 5 Present Tense 1, 1 (2015) (arguing that footage of police violence against Black people “normalizes antiblack violence as spectacle.”); see also Allissa Richardson, Why Cellphone Videos of Black People’s Deaths Should be Considered Sacred, Like Lynching Photographs, The Conversation (May 28, 2020), https://theconversation.com/why-cellphone-videos-of-black-peoples-deaths-should-be-considered-sacred-like-lynching-photographs-139252; Allissa Richardson, We Have Enough Proof, Vox (Apr. 21, 2021, 11:00 AM); Alexandra Juhasz, How Do I (Not) Look? Live Feed Video and Viral Black Death, JSTOR Daily (July 20, 2016), https://daily.jstor.org/how-do-i-not-look/.
 Safiya Umoja Noble, Critical Surveillance Literacy in Social Media: Interrogating Black Death and Dying Online, 9 Black Camera 147, 150 (2018).
 Reha Kansara, Black Lives Matter: Can Viral Videos Stop Police Brutality?, BBC News (July 6, 2020), https://www.bbc.com/news/blogs-trending-53239123.
 John Eligon & Shawn Hubler, Throughout Trial Over George Floyd’s Death, Killings by Police Mount, N.Y. Times (Apr. 17, 2021), https://www.nytimes.com/2021/04/17/us/police-shootings-killings.html.
 See, e.g., Violence: Advertiser-Friendly Content Guidelines, YouTube Help, https://support.google.com/youtube/answer/6162278?hl=en (last visited Apr. 20, 2021).
 Tonia Sutherland, Making a Killing: On Race, Ritual, and (Re)Membering in Digital Culture, 46 Preservation, Digital Tech. & Culture 32, 34 (2017).
 Facts About the Death Penalty, Death Penalty Information Center (Mar. 24, 2021), https://documents.deathpenaltyinfo.org/pdf/FactSheet.pdf.
 For example, Connecticut has not executed an inmate since 2006, Illinois not since 2000, and Indiana not since 2010. See Executions by State and Year, Death Penalty Info. Ctr., https://deathpenaltyinfo.org/executions/executions-overview/executions-by-state-and-year (last visited Nov. 9, 2021).
 Using state laws as a case study by no means implies any type of agreement with, or approval of, the act of state-imposed executions, the death penalty, or the American criminal justice system more broadly.
 Facts About the Death Penalty, Death Penalty Information Center (Mar. 24, 2021), https://documents.deathpenaltyinfo.org/pdf/FactSheet.pdf.
 John Bessler, Televised Executions and the Constitution: Recognizing a First Amendment Right of Access to State Executions, 45 Fed. Comm. L. J. 355, 359-60 (1993).
 Id. at 361.
 Id. at 362.
 Id. at 363.
 See generally Holden v. State of Minnesota, 137 U.S. 483 (1890) (holding that a Minnesota law that only allowed executions within the walls of a prison and restricted the number of witnesses to a prisoner’s hanging was permissible).
 California First Amend. Coal. v. Woodford, 299 F.3d 868, 873 (9th Cir. 2002).
 Id. at 876.
 See, e.g., Guardian News & Media LLC v. Ryan, 225 F. Supp. 3d 859, 870 (D. Ariz. 2016) (“Defendants [Arizona Department of Corrections] are permanently enjoined from conducting lethal injection executions without providing a means for witnesses to be aware of the administration(s) of lethal drugs . . . .”).
 First Amend. Coal. of Arizona v. Ryan, 938 F.3d 1069, 1075 (9th Cir. 2019) (finding unconstitutional a prison practice of turning off microphones, while still allowing witnesses to watch, after lethal drugs were administered to a prisoner).
 Arthur v. Alabama Dep’t of Corrections, 680 Fed. Appx. 894, 913 (11th Cir. 2017).
 See, e.g., Lecroy v. United States, 975 F.3d 1192, 1197 (11th Cir. 2020) (“As an initial matter, we reiterate our ‘consistent [holding] that there is no federal constitutional right to counsel in postconviction proceedings.’ Barbour v. Haley, 471 F.3d 1222, 1227 (11th Cir. 2006). Nor (for better or worse) does the Constitution guarantee a condemned inmate the right to have his lawyer present at his execution.”).
 See, e.g., McGehee v. Hutchinson, 463 F. Supp. 3d 870 (E.D. Ark. 2020) (concluding that state rules allowing only one witness, who could “witness” only via telephone connection, were unconstitutional).
 See, e.g., La. Rev. Stat. § 15:570 (2014).
 See, e.g., Ga. Code § 17-10-41 (2018).
 For example, the Missouri statute says that the defendant can name up to 5 “relatives or friends” as well as up to 2 “clergy or religious leaders.” Mo. Rev. Stat. § 546.740 (1995).
 Witnesses to an Execution, Tennessee Department of Correction, https://www.tn.gov/correction/statistics-and-information/executions/witnesses-to-an-execution.html (last visited Apr. 15, 2021).
 Wyo. Stat. § 7-13-908 (2016).
 Ind. Code § 35-38-6-6 (1995).
 See Mo. Rev. Stat., supra note 65.
 See Wyo. Stat., supra note 67.
 Pa. Cons. Stat. § 4305(a).
 Fla. Stat. § 922.11 (2020).
 Ark. Code Ann. § 16-90-502 (2010).
 Va. Code Ann. § 53.1-234 (2020).
 Matthew Haag, Arkansas Struggles to Find Enough People to Watch Executions, N.Y. Times (Mar. 25, 2017), https://www.nytimes.com/2017/03/25/us/arkansas-death-penalty-witnesses.html.
 Gareth Evans, The Americans Volunteering to Watch Executions, BBC News (Apr. 11, 2017), https://www.bbc.com/news/world-us-canada-39535957.
 Of course, this is not an exhaustive list of proposals but rather a few options that embrace some of the principles and ideas with which Part II engages. Companies may say that they are already proportionally restricting access to MOD content through measures like age-gating videos or posts. For the purposes of the Piece, I do not consider this to be an effective proportional response since, among other issues, millions of users lie about their age to access social media sites. See, e.g., Mark Sweney, More than 80% of Children Lie About Their Age to Use Sites Like Facebook, The Guardian (July 25, 2013), https://www.theguardian.com/media/2013/jul/26/children-lie-age-facebook-asa; see also Mary Aiken, The Kids Who Lie About Their Age to Join Facebook, The Atlantic (Aug. 30, 2016), https://www.theatlantic.com/technology/archive/2016/08/the-social-media-invisibles/497729/.
 See John Bowers, Elaine Sedenberg & Jonathan Zittrain, Platform Accountability Through Digital “Poison Cabinets” 2 (2021), https://s3.amazonaws.com/kfai-documents/documents/5fc1f603d8/4.12.2021-Bowers-2.pdf.
 Id. at 18.
 See, e.g., Social-Media Platforms are Destroying Evidence of War Crimes, The Economist (Sept. 21, 2020), https://www.economist.com/international/2020/09/21/social-media-platforms-are-destroying-evidence-of-war-crimes.
 Human Rights Watch, “Video Unavailable”: Social Media Platforms Remove Evidence of War Crimes 35 (2020), https://www.hrw.org/sites/default/files/media_2020/09/crisis_conflict0920_web_0.pdf.
 Recommending the technical approach behind the GIFCT does not, however, mean that the GIFCT governance structure should be fully replicated. From a governance point of view, a number of civil society organizations have criticized the GIFCT’s relationship to civil society and the organization’s Independent Advisory Council. See, e.g., Joint Letter to New Executive Director, Global Internet Forum to Combat Terrorism, Human Rights Watch (July 30, 2020, 1:00 PM), https://www.hrw.org/news/2020/07/30/joint-letter-new-executive-director-global-internet-forum-counter-terrorism.
 Elizabeth Culliford, Facebook and Tech Giants to Target Attacker Manifestos, Far-Right Militias in Database, Reuters (July 26, 2021, 3:38 PM), https://www.reuters.com/technology/exclusive-facebook-tech-giants-target-manifestos-militias-database-2021-07-26/.
 See United Nations Human Rights Office of the High Commissioner, Berkeley Protocol on Digital Open Source Investigations 86 (2020), https://www.ohchr.org/Documents/Publications/OHCHR_BerkeleyProtocol.pdf.
 Annie Sweeney & Jeremy Gorner, Family of 13-Year-Old Adam Toledo Views Video of His Fatal Shooting by Chicago Police Officer as Authorities Wait to Release It Publicly, Chi. Trib. (Apr. 13, 2021), https://www.chicagotribune.com/news/criminal-justice/ct-chicago-police-adam-toledo-family-views-video-20210414-npf5nzr7iff3tkyvmqsmjromve-story.html.
 The form also states that YouTube reviews “each request, but please note that we take public interest and newsworthiness into account when determining if content will be removed.” Moment of Death Content Removal, YouTube Help, https://support.google.com/youtube/contact/momentdeath.
 See, e.g., Louise Matsakis, YouTube Doesn’t Know Where Its Own Line Is, Wired (Mar. 2, 2018), https://www.wired.com/story/youtube-content-moderation-inconsistent/. This type of decentralized, user-driven moderation approach could also be explored with random users making these decisions, but it would also open the door to inconsistencies, particularly when it comes to the public’s long-standing indecision on violent footage online and what “value” people place on it. See, e.g., Post No Evil, Radiolab (Aug. 17, 2018), https://www.wnycstudios.org/podcasts/radiolab/articles/post-no-evil (discussing public inconsistencies when outcry led Facebook both to reinstate videos of the Boston Marathon terrorist attacks and to reverse its decision to leave up a beheading video showing the brutality of cartel violence in Mexico).
 By tagging I refer to the process by which reviewers “tag” content as approved, age-restricted, demonetized, or removed, with each removal reason (e.g., sexual content, violence, hate speech) having its own tag type.
 See, e.g., Sara Morrison, Questions to Ask Yourself Before Sharing Images of Police Brutality, Vox (June 11, 2020, 9:00 AM), https://www.vox.com/recode/2020/6/11/21281028/before-sharing-images-police-brutality-protest-george-floyd-ahmaud-arbery-facebook-instagram-twitter; Melanye Price, Please Stop Showing the Video of George Floyd’s Death, N.Y. Times (June 3, 2020), https://www.nytimes.com/2020/06/03/opinion/george-floyd-video-social-media.html.
 Neil Vigdor & Bryan Pietsch, Teenage Girl Is Fatally Shot by Police in Columbus, Officials Say, N.Y. Times (Apr. 20, 2021), https://www.nytimes.com/2021/04/20/us/columbus-ohio-shooting.html.