On April 8, 2019, the British Department for Digital, Culture, Media & Sport released the “Online Harms White Paper,” which generated considerable media attention. The attention is deserved, and it is worthwhile going through the 102-page document. There is a 12-week consultation period, and many stakeholders will likely submit comments.

The approach taken by the White Paper is probably too prescriptive from an innovation standpoint, despite the pro-innovation language in the document. But it is certainly a sign of the times. Whatever the UK ultimately decides to do with respect to online harms content moderation, UK regulation should not be applied extraterritorially.

A new ‘duty of care’ for companies is the core of the white paper

The discussion of the “duty of care” that the UK intends to impose on companies, found in Chapter 7, is perhaps the heart and soul of the document. The duty of care would apply to virtually any website that allows some user interaction or discovery of user-generated content. The UK authorities have set themselves a very ambitious task; the authors themselves say that prioritization will be needed, at least initially.

The Department includes, under the future regulator’s remit, an obligation to promote innovation as well as protection from online harm. That, plus the emphasis on proportionate regulation to benefit SMEs, is welcome. Still, the document places so much emphasis on prescriptive regulation in so many different areas that it is hard to see how this approach would not have an impact on innovation.

Content removal the focus for how to implement the duty of care

The focus of the White Paper is on developing new regulation compelling companies to be more proactive in removing harmful content without changing the liability regime, which in the UK will still be governed by the E-Commerce Directive.
In 2016, SIIA wrote that in an age of terrorism, tech companies have the following responsibilities (though, in the U.S. context, no legal obligation): take-down responsibilities; countervailing responsibilities to foster free speech and association; and affirmative responsibilities to take steps to counter violent extremism.

Clearly, the UK authorities want companies to be more proactive and quicker with respect to take-downs. That is certainly a legitimate goal, although the discussion about proposed EU monitoring obligations suggests that the broad and general proactive monitoring obligations contemplated by the White Paper may not be entirely consistent with the E-Commerce Directive.

It would also be worthwhile for the UK authorities to consider how to incentivize the more affirmative responsibilities. As many observers have noted, a great deal of speech is offensive but cannot be eliminated because of freedom of expression law and values. The White Paper says that regulation will be consistent with freedom of expression. If that is to happen, the more affirmative responsibilities are important in reaching that goal.

Broad list of harms considered

One useful thing the White Paper does is provide a list of exactly what harms it proposes to regulate, on page 31 of the document, although the Department is clear that the list is not static.
Moreover, the harms are not individually defined, so companies will be relying on the regulator’s interpretation of exactly what each harm is.

“Harms with a clear definition” include child sexual exploitation and abuse; terrorist content and activity; organized immigration crime; modern slavery; extreme pornography; revenge pornography; harassment and cyberstalking; hate crime; encouraging or assisting suicide; incitement of violence; sale of illegal goods/services, such as drugs and weapons (on the open Internet); content illegally uploaded from prisons; and sexting of indecent images by under-18s.

“Harms with a less clear definition” include cyberbullying and trolling; extremist content and activity; coercive behavior; intimidation; disinformation; violent content; advocacy of self-harm; and promotion of female genital mutilation.

“Underage exposure to legal content” includes children accessing pornography, children accessing inappropriate material (including under-13s using social media and under-18s using dating apps), and excessive screen time.

Harms not under consideration in the white paper

The White Paper says that harms to organizations, such as those relating to competition, intellectual property rights, and organizational responses to fraudulent activity, are not covered by the White Paper. With respect to individuals, separate legislation deals with harms stemming from breaches of data protection. Cybersecurity harms and harms suffered by individuals “on the dark web rather than the open internet” are also excluded.

‘Duty of care’ the core of the white paper

Observers have rightfully emphasized this aspect of the White Paper. The Paper’s Chapter 7 spells out what a duty of care might look like, both generally and specifically with respect to different kinds of harm.
To begin with, any company providing “services or tools that allow, enable or facilitate users to share or discover user-generated content, or interact with each other online” falls within the scope of the regulation contemplated by the White Paper.

In this context, social media platforms, file hosting sites, public discussion forums, messaging services and search engines are covered by the proposed regulation. Retail websites with product reviews and/or user comments would also be covered. As a general matter, such companies should do the following to comply with their duty of care (more specific requirements are listed later in Chapter 7):

- Ensure their relevant terms and conditions meet standards set by the regulator and reflect the codes of practice as appropriate.
- Enforce their own relevant terms and conditions effectively and consistently.
- Prevent known terrorist or CSEA content from being made available to users.
- Take prompt, transparent and effective action following user reporting.
- Support law enforcement investigations to bring criminals who break the law online to justice.
- Direct users who have suffered harm to support.
- Regularly review their efforts in tackling harm and adapt their internal processes to drive continuous improvement.

In any event, companies should already be reviewing regularly what they are doing with respect to the actions listed above, and considering where they could conceivably do more. It seems worthwhile, for instance, to strengthen the departments that deal with users who report harm, and to staff those departments with individuals who can speak empathetically with users.

The white paper lists types of harms but does not explain how to determine if content is harmful

The Department produced a lengthy list (at least 24 types) of different kinds of online harms. However, there is no indication of how to benchmark content and establish whether it is harmful.
For example, the line between acceptable political speech and unacceptable intimidation of public figures is not clear. If a new regulator is established, as the White Paper calls for, it will surely have to prioritize the most serious harms, such as child sexual exploitation and terrorist content and activity.

Upcoming codes of practice likely to be prescriptive

One of the themes of the White Paper is that, in the view of its authors, the era of tech company self-regulation is over. So UK authorities will be looking for a new regulator to set out detailed, prescriptive “codes of practice” for companies. It is worth recalling, though, that no matter how detailed and prescriptive the codes of practice turn out to be, companies will still be making judgment calls. For example, it is probably never going to be clear what the difference is between “extreme” pornography and acceptable pornography.

So, especially if the UK decides to make company officials personally liable for online harms, companies are likely to lean toward taking down possibly objectionable material rather than investigating more thoroughly whether freedom of expression concerns/values might override a possible online harm. That might be a trade-off UK authorities consider worth making, but it is a trade-off.

Continued tech sector innovation welcomed in the white paper

There is a good discussion in the White Paper about the continued importance of promoting innovation in the tech sector while at the same time curbing online harms. That is welcome, because the reality is that tech will continue to be a source of growth in the UK and other advanced industrial economies. It is also simply a driver of new products and services that consumers are quick to take advantage of.
UK regulators have been very creative, particularly in the financial space, for instance through “regulatory sandboxes,” which the White Paper supports. Hopefully that creativity will be demonstrated in the tech sector as well.

Whatever ultimately emerges, UK regulation should not be applied extraterritorially

The online harms space is so vast that countries will likely have different views regarding what is harmful and what is not. Perhaps most importantly, countries will strike different balances between acceptable and unacceptable content. For example, the US First Amendment makes it difficult to imagine that the kind of broad monitoring and duty of care approach envisaged by the White Paper could be implemented in the United States.

So, as has been noted in the EU right-to-be-forgotten context, it will be important to ensure that UK regulations do not affect what consumers/users in other jurisdictions can or cannot do. This also reinforces the view that geolocation tools should continue to be permitted, so that companies can comply with different content rules in different countries.