The viewing of child exploitation material found on seized mobile phones and computers during abuse investigations takes an incredible toll on police officers. Officers have spoken out about the trauma caused by viewing such content, an issue compounded by not being able to talk about it with anybody. Rates of post-traumatic stress disorder (PTSD) are far higher among police officers than the general population, in part due to having to confront such imagery.

To help protect investigators from such material, the Australian Federal Police (AFP) is working with Monash University to develop machine learning algorithms that can identify and classify child exploitation material on seized devices before it is reviewed.

The initiative will help officers “scan through thousands of confronting images and files faster with lower levels of emotional distress,” Monash said.

Over time the software will be extended to cover content from terrorism cases, which can also cause significant psychological distress for investigators.

“The automated detection of abhorrent material enhances workplace safety by going some way towards reducing the incidental and inadvertent exposure to such material by law enforcement practitioners,” said Dr Janis Dalins, a federal agent and co-director of Monash’s new Artificial Intelligence for Law Enforcement and Community Safety (AiLECS) Lab.

“The ultimate goal of this initiative is to ethically research the use of machine learning and data analytics in advancing law enforcement and community safety,” he said.

A prototype system was created and first described last year by Dalins and researchers at Data61.
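The broad shape of such a triage system can be illustrated with a small sketch. Everything below is hypothetical: the scoring function stands in for a trained classifier (such as one built with TensorFlow), and the threshold and file names are invented for illustration, not drawn from the AFP's actual system.

```python
# Illustrative sketch of a 'forensic triage' step: a trained classifier
# assigns each image a probability of containing harmful material, and
# only files above a threshold are queued for human review, highest
# score first, so investigators confront as few files as possible.
from typing import Callable, Dict, List, Tuple

def triage(
    files: List[str],
    scorer: Callable[[str], float],
    threshold: float = 0.8,
) -> Dict[str, List[Tuple[str, float]]]:
    """Partition files into a priority review queue and a low-risk pile."""
    scored = [(path, scorer(path)) for path in files]
    flagged = sorted(
        (item for item in scored if item[1] >= threshold),
        key=lambda item: item[1],
        reverse=True,  # strongest hits surface first
    )
    cleared = [item for item in scored if item[1] < threshold]
    return {"flagged": flagged, "cleared": cleared}

# Fake scores standing in for model output on four seized files.
fake_scores = {"a.jpg": 0.95, "b.jpg": 0.10, "c.jpg": 0.85, "d.jpg": 0.40}
result = triage(list(fake_scores), fake_scores.get)
print([path for path, _ in result["flagged"]])  # ['a.jpg', 'c.jpg']
```

In this arrangement only the two high-scoring files would reach a human reviewer, which is the sense in which the tool acts as an early warning system rather than a final classifier.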
An image classifier was built using open source machine learning frameworks including Google’s TensorFlow.

Although limited in its ability to classify the severity of the child abuse material, it was found to be effective as a ‘forensic triage’ or early warning system for investigators.

Data Airlock

The system will be further developed within Monash University’s new AiLECS Lab, which launched today. The lab is supported by $2.5 million in funding and is part of Monash’s wider ‘Data Futures’ initiative, which focuses on research into uses of artificial intelligence and data science for social good.

Through the lab, the AFP will also make real-world data available to researchers via a ‘Data Airlock’.

“This is a service designed to manage legal and ethical restrictions in the field by providing trusted research and industry partners with indirect access to offensive materials on which they can develop and test deep learning-based tools,” Dalins said of the Data Airlock in a blog post last year.

The airlock – built by the AFP and CSIRO’s Data61 and hosted at Monash – enables researchers globally to develop and test machine learning algorithms “without being exposed to confronting data”.

AFP Commissioner Andrew Colvin welcomed the launch of the AiLECS Lab.

“This is a groundbreaking initiative from Monash University and the AFP that will minimise AFP officer exposure to child exploitation material and other distressing content.
At the same time, it will vastly increase the speed and volume at which police can identify and classify this content,” Colvin said.

“The AiLECS Lab will therefore ensure we hold more people accountable for these abhorrent crimes and, just as importantly, we better safeguard the wellbeing of both AFP officers and the community we are here to serve,” he added.

Similar work is being undertaken by the Metropolitan Police’s forensics department in the UK.

Google last year announced it was making its Content Safety API available to NGOs and industry partners, a toolkit to “increase the capacity to review content in a way that requires fewer people to be exposed to it”.

Facebook has hired additional human moderators to tackle content violating the company’s community standards. The moderators have described the huge emotional toll the work takes on them.
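The ‘indirect access’ model Dalins describes for the Data Airlock can be sketched in a few lines: a researcher submits a classifier, it runs inside the secure environment against the sensitive data, and only aggregate results leave. This is an invented illustration of the pattern, not the actual AFP/Data61 implementation.

```python
# Toy illustration of an 'airlock' pattern: researchers never see the
# hidden records; they submit a classifier, and only aggregate metrics
# leave the secure environment. Names and data are invented.
from typing import Callable, Dict, List, Tuple

class DataAirlock:
    def __init__(self, hidden_data: List[Tuple[str, bool]]):
        # (record, true_label) pairs stay inside; nothing exposes them.
        self._hidden = hidden_data

    def evaluate(self, classifier: Callable[[str], bool]) -> Dict[str, float]:
        """Run a submitted classifier inside; return only aggregates."""
        hits = sum(classifier(rec) == label for rec, label in self._hidden)
        return {"accuracy": hits / len(self._hidden)}

airlock = DataAirlock([("img1", True), ("img2", False), ("img3", True)])
report = airlock.evaluate(lambda record: record != "img2")
print(report)  # {'accuracy': 1.0}
```

The point of the design is that the classifier's author only ever receives summary metrics, which is how researchers can iterate on models “without being exposed to confronting data”.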