Deep learning in healthcare: Deep 6 AI and MIT demo uses
The subset of AI known as deep learning, which uses artificial neural networks to let machines learn from data, has myriad uses. From the image recognition used in autonomous vehicles to the natural language processing used to turn human speech into machine instructions, the technology has the potential to change our lives. Nowhere is this more evident than in the healthcare sector.
A newly released paper in the journal Cell outlines the discovery of a powerful antibiotic known as Halicin through deep learning. The scientists involved trained a neural network to predict molecules with “antibacterial activity”, eventually identifying Halicin, which is structurally distinct from conventional antibiotics. A further eight antibacterial compounds were also discovered.
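The approach can be illustrated in miniature: a model is trained on compounds labelled by whether they inhibit bacterial growth, then used to rank an unscreened library by predicted activity. The sketch below is a deliberately simplified stand-in, not the paper’s method (which used a far more sophisticated neural network over molecular structures): it uses made-up binary “fingerprints” and a single-layer logistic model, with activity labels derived from a hidden rule so the ranking step has something to learn.

```python
import math
import random

random.seed(0)

FP_BITS = 64  # toy fingerprint length; real molecular fingerprints are much larger

def random_fingerprint():
    return [random.randint(0, 1) for _ in range(FP_BITS)]

# Toy training set: label a compound "active" (1) when a hidden structural
# motif is present -- here, at least two of the first three bits set.
def label(fp):
    return 1 if fp[0] + fp[1] + fp[2] >= 2 else 0

train = [(fp, label(fp)) for fp in (random_fingerprint() for _ in range(200))]

# Single-layer logistic model trained with plain gradient descent.
weights = [0.0] * FP_BITS
bias = 0.0
lr = 0.1

def predict(fp):
    z = bias + sum(w * x for w, x in zip(weights, fp))
    return 1.0 / (1.0 + math.exp(-z))  # predicted probability of activity

for _ in range(50):  # epochs
    for fp, y in train:
        err = predict(fp) - y
        bias -= lr * err
        for i, x in enumerate(fp):
            weights[i] -= lr * err * x

# "Virtual screening": rank unseen compounds by predicted activity
# and surface the most promising candidates for lab testing.
library = [(f"cmpd_{i}", random_fingerprint()) for i in range(20)]
ranked = sorted(library, key=lambda m: predict(m[1]), reverse=True)
print([name for name, _ in ranked[:3]])
```

The key idea carries over to the real work: the model is cheap to evaluate, so millions of molecules can be scored computationally and only the top-ranked handful need expensive laboratory validation.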
A separate but related development comes from Deep 6 AI, a member of the Nvidia Inception accelerator program for AI startups. The company makes use of Nvidia’s GPUs, whose highly parallel architecture makes them well suited to machine learning workloads.
Deep 6 AI focuses on new methods of screening patients for clinical trials as medicine becomes increasingly targeted. “In the age of precision medicine, clinical trial criteria are getting more challenging,” CEO Wout Brusselaers said. “When developing a drug that is targeting patients with a rare genetic mutation, you have to be able to find those specific patients.”
Deep 6’s method involves processing medical records to identify and then label whatever criteria are relevant for the trial. Using natural language processing to gather relevant mentions, Deep 6 can also analyse unstructured data for additional information before creating a graph for doctors to peruse. The process has led to over 100,000 matches so far.
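At its simplest, this kind of pipeline extracts trial-relevant mentions from free-text notes and links patients to the criteria they satisfy. The sketch below is an illustrative toy, not Deep 6’s system: the patient IDs, notes, and regex-based “criteria” are all invented, and real clinical NLP must also handle negation, abbreviations, and context that plain pattern matching misses.

```python
import re

# Hypothetical unstructured clinical notes, keyed by made-up patient IDs.
notes = {
    "pt-001": "58 y/o male. EGFR exon 19 deletion confirmed. On metformin for T2DM.",
    "pt-002": "Patient denies smoking history. No known genetic mutations.",
    "pt-003": "EGFR mutation positive; started osimertinib last month.",
}

# Trial criteria expressed as labelled patterns (illustrative only).
criteria = {
    "egfr_mutation": re.compile(r"\bEGFR\b.*(mutation|deletion)", re.I),
    "diabetes": re.compile(r"\b(T2DM|diabetes)\b", re.I),
}

# Build a simple patient->criteria graph as an adjacency mapping,
# labelling each patient with every criterion found in their notes.
graph = {
    pid: [name for name, pattern in criteria.items() if pattern.search(text)]
    for pid, text in notes.items()
}

# Match patients who meet the trial's required criterion.
matches = [pid for pid, hits in graph.items() if "egfr_mutation" in hits]
print(graph)
print(matches)
```

The graph representation is what makes the result easy for doctors to peruse: each patient node carries the criteria found as evidence, rather than an opaque yes/no match.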
Discord buys Sentropy to fight against hate and abuse online
Discord, a popular chat app, has acquired the software company Sentropy to bolster its efforts to combat online abuse and harassment. Sentropy monitors online networks for abuse and harassment, then offers users a way to block problematic people and filter out messages they don’t want to see.
First launched in 2015 and currently boasting 150 million monthly active users, Discord plans to integrate Sentropy’s products into its existing toolkit and will also bring the smaller company’s leadership team aboard. Discord currently takes a “multilevel” approach to moderation; its Trust and Safety (T&S) team, dedicated to protecting users and shaping content moderation policies, made up 15% of Discord’s workforce as of May 2020.
“T&S tech and processes should not be used as a competitive advantage,” Sentropy CEO John Redgrave said in a blog post on the announcement. “We all deserve digital and physical safety, and moderators deserve better tooling to help them do one of the hardest jobs online more effectively and with fewer harmful impacts.”
Cleanse platforms of online harassment and abuse
Redgrave elaborated on the company’s natural connection with Discord: “Discord represents the next generation of social companies — a generation where users are not the product to be sold, but the engine of connectivity, creativity, and growth. In this model, user privacy and user safety are essential product features, not an afterthought. The success of this model depends upon building next-generation Trust and Safety into every product. We don’t take this responsibility lightly and are humbled to work at the scale of Discord and with Discord’s resources to increase the depth of our impact.”
Sentropy launched out of stealth last summer with an AI system designed to detect, track and cleanse platforms of online harassment and abuse. The company emerged then with $13 million in funding from notable backers including Reddit co-founder Alexis Ohanian and his VC firm Initialized Capital, King River Capital, Horizons Ventures and Playground Global.
“We are excited to help Discord decide how we can most effectively share with the rest of the Internet the best practices, technology, and tools that we’ve developed to protect our own communities,” Redgrave said.