Spirit AI, a leader in Artificial Intelligence, Natural Language Science, and Machine Learning, announced significant updates to its customer intelligence tool, Ally. Designed to keep online forums healthier, Ally allows Customer Service and Data Science teams to understand the general tenor of an online community and predict problems before they escalate.

Ally considers context, nuance, and the relationships between users rather than simply seeking and blocking keywords. The software uses Natural Language Understanding and AI to identify the intent of a message, then analyzes user behavior and reactions to determine its impact. Ally intelligently helps Customer Service teams identify users and user behaviors, and provides data and customizable tools to improve the overall state of online play and forums.
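As a rough illustration of this general approach (not Spirit AI's proprietary implementation), a context-aware system might blend a model's intent score with the community's observed reaction instead of matching a keyword blocklist. All names, weights, and fields below are hypothetical:

```python
# Illustrative sketch only: combine a message's model-predicted intent with
# how the community actually reacted, rather than keyword blocking.
from dataclasses import dataclass

@dataclass
class Message:
    text: str
    intent_score: float      # 0.0 (benign) .. 1.0 (hostile), from an NLU model
    reports: int             # how many users reported the message
    replies_negative: int    # negative reactions from other users

def impact_score(msg: Message) -> float:
    """Blend predicted intent with observed community reaction."""
    reaction = min(1.0, msg.reports * 0.2 + msg.replies_negative * 0.1)
    # A hostile-sounding message that drew no reaction scores lower
    # than one that drew reports from other users.
    return 0.6 * msg.intent_score + 0.4 * reaction

def needs_review(msg: Message, threshold: float = 0.5) -> bool:
    return impact_score(msg) >= threshold

smack_talk = Message("get rekt", intent_score=0.3, reports=0, replies_negative=0)
harassment = Message("...", intent_score=0.9, reports=3, replies_negative=4)

print(needs_review(smack_talk))  # False: trash talk alone stays below threshold
print(needs_review(harassment))  # True: hostile intent plus community pushback
```

The point of the sketch is the design choice the product describes: intent and impact are scored separately, so context (how other users react) can distinguish smack talk from harassment.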

“Online harassment has increasingly become an issue, both in-game and in other online forums, but it’s a challenging problem to solve. Not all communities have the same culture, rules, and norms, so one-size-fits-all solutions aren’t the best answer,” said Mitu Khandaker, Creative Partnerships Director at Spirit AI. “It’s important to be able to consider the context; to be able to discern harassment from smack talk, in order to foster safe, healthy communities. Ally looks beyond language to context and behavior to provide intelligence to help head off trouble.”

Spirit AI recently introduced a host of new features, some of which were unveiled at this year’s Game Developers Conference, and as part of the new Fair Play Alliance Summit. Product improvements include new Smart Filters, an updated and customizable web-based front end and intuitive node-based interface, and GDPR compliance.


> Smart Filters – Using proprietary machine learning technology, Ally groups language classifications such as profanity and sexual content in ways that avoid the common pitfalls of current technologies. Rather than taking a redactive, ‘censoring’ approach, Ally learns game-specific slurs to create robust word groupings that are easily identified as helpful or harmful.

  • Natural Language Classifiers and Sentiment Analysis – Ally looks at words in real time and in context to detect upset users.
  • Bot Detection – Because Ally’s system looks at every message, it can find patterns in language that reveal the bots and bad actors targeting your community.
  • Case File – Detected behavior is permanently stored so a Community Manager can review an actor’s case file and identify repeat behaviors.
  • User Insight – Ally can reveal how your users are interacting with each other and highlight the bad apple before the barrel is spoilt.
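To make the bot-detection idea above concrete, here is a minimal sketch of how scanning every message can surface repetition patterns; this is an assumed, simplified illustration, not Ally's actual detector, and all names in it are hypothetical:

```python
# Illustrative sketch only: flag accounts that repeat near-identical
# messages across a feed, a common bot/spam signature.
from collections import Counter

def normalize(text: str) -> str:
    # Collapse case and whitespace so trivial variations still match.
    return " ".join(text.lower().split())

def likely_bots(messages: list[tuple[str, str]], min_repeats: int = 3) -> set[str]:
    """messages: (user, text) pairs; returns users repeating a message >= min_repeats times."""
    counts = Counter((user, normalize(text)) for user, text in messages)
    return {user for (user, _), n in counts.items() if n >= min_repeats}

feed = [
    ("spam_bot", "Buy gold at example.site!"),
    ("spam_bot", "buy gold at   example.site!"),
    ("spam_bot", "BUY GOLD AT example.site!"),
    ("player1", "gg everyone"),
]
print(likely_bots(feed))  # {'spam_bot'}
```

Seeing every message is what makes this possible: a sampled feed could miss two of the three repeats and never cross the threshold.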

> User Interface – Ally’s UI allows customer intelligence teams to monitor behavior across the community and automatically highlight repeat offenders. Customer service teams can rapidly identify both positive and negative behaviors and either promote or demote their users immediately.

> GDPR – Ally is GDPR compliant to better protect user privacy and data.