Data science has come a long way, and its applications now reach well beyond the tech industry. Advances in data technology like keyword scanning and Artificial Intelligence (AI) are allowing developers to build increasingly capable self-harm monitoring technology for school student safety initiatives, among other practical applications that help schools keep kids safe both online and off.
While self-harming behavior is not the same as attempting suicide, most experts agree that it can be an early indicator of future suicidal ideation. Experts also tell us that the top three behaviors to watch for in terms of student suicide risk are:
On the other hand, students who self-harm aren’t looking for ways to die. Rather, they’re looking for ways to feel alive. Indicators of self-harming behavior that may show up in school collaboration tools include:
In today’s digital world, students often “talk” online in chat rooms or on social media. Increasingly, students are doing that talking in school-provided apps like Google Docs and Google Chat. As a result, there are more calls for district IT teams to get involved with student suicide prevention programs by implementing tools like self-harm monitoring.
There are several vendors on the market that develop student safety monitoring technologies for school districts. Some use AI, others use keyword scanning, and most use a combination of the two. So, what is the difference between keyword scanning and AI when it comes to self-harm monitoring technology?
Keyword scanning is the earliest method used for student self-harm detection. It’s still heavily relied on today for two reasons:
To set up self-harm keyword scanning, the admin inputs the keywords that they want the technology to look for. It’s similar to entering words into a search engine. In this case, however, when the system locates one of those keywords, it flags the content for further review.
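In plain Python, the keyword-scanning step described above can be sketched like this. This is an illustrative toy, not any vendor’s actual implementation; the keyword list and function names are hypothetical:

```python
# Hypothetical admin-entered watch list (a real deployment would have many more terms).
FLAGGED_KEYWORDS = ["hurt myself", "end it all", "cutting"]

def scan_for_keywords(text, keywords=FLAGGED_KEYWORDS):
    """Return every admin-defined keyword found in a piece of student text."""
    lowered = text.lower()
    return [kw for kw in keywords if kw.lower() in lowered]

def flag_for_review(text, keywords=FLAGGED_KEYWORDS):
    """Mimic the flag-for-review step: report matches rather than acting on them."""
    matches = scan_for_keywords(text, keywords)
    return {"flagged": bool(matches), "matched_terms": matches}
```

Note that a simple substring match like this catches exact phrases only, which is exactly the limitation that regex and AI approaches try to address.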
Systems often use regex in addition to keyword scanning. Regex stands for “regular expression.” It allows admins to enter a text string that defines a search pattern, as opposed to a single word. For example, have you ever entered your email address into an online contact form and gotten an error message telling you that what you typed isn’t a valid email address? That form likely incorporates regex logic, which detected that you forgot to type the “@” sign in your email address.
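The email-validation example above can be shown with Python’s standard `re` module. The pattern here is deliberately simplified for illustration (real-world email validators are more permissive):

```python
import re

# Simplified email pattern: some-characters, "@", a domain, a dot, a suffix.
EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.-]+$")

def looks_like_email(value):
    """Return True if the value matches the simplified email pattern."""
    return bool(EMAIL_RE.match(value))
```

The same idea applies to safety monitoring: instead of one exact keyword, a single regex pattern can match many spellings or variations of a flagged phrase.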
K-12 IT admins researching different options should look for self-harm monitoring technology that provides customizable filtering options, including the following:
AI is all the rage these days. It isn’t as flawless as some make it out to be, but it’s still genuinely impressive and helpful. Many school districts are using technology that incorporates AI at some level for a variety of use cases. Among them is self-harm detection, with the end goal of helping with suicide prevention.
How does AI technology work? In simple terms, it takes large amounts of data and learns from patterns in that data using fast processing and advanced algorithms. Developers can build AI models from the ground up as a proprietary system. Alternatively, developers can use open source technology, or obtain a license for existing technology that they can customize to fit their needs.
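To make “learns from patterns in that data” a little more concrete, here is a toy word-frequency model in plain Python. This is a teaching sketch only; production AI models are vastly more sophisticated, and all names and training examples here are hypothetical:

```python
from collections import Counter

class TinyTextModel:
    """Toy illustration: count word frequencies per label, then score new text."""

    def __init__(self):
        self.counts = {}   # label -> Counter of word occurrences
        self.totals = {}   # label -> total words seen for that label

    def train(self, text, label):
        """Learn from one labeled example by updating word counts."""
        words = text.lower().split()
        self.counts.setdefault(label, Counter()).update(words)
        self.totals[label] = self.totals.get(label, 0) + len(words)

    def predict(self, text):
        """Return the label whose learned word pattern best matches the text."""
        words = text.lower().split()
        best_label, best_score = None, -1.0
        for label, counter in self.counts.items():
            # Add-one smoothing so unseen words don't zero out a label.
            score = sum((counter[w] + 1) / (self.totals[label] + 1) for w in words)
            if score > best_score:
                best_label, best_score = label, score
        return best_label
```

Even this toy shows why relevant training data matters: the model can only recognize patterns it has actually seen examples of.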
AI systems built for a specific purpose, such as student self-harm monitoring, work much better than those that are designed to be more generic. AI becomes better at finding what it’s looking for when it receives more specific and relevant data. Advantages of using AI vs. keyword scanning are:
It takes time to train an AI system to a high level of accuracy, and realistically, “near perfect” is the right way to describe it. Once a system reaches that level of operation, the end result is better self-harm detection and fewer false positives. Over time, the system receives increasingly relevant data, and it continues to learn and improve.
There’s no doubt that district IT teams are only going to become more important in detecting student self-harm and other student safety signals, especially as the tools they use continue to improve. Cyber safety in schools is a broad issue, and IT admins aren’t the people who will take action to investigate and stop self-harming behavior. But they will increasingly become the first line of defense in identifying students who are in crisis online.
When it comes to K-12 cyber safety, districts are realizing the need to break down traditional silos and develop cross-functional partnerships. This way, students get the safety resources they need to learn and grow in a safe and healthy school environment, whether they’re in class or online.