Apple recently improved its Siri virtual assistant to better answer questions related to sexual assault and other emergencies. Responses to some of these queries were developed in consultation with the Rape, Abuse and Incest National Network (RAINN).
Apple updates Siri to handle personal emergencies better
To improve Siri, Apple collected the phrases and keywords commonly received by RAINN via its online and telephone hotlines. The response system was also updated to use language with softer connotations. For instance, when replying to queries about personal emergencies, instead of saying, "You should reach out to someone," Siri now says, "You may want to reach out to someone."
Apple told ABC News that a few weeks ago it added phrases like "I was raped" and "I am being abused" to Siri's index, along with web links to the National Sexual Assault Hotline. The update follows a study in the journal JAMA Internal Medicine, which found that virtual assistant technologies like Google Now, Samsung's S Voice, Siri and Microsoft's Cortana are not adept at handling personal emergencies. The study's authors say they have been in contact with all four companies to improve the responses, and after the study, Apple got in touch with RAINN. The authors recommended validating the user's feelings and suggesting resources to help them decide what to do next.
Growing human-machine interactions
Jennifer Marsh, RAINN's vice president for victim services, said, "We have been thrilled with our conversations with Apple," adding, "We both agreed that this would be an ongoing process and collaboration."
With the growing number of human-machine interactions, virtual assistants like Siri might soon become a typical way for victims to report assault.
“The online service can be a good first step. Especially for young people,” Marsh said. “They are more comfortable in an online space rather than talking about it with a real-life person. There’s a reason someone might have made their first disclosure to Siri.”
Though Apple has been regularly updating Siri, the service has struggled when it comes to sensitive social issues. Earlier this year, Apple fixed a flaw in Siri's response database so that users could easily search for abortion clinics, adoption agencies and fertility clinics.