Google is using AI to better detect searches from people in crisis
Many people turn to an impersonal source for help in times of personal crisis: Google. Every day, the company receives searches on topics like suicide, sexual assault, and domestic abuse. Google says it wants to do more to direct users to the information they need, and claims that new AI techniques that better parse the complexities of language are helping.
Google will integrate its latest machine learning model, MUM, into its search engine to "more effectively detect a wider spectrum of personal crisis searches." MUM was first announced at the company's I/O conference last year and has since been used to augment search with features that try to answer questions connected to the original query.
According to Anne Merritt, a Google product manager for health and information quality, MUM will be able to recognize search queries related to difficult personal situations that earlier search tools could not.
"MUM can help us answer longer or more difficult questions like 'why did he attack me when I told him I didn't love him," says the researcher. Although it may be evident to humans that this inquiry is about domestic violence, our systems struggle to grasp extensive, natural-language queries like these without advanced AI." "MUM can assist us in understanding longer or more complex questions"
Other queries MUM can now handle include "most common ways to commit suicide" (a search Merritt says earlier systems "may have previously understood as information seeking") and "Sydney suicide hot spots" (where, again, earlier systems would likely have returned travel information, ignoring the mention of "suicide" in favor of the more popular query for "hot spots"). When Google detects such crisis searches, it displays an information box reading "Help is available," usually accompanied by the phone number or website of a mental health charity like Samaritans.
In addition to using MUM to respond to personal crises, Google says it's also employing BERT, an older AI language model, to better identify searches for explicit content like pornography. By leveraging BERT, Google says it has "reduced unexpected shocking results by 30%" year over year. However, the company wouldn't share absolute figures for how many "shocking results" its users encounter on average, so while this is a comparative improvement, it gives no indication of the scale of the problem.
Google is keen to tell you that AI is helping the company improve its search products, especially at a time when the narrative that "Google search is dying" is gaining traction. But integrating this technology comes with downsides, too.
Many AI experts warn that Google's increasing use of machine learning language models could surface new problems for the company, like introducing bias and misinformation into search results. AI systems are also opaque, offering engineers only limited insight into how they reach certain conclusions.
When we asked Google how it vets in advance which search terms identified by MUM are associated with personal crises, its representatives were either unwilling or unable to answer. The company says it carefully tests changes to its search products with human evaluators, but that's not the same as knowing in advance how your AI system will respond to certain queries. For Google, though, such trade-offs are apparently worth making.