Chatbots are a nifty tool for when we need more information than what is listed on a website. Even if the information is all there, most of us would appreciate responses tailored to our queries.
Recently, however, a chatbot on the Ministry of Health’s (MOH) website seemingly gave a wrong response after receiving a question about Covid-19.
In a viral screenshot shared on social media, the user had asked for advice on what they should do after their son tested positive for Covid-19.
In response, the chatbot provided safe sex advice instead of addressing the user’s query.
The viral screenshot of the conversation was also shared on the Singapore Subreddit by user u/hweeeeeeeee.
In the screenshot, the Ask Jamie chatbot was asked, “My son tested positive for covid positive what should I do?”.
To that, it replied, “You should practise safe sex through the correct and consistent use of condoms, or abstinence, for at least the whole duration of your female partner’s pregnancy.”
Variations of this question containing “daughter” instead of “son” also elicited the same response.
According to The Straits Times, the chatbot uses natural language processing (NLP) technology to understand the question before pulling up an appropriate response from the website.
Given that the questions contained words such as “positive” and “son” or “daughter”, the chatbot could have misinterpreted them, perhaps matching “positive” to a pregnancy test rather than a Covid-19 test, and served up the wrong response.
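To illustrate how such a mismatch can happen, here is a minimal, hypothetical sketch of keyword-based answer retrieval. The topics, keyword sets, answers, and scoring below are our own illustrative assumptions, not MOH’s actual system:

```python
# A hypothetical sketch of keyword-based answer retrieval, NOT MOH's
# actual implementation. Keyword sets and answers are invented for
# illustration only.

FAQ_ENTRIES = [
    {
        "topic": "covid_household",
        "keywords": {"covid", "quarantine", "isolation", "symptoms"},
        "answer": "Isolate the infected household member and monitor their symptoms.",
    },
    {
        "topic": "safe_sex_pregnancy",
        "keywords": {"positive", "son", "daughter", "pregnancy", "partner"},
        "answer": "You should practise safe sex through the correct and "
                  "consistent use of condoms...",
    },
]

def match_answer(question: str) -> str:
    """Return the answer whose keyword set overlaps the question the most."""
    words = set(question.lower().replace("?", " ").split())
    best = max(FAQ_ENTRIES, key=lambda entry: len(entry["keywords"] & words))
    return best["answer"]

# "positive" and "son" (2 keyword hits) outscore "covid" (1 hit), so the
# matcher picks the safe-sex answer instead of the Covid-19 one.
print(match_answer("My son tested positive for covid positive what should I do?"))
```

A production chatbot would use far more sophisticated NLP than this, but the failure mode is the same: a few strong keyword matches can pull up an answer from an entirely unrelated topic.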
While such occurrences are uncommon, false responses are bound to happen more often as the chatbot grows more complex and its database of responses expands.
At the time of writing, the MOH has removed the Ask Jamie chatbot from its website, though it remains available on other government agency websites.
With home recovery and other measures now becoming more common, the chatbot will likely see more use than ever as anxious folk seek timely responses from the authorities.
As hilarious as the response might be, it’s good that the glitch was detected early on. We are sure the relevant authorities will look into these false responses and reintroduce the chatbot with better accuracy.
Have news you must share? Get in touch with us via email at news@mustsharenews.com.
Featured images adapted from Facebook and Unsplash.