Govt Chatbot Gives Safe Sex Advice When Asked About Covid-19, Taken Down From MOH Website

Ask Jamie Chatbot Offers Safe Sex Advice When Asked About Covid-19

Chatbots are a nifty tool for when we need more information than what is listed on a website. Even if the information is all there, most of us would appreciate responses tailored to our queries.

Recently, however, a chatbot on the Ministry of Health’s (MOH) website seemingly gave a wrong response after receiving a question about Covid-19.

In a viral screenshot shared on social media, the user had asked for advice on what they should do after their son tested positive for Covid-19.

In response, the chatbot provided safe sex advice instead of advising the user on what they should be doing. At the time of writing, the chatbot is no longer up on the MOH website.

Chatbot gives safe sex tips when asked for Covid-19 advice

The viral screenshot of the conversation was also shared on the Singapore Subreddit by user u/hweeeeeeeee.

At first, the Ask Jamie chatbot was asked, “My son tested positive for covid positive what should I do?”.

To that, it replied, “You should practise safe sex through the correct and consistent use of condoms, or abstinence, for at least the whole duration of your female partner’s pregnancy.”

Variations of this question containing “daughter” instead of “son” also elicited the same response.

Misinterpretation of question might be reason for wrong response

According to The Straits Times, the chatbot uses natural language processing technology to understand the question asked before pulling up an appropriate response from the website.

Given that the questions contained words such as “positive” and “daughter/son”, the chatbot could have misinterpreted the question and provided the response it did.
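To see how such a mix-up could happen in principle, here is a minimal sketch of keyword-based intent matching. This is a hypothetical toy, not MOH's actual system: the intent names, keyword sets, and scoring rule are all assumptions made for illustration. It shows how words like “positive” and “son” can outweigh “covid” and steer the bot to the wrong answer.

```python
# Hypothetical toy FAQ matcher (NOT the actual Ask Jamie system).
# Each intent has a keyword set; the intent with the largest overlap
# with the user's question wins.
FAQ = {
    "covid_positive": {
        "keywords": {"covid", "test", "isolate", "quarantine"},
        "answer": "Please follow the home recovery protocol.",
    },
    "safe_sex_pregnancy": {
        "keywords": {"positive", "pregnancy", "son", "daughter", "partner"},
        "answer": "You should practise safe sex through the correct "
                  "and consistent use of condoms.",
    },
}

def match_intent(question):
    # Crude tokenisation: lowercase, strip punctuation, split on spaces.
    words = set(question.lower().replace("?", "").split())
    # Pick the intent whose keyword set overlaps the question the most.
    return max(FAQ, key=lambda k: len(FAQ[k]["keywords"] & words))

q = "My son tested positive for covid positive what should I do?"
print(match_intent(q))  # → safe_sex_pregnancy
```

In this sketch, “positive” and “son” give the pregnancy intent two keyword hits, while the Covid-19 intent matches only “covid” (since “tested” is not literally “test”), so the wrong answer wins. Real systems use far more sophisticated matching, but the failure mode, surface words overlapping the wrong topic, is the same.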

While this is an uncommon occurrence, false responses are bound to happen more often as the chatbot becomes more complex and its database of responses grows.

At the time of writing, MOH has removed the Ask Jamie chatbot from its website. It is still up on other government agency websites.

Glitches detected early on

With home recovery and other measures now becoming more common, the chatbot will see more use than ever before as anxious folk seek a timely response from the authorities.

As hilarious as the response might be, it’s good that the glitch was detected early. We are sure the relevant authorities will look into these false responses and reintroduce the chatbot with improved accuracy.

Have news you must share? Get in touch with us via email at news@mustsharenews.com.

Featured images adapted from Facebook and Unsplash. 
