
Govt Chatbot Gives Safe Sex Advice When Asked About Covid-19, Taken Down From MOH Website


Chatbots are a nifty tool for when we need more information than what is listed on a website. Even if the information is all there, most of us would appreciate responses that are tailored to our queries.

Recently, however, a chatbot on the Ministry of Health’s (MOH) website seemingly gave a wrong response after receiving a question about Covid-19.

In a viral screenshot shared on social media, the user had asked for advice on what they should do after their son tested positive for Covid-19.


In response, the chatbot provided safe sex advice instead of advising the user on what they should be doing. At the time of writing, the chatbot is no longer up on the MOH website.

Chatbot gives safe sex tips when asked for Covid-19 advice

The viral screenshot of the conversation was also shared on the Singapore Subreddit by user u/hweeeeeeeee.

At first, the Ask Jamie chatbot was asked, “My son tested positive for covid positive what should I do?”.

To that, it replied, “You should practise safe sex through the correct and consistent use of condoms, or abstinence, for at least the whole duration of your female partner’s pregnancy.”

Variations of this question containing “daughter” instead of “son” elicited the same response.


Misinterpretation of question might be reason for wrong response

According to The Straits Times, the chatbot uses natural language processing technology to understand the question asked before pulling up an appropriate response from the website.

Given that the questions contained words such as “positive” and “daughter/son”, the chatbot could have misinterpreted the question and provided the response it did.
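To see how this kind of mix-up can happen, here is a minimal, hypothetical sketch of keyword-based intent matching. This is not the actual Ask Jamie implementation, and the FAQ entries and keywords below are invented for illustration; it simply shows how a word like “positive”, shared between a Covid-19 query and a family-planning FAQ entry, can pull the question toward the wrong answer.

```python
# Hypothetical keyword-matching sketch, NOT the real Ask Jamie system.
# Each FAQ entry is scored by how many of its keywords appear in the
# user's question; the highest-scoring entry's answer is returned.

FAQ = {
    "covid_household": {
        "keywords": {"covid", "quarantine", "isolate", "symptoms"},
        "answer": "Isolate the household member and monitor symptoms.",
    },
    "safe_sex": {
        "keywords": {"positive", "pregnancy", "partner", "son", "daughter"},
        "answer": "Practise safe sex through the correct use of condoms.",
    },
}

def match_intent(question: str) -> str:
    # Crude tokenisation: lowercase, strip the question mark, split on spaces
    words = set(question.lower().replace("?", "").split())
    # Pick the FAQ entry whose keyword set overlaps most with the question
    best = max(FAQ.values(), key=lambda entry: len(entry["keywords"] & words))
    return best["answer"]

# "positive" and "son" overlap with the safe-sex entry (2 keywords)
# more than "covid" overlaps with the household entry (1 keyword),
# so the query is misrouted.
print(match_intent("My son tested positive for covid what should I do?"))
```

In this toy version, the Covid-19 question matches only one keyword of the correct entry but two keywords of the wrong one, so the safe-sex answer wins. Real systems use far more sophisticated language models, but overlapping vocabulary remains a common source of misclassification.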

While this is an uncommon occurrence, incorrect responses are bound to happen more frequently as the chatbot becomes more complex and its database of responses grows.

At the time of writing, MOH has removed the Ask Jamie chatbot from its website. It remains available on other government agency websites.

Glitches detected early on

With home recovery and other measures now becoming more common, the chatbot will see more use than ever before as anxious folk seek a timely response from the authorities.

As hilarious as the response might be, it’s good that the glitch was detected early on. We are sure the relevant authorities will look into these false responses and reintroduce the chatbot with improved accuracy.


Featured images adapted from Facebook and Unsplash

Iqmall Hayat
