
Govt Chatbot Gives Safe Sex Advice When Asked About Covid-19, Taken Down From MOH Website


Chatbots are a nifty tool for when we require more information than what is listed on a website. Even if the information is all there, most of us would appreciate responses tailored to our queries.

Recently, however, a chatbot on the Ministry of Health’s (MOH) website seemingly gave a wrong response after receiving a question about Covid-19.

In a viral screenshot shared on social media, the user had asked for advice on what they should do after their son tested positive for Covid-19.


In response, the chatbot provided safe sex advice instead of advising the user on what they should be doing. At the time of writing, the chatbot is no longer up on the MOH website.

Chatbot gives safe sex tips when asked for Covid-19 advice

The viral screenshot of the conversation was also shared on the Singapore Subreddit by user u/hweeeeeeeee.

At first, the Ask Jamie chatbot was asked, “My son tested positive for covid positive what should I do?”.

To that, it replied, “You should practise safe sex through the correct and consistent use of condoms, or abstinence, for at least the whole duration of your female partner’s pregnancy.”

Variations of this question containing “daughter” instead of “son” also elicited the same response.



Misinterpretation of question might be reason for wrong response

According to The Straits Times, the chatbot uses natural language processing (NLP) technology to understand the question asked before pulling up an appropriate response from the website.

Given that the questions contained words such as “positive” and “daughter/son”, the chatbot could have misinterpreted the question and provided the response it did.
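To see how this kind of misfire can happen, here is a minimal, purely hypothetical sketch of keyword-based intent matching. Ask Jamie is not open source, and its actual design is unknown; the keyword lists, FAQ names, and answers below are invented solely to illustrate how words like “positive” and “son” could route a Covid-19 question to a pregnancy-related answer.

```python
# Hypothetical keyword-based intent matcher -- NOT the real Ask Jamie system.
# It picks the FAQ entry that shares the most keywords with the question.

FAQ_KEYWORDS = {
    # Invented entry: safe-sex advice (e.g. Zika-era guidance for expectant parents)
    "safe_sex_during_pregnancy": {"positive", "son", "daughter", "pregnant"},
    # Invented entry: what to do after a positive Covid-19 test
    "covid_next_steps": {"covid-19", "quarantine", "isolate", "pcr"},
}

def match_intent(question: str) -> str:
    """Return the FAQ key whose keyword set overlaps the question the most."""
    words = set(question.lower().replace("?", "").split())
    return max(FAQ_KEYWORDS, key=lambda key: len(FAQ_KEYWORDS[key] & words))

# "My son tested positive for covid positive what should I do?"
# matches "son" and "positive" (2 hits) in the pregnancy entry, while
# "covid" fails to match the keyword "covid-19" (0 hits) -- so the
# safe-sex answer wins, mirroring the mix-up described above.
```

The point of the sketch is that naive word overlap, with no sense of context, can confidently return a well-formed but entirely wrong answer.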

While this is an uncommon occurrence, false responses are bound to happen more frequently as the chatbot becomes more complex and its database of responses grows.

At the time of writing, MOH has removed the Ask Jamie chatbot from its website. It is still up on other government agency websites.

Glitches detected early on

With home recovery and other measures now becoming more common, the chatbot will see more use than ever before as anxious folk seek a timely response from the authorities.

As hilarious as the response might be, it’s great to see the glitch detected early on. We are sure that the relevant authorities will be looking into these false responses and reintroducing the chatbot with improved accuracy.

Have news you must share? Get in touch with us via email at news@mustsharenews.com.

Featured images adapted from Facebook and Unsplash

Iqmall Hayat

