Govt Chatbot Gives Safe Sex Advice When Asked About Covid-19, Taken Down From MOH Website

Chatbots are a nifty tool for when we need more information than what is listed on a website. Even if the information is all there, most of us would appreciate responses tailored to our queries.

Recently, however, a chatbot on the Ministry of Health’s (MOH) website seemingly gave a wrong response after receiving a question about Covid-19.

In a viral screenshot shared on social media, the user had asked for advice on what they should do after their son tested positive for Covid-19.

In response, the chatbot provided safe sex advice instead of Covid-19 guidance. At the time of writing, the chatbot is no longer up on the MOH website.

Chatbot gives safe sex tips when asked for Covid-19 advice

The viral screenshot of the conversation was also shared on the Singapore Subreddit by user u/hweeeeeeeee.

At first, the Ask Jamie chatbot was asked, “My son tested positive for covid positive what should I do?”.

To that, it replied, “You should practise safe sex through the correct and consistent use of condoms, or abstinence, for at least the whole duration of your female partner’s pregnancy.”

Variations of the question with “daughter” in place of “son” elicited the same response.


Misinterpretation of question might be reason for wrong response

According to The Straits Times, the chatbot uses natural language processing (NLP) technology to understand the question asked before pulling up an appropriate response from the website.

Given that the questions contained words such as “positive” and “daughter/son”, the chatbot could have misinterpreted the question and provided the response it did.
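The actual matching logic behind Ask Jamie has not been made public, so the sketch below is purely a hypothetical illustration of how brittle keyword matching can pair a question with the wrong FAQ entry. All entry names, keywords, and answers here are invented for the example.

```python
# Hypothetical sketch -- NOT the actual Ask Jamie implementation.
# If the Covid-19 FAQ entry is indexed under tokens like "covid-19"
# while the user types "covid", a naive keyword matcher can end up
# scoring an unrelated entry (here, safe sex advice) higher.

FAQ = {
    "covid_positive": {
        "keywords": {"covid-19", "quarantine", "isolation", "swab"},
        "answer": "Please follow the home recovery protocol.",
    },
    "safe_sex": {
        "keywords": {"positive", "son", "daughter", "pregnancy", "tested"},
        "answer": "You should practise safe sex through the correct "
                  "and consistent use of condoms...",
    },
}

def match_intent(question: str) -> str:
    """Return the answer of the FAQ entry sharing the most keywords."""
    words = set(question.lower().replace("?", "").split())
    best = max(FAQ.values(), key=lambda entry: len(entry["keywords"] & words))
    return best["answer"]
```

With this toy setup, asking “My son tested positive for covid what should I do?” overlaps with the safe-sex entry on “son”, “tested”, and “positive”, but not at all with the Covid-19 entry (whose keyword is “covid-19”, not “covid”), so the wrong answer wins. Real systems typically use machine-learnt intent classifiers rather than raw token overlap, but they can fail in analogous ways.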

While such errors are uncommon, false responses are bound to happen more often as the chatbot becomes more complex and its database of responses grows.

At the time of writing, MOH has removed the Ask Jamie chatbot from its website. It is still up on other government agency websites.

Glitches detected early on

With home recovery and other measures now becoming more common, the chatbot will see more use than ever as anxious folks seek timely responses from the authorities.

As hilarious as the response might be, it’s good that the glitch was detected early on. We are sure the relevant authorities will look into these false responses and reintroduce the chatbot with improved accuracy.

Have news you must share? Get in touch with us via email at news@mustsharenews.com.

Featured images adapted from Facebook and Unsplash

Iqmall Hayat
