cross-posted from: https://lemmy.ml/post/3109500
Pak ‘n’ Save’s Savey Meal-bot cheerfully created unappealing recipes when customers experimented with non-grocery household items
Sure, it can make a recipe for chlorine gas, but can it recommend a wine pairing to go with the gas?
Removed by mod
…
That’s bad.
Any mainstream news article about ‘AI’ just feels like clickbait at this point
Agree. “Chatbot outputs ridiculous response when given ridiculous inputs” gets old.
This was at least funny.
Though I would say that its spitting out recipes for things that aren't even ingredients indicates that it's not a useful tool. It's not basing recipe recommendations on any knowledge of food, cooking, flavours, textures, or chemistry. Seems like it's just arbitrarily fitting a list of ingredients into some other patterns.
If it doesn’t understand “this isn’t a safe ingredient”, I doubt it understands anything about which non-poisonous ingredients would go well together, beyond ones it has seen paired in its training set.
The headline makes it sound as if it was just randomly suggesting this, but of course it would do that when people input non-food ingredients.
and noted that the bot has terms and conditions stating that users should be over 18.
We should definitely prosecute kids who poison themselves or others via use of this app
A lot of kids have parents…
So does mine. I don’t think I would have had any concern if he was playing around with a MEAL PLANNER
A lot of parents have kids.
yummy