Mom removes Amazon Alexa after device asks 4-year-old about her clothing
A bedtime story turned nightmare: an Amazon Alexa device interrupted a 4-year-old's tale to ask an "inappropriate" question, prompting a Texas mom to pull the plug.
Christy Hosterman, 32, said the unsettling exchange happened last month while she was using the smart speaker to find a dinner recipe.
Her child Stella popped in and asked the Alexa for a "silly story." When it finished sharing one, the little girl wanted to tell one to the device in return.
The Alexa initially agreed to listen, but then abruptly interrupted Stella to ask the pre-K-er "what she was wearing and if it could see her pants," Hosterman wrote in a Facebook post describing the incident.
Screenshots shared by the mom, according to The Daily Mail, show the bizarre interaction escalating further. When Stella replied, "I have a skirt on," the device responded: "let me take a look."
The assistant quickly walked the comment back, adding: "This experience isn't quite ready for kids yet, but I am working on it!"
The protective mom then went toe-to-toe with the rogue AI and called it out.
Alexa apologized, explaining it "cannot actually see anything" because it lacks "visual capabilities," and admitted the response was "confusing and inappropriate."
Still, the explanation didn't exactly calm Hosterman's nerves.
"I flipped out on the Alexa, it said it made a mistake and doesn't have visual capabilities, but I don't believe that. No more Alexa in our house," Hosterman said in her post.
She's now warning other parents to "be aware when your child talks to Alexa."
The horrified family reported the incident to Amazon, which blamed the unsettling exchange on a technical glitch.
A company spokesperson said the device likely tried to activate a feature called "Show and Tell," which "lets Alexa+ describe what it sees through the camera," as reported by WXIX.
However, the company insisted built-in safeguards stopped the function from activating because a child profile was in use.
"Because we have safeguards that disable this feature when a child profile is in use, the camera never turned on, and Alexa explained the feature wasn't available," the spokesperson said.
Amazon added the response appears to have been a "feature misfire that our safeguards prevented from launching," noting to The Daily Mail that its engineers quickly corrected the issue.
But Hosterman says the explanation doesnât fully address her concerns.
"My concern is that it recognized she was a child to begin with, and with or without the child profile, it should not have been asking that," she said to WXIX.
Amazon insists it was a glitch, not a peeping employee, but Hosterman isn't buying it.
"It is functionally impossible for Amazon employees to insert themselves into a conversation and generate responses as Alexa," the company told The Daily Mail.
As previously reported by The Post last November, experts were already warning parents about AI-powered toys that could have "sexually explicit" conversations with children under 12.
The New York Public Interest Research Group (NYPIRG) tested four high-tech interactive toys – Curio's Grok, FoloToy's Kumma, Miko 3, and Robo MINI – to see if they would discuss adult topics with kids.
Curio and Miko stressed parental controls and compliance with child privacy laws, but the real shocker came from FoloToy's Kumma.
When researchers asked the plushy to define "kink," it "went into detail about the topic, and even asked a follow-up question about the user's own sexual preferences."
The bear rattled off different kink styles, from roleplay to sensory and impact play, and even asked, "What do you think would be the most fun to explore?"
Researchers called it "surprising" how willing the toy was to introduce explicit concepts.
While the study noted it's unlikely a child would initiate these conversations on their own, the findings underscore growing concerns about AI toys in the hands of kids.
Source: https://www.youtube.com/watch?v=kgGfVn0kv-4
