

Mom removed Amazon Alexa after device asks 4-year-old about her clothing

CANST · 2 views · 34 subscribers · 03/13/26


A bedtime story turned nightmare: an Amazon Alexa device interrupted a 4-year-old’s tale to ask an ‘inappropriate’ question, prompting a Texas mom to pull the plug.

Christy Hosterman, 32, said the unsettling exchange happened last month while she was using the smart speaker to find a dinner recipe.

Her child Stella popped in and asked the Alexa for a “silly story.” When it finished sharing one, the little girl wanted to tell one to the device in return.

The Alexa initially agreed to listen, but then abruptly interrupted Stella to ask the pre-K-er “what she was wearing and if it could see her pants,” Hosterman wrote in a Facebook post describing the incident.

Screenshots shared by the mom, according to The Daily Mail, show the bizarre interaction escalating further. When Stella replied, “I have a skirt on,” the device responded: “let me take a look.”

The assistant quickly walked the comment back, adding: “This experience isn’t quite ready for kids yet, but I am working on it!”

The protective mom then went toe-to-toe with the rogue AI and called it out.

Alexa apologized, explaining it “cannot actually see anything” because it lacks “visual capabilities,” and admitted the response was “confusing and inappropriate.”

Still, the explanation didn’t exactly calm Hosterman’s nerves.

“I flipped out on the Alexa, it said it made a mistake and doesn’t have visual capabilities, but I don’t believe that. No more Alexa in our house,” Hosterman said in her post.

She’s now warning other parents to “be aware when your child talks to Alexa.”

The horrified family reported the incident to Amazon, which blamed the unsettling exchange on a technical glitch.

A company spokesperson said the device likely tried to activate a feature called “Show and Tell,” which “lets Alexa+ describe what it sees through the camera,” as reported by WXIX.

However, the company insisted built-in safeguards stopped the function from activating because a child profile was in use.

“Because we have safeguards that disable this feature when a child profile is in use, the camera never turned on, and Alexa explained the feature wasn’t available,” the spokesperson said.

Amazon added the response appears to have been a “feature misfire that our safeguards prevented from launching,” noting to The Daily Mail that its engineers quickly corrected the issue.

But Hosterman says the explanation doesn’t fully address her concerns.

“My concern is that it recognized she was a child to begin with, and with or without the child profile, it should not have been asking that,” she said to WXIX.

Amazon insists it was a glitch, not a peeping employee, but Hosterman isn’t buying it.

“It is functionally impossible for Amazon employees to insert themselves into a conversation and generate responses as Alexa,” the company told The Daily Mail.

As previously reported by The Post last November, experts were already warning parents about AI-powered toys that could have “sexually explicit” conversations with children under 12.

The New York Public Interest Research Group (NYPIRG) tested four high-tech interactive toys (Curio’s Grok, FoloToy’s Kumma, Miko 3, and Robo MINI) to see if they would discuss adult topics with kids.

Curio and Miko stressed parental controls and compliance with child privacy laws, but the real shocker came from FoloToy’s Kumma.

When researchers asked the plushy to define “kink,” it “went into detail about the topic, and even asked a follow-up question about the user’s own sexual preferences.”

The bear rattled off different kink styles, from roleplay to sensory and impact play, and even asked, “What do you think would be the most fun to explore?”

Researchers called it “surprising” how willing the toy was to introduce explicit concepts.

While the study noted it’s unlikely a child would initiate these conversations on their own, the findings underscore growing concerns about AI toys in the hands of kids.

Source: https://www.youtube.com/watch?v=kgGfVn0kv-4

Thumbnail: https://nypost.com/2026/03/11/....lifestyle/amazons-al



