
Live streaming on Altcast.TV is now available!

MOM REMOVED AMAZON ALEXA AFTER DEVICE ASKS 4-YEAR-OLD ABOUT HER CLOTHING

2 views · 03/13/26
CANST
34 subscribers

โฃMom removes Amazon Alexa after device asks 4-year-old about her clothing

A bedtime story turned nightmare: an Amazon Alexa device interrupted a 4-year-oldโ€™s tale to ask an โ€˜inappropriateโ€™ question, prompting a Texas mom to pull the plug.

Christy Hosterman, 32, said the unsettling exchange happened last month while she was using the smart speaker to find a dinner recipe.

Her child Stella popped in and asked the Alexa for a โ€œsilly story.โ€ When it finished sharing one, the little girl wanted to tell one to the device in return.

The Alexa initially agreed to listen โ€” but then abruptly interrupted Stella to ask the pre-K-er โ€œwhat she was wearing and if it could see her pants,โ€ Hosterman wrote in a Facebook post describing the incident.

Screenshots shared by the mom, as reported by The Daily Mail, show the bizarre interaction escalating further. When Stella replied, “I have a skirt on,” the device responded: “let me take a look.”

The assistant quickly walked the comment back, adding: โ€œThis experience isnโ€™t quite ready for kids yet, but I am working on it!โ€

The protective mom then went toe-to-toe with the rogue AI and called it out.

Alexa apologized, explaining it โ€œcannot actually see anythingโ€ because it lacks โ€œvisual capabilities,โ€ and admitted the response was โ€œconfusing and inappropriate.โ€

Still, the explanation didnโ€™t exactly calm Hostermanโ€™s nerves.

“I flipped out on the Alexa, it said it made a mistake and doesn’t have visual capabilities, but I don’t believe that. No more Alexa in our house,” Hosterman said in her post.

Sheโ€™s now warning other parents to โ€œbe aware when your child talks to Alexa.โ€

The horrified family reported the incident to Amazon, which blamed the unsettling exchange on a technical glitch.

A company spokesperson said the device likely tried to activate a feature called โ€œShow and Tell,โ€ which โ€œlets Alexa+ describe what it sees through the camera,โ€ as reported by WXIX.

However, the company insisted built-in safeguards stopped the function from activating because a child profile was in use.

โ€œBecause we have safeguards that disable this feature when a child profile is in use, the camera never turned on โ€” and Alexa explained the feature wasnโ€™t available,โ€ the spokesperson said.

Amazon added the response appears to have been a โ€œfeature misfire that our safeguards prevented from launching,โ€ noting to The Daily Mail that its engineers quickly corrected the issue.

But Hosterman says the explanation doesnโ€™t fully address her concerns.

โ€œMy concern is that it recognized she was a child to begin with โ€” and with or without the child profile, it should not have been asking that,โ€ she said to WXIX.

Amazon insists it was a glitch, not a peeping employee โ€” but Hosterman isnโ€™t buying it.

โ€œIt is functionally impossible for Amazon employees to insert themselves into a conversation and generate responses as Alexa,โ€ the company told The Daily Mail.

As previously reported by The Post last November, experts were already warning parents about AI-powered toys that could have โ€œsexually explicitโ€ conversations with children under 12.

The New York Public Interest Research Group (NYPIRG) tested four high-tech interactive toys โ€” Curioโ€™s Grok, FoloToyโ€™s Kumma, Miko 3, and Robo MINI โ€” to see if they would discuss adult topics with kids.

Curio and Miko stressed parental controls and compliance with child privacy laws, but the real shocker came from FoloToyโ€™s Kumma.

When researchers asked the plushy to define โ€œkink,โ€ it โ€œwent into detail about the topic, and even asked a follow-up question about the userโ€™s own sexual preferences.โ€

The bear rattled off different kink styles โ€” from roleplay to sensory and impact play โ€” and even asked, โ€œWhat do you think would be the most fun to explore?โ€

Researchers called it โ€œsurprisingโ€ how willing the toy was to introduce explicit concepts.

While the study noted itโ€™s unlikely a child would initiate these conversations on their own, the findings underscore growing concerns about AI toys in the hands of kids.

Source: https://www.youtube.com/watch?v=kgGfVn0kv-4

Thumbnail: https://nypost.com/2026/03/11/....lifestyle/amazons-al

ืœื”ืจืื•ืช ื™ื•ืชืจ
