
Intelligent Smart Toys

"Should we trust machines to create what comforts our children?"

This speculative project explores the uneasy relationship between emerging technologies and objects designed for trust, softness, and care. Using early image-generation tools, it investigates how generative adversarial networks (GANs) interpret the concept of a plush toy—and what happens when that interpretation includes surveillance features.

Backstory

Smart devices are slowly finding their way into children’s rooms. Microphones, GPS, cameras, Bluetooth—features once reserved for phones are now embedded in plush animals. This project began with a simple question: what happens when design for affection becomes a vehicle for control?

Vision

We used early-stage GAN models to generate images of fictional stuffed animals, combining them with visual cues of surveillance—wires, lenses, audio modules. The goal wasn’t realism, but confrontation: what does a plush toy look like when interpreted by a system with no concept of childhood?
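The generation step behind experiments like this can be sketched in a few lines. The code below is a deliberately minimal stand-in, not the project's actual pipeline: a single random linear "layer" plays the role of a trained GAN generator, and the function name, latent size, and image size are illustrative assumptions.

```python
import numpy as np

def generate_toy_image(latent_dim=128, img_size=64, seed=0):
    """Toy stand-in for sampling from a GAN generator.

    A real generator is a trained deep network; here a single random
    linear map from latent vector z to pixels illustrates the idea of
    sampling images from latent noise. All names/shapes are assumptions.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(latent_dim)  # latent code ("noise" input)
    # Random, untrained weights mapping latent space to pixel space
    W = rng.standard_normal((img_size * img_size, latent_dim)) / np.sqrt(latent_dim)
    img = np.tanh(W @ z)  # squash pixel values into [-1, 1]
    return img.reshape(img_size, img_size)

img = generate_toy_image()
print(img.shape)  # (64, 64)
```

In a real run, the random weights would be replaced by a trained generator, and sampling different latent vectors `z` yields the varied, often uncanny outputs described below.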

Result

The generated images were uncanny: some resembled animals, while others became strange or menacing. Most striking were the eyes—often oversized and multiplied. The GAN didn’t understand what was “friendly,” but its distortions exposed a larger truth: we don’t fully know what we invite into intimate spaces when we embed tech in toys.

When Innocence Gets Wired

Innocence, With a Side of Surveillance

Once Pixels, Now Plush

Hug at Your Own Risk

Conclusion

Working on this project in 2021, we didn’t expect the GAN to be good at designing, but we also didn’t expect it to be this honest. The machine’s results felt alien, yet disturbingly right. They challenged our own assumptions about innocence, aesthetics, and trust in design. And maybe that discomfort is exactly what we need right now.
