
About this
- Original Link: ECPAT Child Alert’s Website
- Original Post Date: 17 April 2026
While travelling over the last year or so, I've walked in and out of high-end airport boutiques. Amongst the Gucci and Balenciaga, the new Meta glasses from Ray-Ban featured in a lot of the stores. I tried them on and had a look in the mirror. They are no longer the awkward first-generation smartglasses; they now look stylish and fit easily into any high-end fashion boutique. It left a weird feeling in my stomach, and it also rang alarm bells for Elle Hunt of the Guardian. Here are my thoughts, published on ECPAT Child Alert's website.
The Guardian's Elle Hunt test-drove a pair of Meta's smartglasses and wrote this article. It came across my desk much like most of the other news I read; I have always been interested in new technology, especially where it intersects with real life.
Like most people, the article left me with an uneasy feeling, but when I put on my ECPAT pōtae (hat), that uneasy feeling became a deeper sinking one. We work at the front line of safeguarding children, here in Aotearoa and as part of global efforts. New technologies often arrive fast, much faster than any legislative guardrails can.
The “invisible camera” risk: easier non‑consensual capture of children
Throughout the month-long test, Hunt consistently highlighted how easily smartglasses could capture both images and video, often surreptitiously. The compulsory recording indicator is a small light and is easily missed, especially in bright conditions. Increasing access to small, high‑quality cameras has changed who can produce abusive material and how much is produced, making “always‑there” capture devices a structural risk factor rather than just a bad-user problem. We talk about this and more in our course on Understanding Harm from Online Sexual Exploitation of Taiohi: once abuse is recorded, the material can be shared later, extending the impact far beyond the moment of capture.
Wearables blur ‘online’ vs ‘offline’
Smartglasses compress the steps between in-person access, recording and digital distribution. ECPAT is one of the organisations implementing Disrupting Harm, and the evidence gathered there demonstrates, time and time again, that technology-facilitated child sexual exploitation and abuse often occurs through a combination of digital and in-person interactions. Wearables make this pathway shorter and quicker.
Livestreaming and real-time capture
Livestreamed abuse often leaves ‘no digital trail’. In 2024, ECPAT International’s submission to the 79th session of the UN General Assembly, for the Office of the Special Rapporteur on the Sale and Sexual Exploitation of Children, discussed at length the distinct enforcement and safeguarding challenges that arise when abuse is livestreamed rather than stored as files. A hands-free wearable camera, paired with mobile connectivity, makes this real-time capture and dissemination far easier to initiate quickly and far harder for bystanders to notice in the moment. This raises the likelihood of unnoticed offending in public or semi-public environments.
Potential facial recognition
In the article, Hunt suggests that these devices are part of Meta’s push towards wearable AI. We know that Meta has made significant strides in AI, with a giant global push towards Meta AI in recent times. Our course on Navigating Deep Fakes and AI Safety for Taiohi outlines how deepfakes and generative AI are rapidly intensifying child-safety challenges, and this new technology presents a raft of new ways in which taiohi can be abused. Smartglasses form the first bridge into that future: always on, always sensing, with impending AI identification of people set to amplify the risks. As always, these devices are not designed with safety by default, and this is the result.
The normalisation of surveillance
The biggest systemic issue is that smartglasses don’t just collect information about the wearer (who presumably consented by purchasing the device); they also capture bystanders and environments. Hunt discusses this extensively as a major ethical and privacy concern. Alongside our Disrupting Harm partners, we repeatedly emphasise that digital technologies are evolving rapidly and that prevention and response must be grounded in up‑to‑date evidence and clear roadmaps for governments and tech companies.
Why does it all matter?
In the context of child sexual exploitation, Meta’s smartglasses are not just a privacy problem but a safeguarding one. By normalising hands‑free, often unnoticed recording and collapsing the distance between offline access and online distribution, they amplify the very dynamics we warn most about: covert capture, blurred boundaries between online and in‑person harm, and the rapid evolution of technology outpacing child‑protection systems. When devices make it easier to turn everyday proximity into permanent digital evidence or leverage, responsibility cannot sit solely with “bad users.”
Without strong, child‑centred safeguards built in from the start, smartglasses risk becoming another tool that quietly shifts power away from children, further embedding exploitation into the fabric of ordinary life.

