Smart glasses like the Ray-Ban Meta AI look, at first glance, like an ordinary pair of frames. Behind the lenses sits a camera, a microphone, and an AI model capable of processing everything they encounter – the stranger’s conversation on the Tube, the route walked every morning, the faces of people who never agreed to be recorded. The hardware is neutral in the way most technology is neutral: it reflects the intentions of whoever holds it. English law was not written with this technology in mind, yet its existing frameworks have much to say about how such devices may be used. This article sets out the legal landscape in England and Wales, covering the mechanics of the technology, recent reporting that has brought it into public debate, and the civil and criminal frameworks that may apply.
What are smart glasses, and how do they work?
“Smart glasses” is not a single legal or technical category. It is a broad description for eyewear that combines ordinary lenses or frames with digital hardware and software. Depending on the product, that hardware may include cameras, microphones, open-ear speakers, touch controls, motion sensors, wireless connectivity, miniature displays, AI assistants and links to mobile apps or cloud services.
Ray-Ban Meta AI glasses are the best-known example, but they are part of a wider and growing market. Other current or recent smart glasses products include Oakley Meta performance AI glasses, Solos AirGo Vision, Rokid AI glasses, Brilliant Labs Halo, Even Realities G1 and G2, XREAL AR glasses, and Vuzix enterprise smart glasses. Their capabilities vary significantly.
The Ray-Ban Meta AI glasses are ordinary-looking Ray-Ban frames with a small camera, microphones, open-ear speakers, and a connection to Meta’s app and AI services. The wearer can take photographs or videos, record sound, make calls, listen to audio, send messages, livestream to Meta platforms and use “Hey Meta” voice commands. In some modes, the wearer can ask Meta AI about what the glasses are seeing.
For bystanders, the key point is that the person in front of them may be wearing something that looks like normal eyewear but can also function as a camera, microphone and AI input device. The glasses can capture identifiable people and conversations, then move that material through an app, Meta services and, in some cases, AI processing. Meta says a capture LED is intended to signal when photos or videos are being taken, and its responsible use guidance tells users to make capture obvious, stop recording if someone objects, avoid sensitive spaces and not cover the LED.
Meta’s supplemental privacy notice says photos and videos with audio may be stored on the glasses until uploaded to the app. It also says Meta may collect media where cloud processing is enabled, where the user interacts with Meta AI about what the user is seeing or submits media to Meta AI, or where the user uploads media to Meta services. Livestreamed video and audio are processed through Facebook or Instagram.
The same notice is important for voice and AI use. It says voice interactions may include mistaken invocations and background sound once voice services are enabled, and that audio recordings, transcripts and related data may be processed. It also refers to machine learning, trained reviewers and third-party vendors in specified circumstances.
Recent reporting, investigations, and prosecutions
UK reporting has focused on men filming women in public while wearing Meta smart glasses, often in “pick-up” style videos. The Independent reported in January 2026 that women had been approached in public and later discovered, or been told, that the interaction had been recorded on Meta glasses and posted online.
In a similar vein, the BBC reported in May 2026 on a woman, anonymised as Alice, who said she had been covertly filmed by a man wearing smart glasses in a London shopping centre, with the footage later posted online. The BBC report stated that when she asked for the video to be removed, she was told removal was available as a “paid service”. Police reportedly opened an investigation but could not progress it due to insufficient information.
There has now been a reported guilty plea in the UK involving smart glasses and sexual recording. The Independent, citing Telegraph reporting, stated that David Williams pleaded guilty to voyeurism at Warrington Magistrates' Court after recording sex with a woman without her explicit consent using smart glasses. The report says he avoided prison and was fined. The article did not clearly confirm that the device was Ray-Ban Meta-branded.
A separate court integrity issue has also arisen recently. In UAB Business Enterprise & Anor v Oneta Limited & Ors [2026] EWHC 543 (Ch), ICC Judge Agnello KC rejected a witness’s evidence after finding he had been coached through smart glasses connected to his phone while giving evidence.
Civil and regulatory activity is emerging outside the UK. Euronews reported in March 2026 that Meta faced a US privacy lawsuit alleging that sensitive smart-glasses footage, including intimate content and banking information, had been exposed to human reviewers.
The civil law landscape
Privacy and misuse of private information
The common law privacy claim most likely to arise is misuse of private information. It developed from breach of confidence but is now treated as a distinct tort. The modern approach asks two questions. First, did the claimant have a reasonable expectation of privacy in respect of the information? Secondly, if so, how should the court balance the claimant's Article 8 rights against any competing Article 10 rights, including freedom of expression?
English law does not currently recognise a freestanding image right. A stranger appearing incidentally in a photograph of a street does not automatically acquire a claim merely because they are identifiable.
However, cases such as Murray v Big Pictures (UK) Ltd [2008] EWCA Civ 446 and Weller v Associated Newspapers Ltd [2015] EWCA Civ 1176 show that public photography may still be actionable depending on context.
In Murray, the Court of Appeal held that J K Rowling's young child, photographed in a public street, had an arguable claim. The court emphasised a range of factors: the claimant's age, the nature of the activity, the place, the purpose of the intrusion, the absence of consent, the effect on the claimant, and the circumstances in which the photograph was taken. For smart glasses, those factors may be relevant. A child, a targeted interaction, a family outing, a private emotional moment or a covert recording for publication will be very different from an incidental background capture.
In Weller, photographs of the musician Paul Weller’s children were taken in public in California and published online. The Court of Appeal upheld findings of misuse of private information and breach of the Data Protection Act 1998. The photographs were described as innocuous in one sense, but the children’s identifiability, their family context, their age, the absence of parental consent and the upsetting circumstances of publication were deemed significant.
The Strasbourg court has also recognised that disclosing to the media footage of someone experiencing acute distress can found a breach of Article 8 even where the act or behaviour occurred in public: Peck v United Kingdom (2003) 36 EHRR 41. It is conceivable that publication of footage capturing people in a state of vulnerability, or publication or onward sharing of such footage for entertainment, ridicule or moral judgment, may attract a privacy claim even if the initial event occurred in public. Relatedly, private events (weddings, conferences, religious ceremonies, private clubs) can generate obligations of confidentiality or privacy even though many people are present: Douglas v Hello! Ltd [2005] EWCA Civ 595.
The courts have recognised that public photography or recording by state actors can engage Article 8 where the context is targeted, intimidating, or involves retention and later use. In Wood v Commissioner of Police of the Metropolis [2009] EWCA Civ 414, police photographed a campaigner leaving a company AGM. The Court of Appeal accepted that the mere taking of a photograph in a public place would not ordinarily be enough, but on the facts Article 8 was engaged because of the circumstances, the claimant's uncertainty as to purpose, and the apprehended use and retention of the photographs.
R (Catt) v Association of Chief Police Officers [2015] UKSC 9 and Catt v United Kingdom (App no 43514/15), judgment 24 January 2019, concerned police retention of information about a peaceful protester. Although the domestic and Strasbourg courts approached justification differently, both show that systematic recording and retention of a person's political activity may engage privacy rights. For smart glasses, the point is not confined to the police. A private person or organisation recording protesters, worshippers, trade unionists or campaigners may also be processing sensitive information and generating a record whose privacy impact is created by retention and use, not merely by observation.
The misuse of private information claim will not offer a remedy for every unwanted recording. The strongest cases will involve sensitive content, children, private spaces, targeted recording, distress, publication, threatened publication or a serious imbalance between the subject’s expectation of privacy and the wearer’s asserted purpose. A claim based only on a momentary or incidental capture in a public place will be difficult. Where the complaint is repeated monitoring without meaningful publication, harassment and data protection may be more natural causes of action, though privacy may still be relevant depending on the facts.
Data Protection and the UK GDPR
Data protection law offers a separate avenue, broader than misuse of private information in some respects and narrower in others. It is broader because it regulates the processing of personal data, including capture, storage, upload, analysis, sharing, retention and deletion. It does not require the same level of offensiveness or sensitivity as a privacy tort. A face, voice, gait, location or conversation may be personal data if the person is identifiable.
It is narrower because the UK GDPR applies only where its scope is engaged. Purely personal or household activity is exempt. It also provides a structured compliance regime rather than a general prohibition on recording. If the UK GDPR applies, the question is not simply whether recording feels intrusive. The controller must identify a lawful basis, act fairly and transparently, minimise data, limit purpose and retention, keep data secure, respect rights, and be able to demonstrate compliance.
The CJEU’s decision in Case C-212/13 Ryneš v Úřad pro ochranu osobních údajů is a key authority on the legality of cameras in domestic settings. Mr Ryneš installed a camera at his home which recorded his entrance, the public footpath and the house opposite. The court held that the household exemption did not apply where the surveillance also covered public space. The reasoning has obvious consequences for smart glasses. A wearable camera is not fixed to the home, but it may capture public spaces, workplaces, neighbours, customers and strangers. Once use moves beyond genuinely private activity, the exemption may fall away.
The ICO gives similar guidance for domestic CCTV and video doorbells: if a camera captures images or audio beyond the user’s private boundary, such as a neighbour's property or a public area, data protection law may apply. Smart glasses make that boundary mobile. A family video kept privately will often be outside the UK GDPR. A creator’s monetised recording of strangers, a neighbour’s repeated recording, or use in a workplace setting will not be so easily characterised as household use.
If the UK GDPR applies, a wearer, employer or organisation needs a lawful basis for data processing under Article 6. Consent will often be impractical where bystanders, passers-by or background speakers are involved. Legitimate interests may be available in some cases, but that basis requires a purpose test, a necessity test and a balancing test. It is unlikely to justify indiscriminate audio recording, covert recording in sensitive settings, or publication designed mainly to humiliate or entertain at another person's expense.
Transparency may be a vulnerability if recording is discreet. A small LED may help, but it will not always be seen or understood. Bystanders may not know whether audio is being captured, whether the footage is being livestreamed, whether an AI prompt has been submitted, or whether the material will be reviewed by a human contractor. For organisational use, privacy notices, signage, policies and training will often be required. For individual users outside the household exemption, the safest practical route is to make recording obvious, stop if asked unless there is a compelling lawful reason not to, and avoid sensitive spaces.
Data minimisation is equally important. The ICO’s surveillance guidance asks controllers to justify why recording is needed and whether less intrusive means would meet the objective. A still image may be less intrusive than video, especially if the video is then fed into facial recognition or AI analysis. Video without audio may be less intrusive than video with audio. Local storage may be less intrusive than uploading to the cloud. And reviewing the footage internally before sharing selected portions may be less intrusive than livestreaming.
An employer issuing smart glasses to staff will usually be a controller for work-related processing. It may need a data protection impact assessment, a lawful basis, an Article 9 condition if special category data is likely, staff and visitor notices, restrictions on audio, retention rules, access controls and deletion procedures. Continuous recording by delivery drivers, security officers, estate agents, health workers, teachers or enforcement agents is unlikely to be lawful merely because it is operationally convenient.
The UK GDPR gives heightened protection to special category data, which includes data revealing or concerning health, religion, political opinions, trade union membership, racial or ethnic origin, sex life or sexual orientation, and biometric data where processed for identification purposes, among other categories. Recording inside a hospital, fertility clinic, addiction service, school, place of worship, political meeting, trade union event, protest, Pride event or support group may reveal such data. If an organisation is using the glasses, it may need both an Article 6 lawful basis and an Article 9 condition.
AI compounds these risks. Seemingly anodyne information can be converted into special category data through automated processing alone. A video of a room may become a scene analysis. A face may become an identity. A conversation may become a transcript. A location may reveal health, religion or politics. A seemingly ordinary clip may be used to infer disability, emotion, ethnicity, age, relationships or financial status. The legal risk, therefore, lies not only in capturing the information, but also in its processing in context – a process that may be largely opaque even to the wearer.
Where does the manufacturer stand?
The wearer will often be the obvious controller for their own recording decisions. But a company such as Meta may also be a controller for its own processing where it determines why and how media, audio, transcripts, AI prompts, livestreams, and product-improvement data are processed. Meta’s supplemental privacy materials identify circumstances in which media may be collected or processed, including cloud processing, AI interactions and uploads to Meta services. The controller must explain that clearly, identify a lawful basis, minimise what is reviewed, control access, secure transfers and respect data subject rights. Even where the user has agreed to product terms, the position of bystanders is more difficult because they may never know that their image, voice or environment has been submitted to an AI system.
That does not mean Meta is responsible for every unlawful act of every wearer. A user who covertly records a sexual encounter or harasses a neighbour is likely to bear primary responsibility. However, Meta's own processing is separate. The legal questions for Meta include transparency, lawful basis, data minimisation, privacy by design, security, reviewer access, retention, international transfers, children's data, bystander data and the effectiveness of the capture LED and responsible use warnings.
Criminal offences: when do glasses become part of a crime?
The mere wearing of smart glasses is not a criminal offence. But particular uses may engage criminal law.
Voyeurism, upskirting and intimate recording
Section 67 of the Sexual Offences Act 2003 criminalises, among other things, operating equipment to observe or record another person doing a private act for sexual gratification, where the person does not consent and the defendant knows that. Section 67A addresses "upskirting" and related recording beneath clothing without consent for purposes including sexual gratification, humiliation, alarm or distress.
Smart glasses could engage those offences if used in toilets, changing rooms, bedrooms, sexual encounters, brothels, treatment rooms or other private-act settings. The reported UK guilty plea involving smart glasses and sexual recording, and a reported New Zealand prosecution involving Ray-Ban Meta glasses worn in a brothel, are practical illustrations of the risk.
The intimate image regime is also developing. Section 66B of the Sexual Offences Act 2003, inserted by the Online Safety Act 2023, creates offences concerning sharing or threatening to share intimate photographs or films without consent in specified circumstances. CPS guidance explains the offences concerning sharing, threatening to share and cyberflashing.
Section 138 of the Data (Use and Access) Act 2025, in force from 6 February 2026, created offences concerning creating, or requesting the creation of, purported intimate images of adults. The Crime and Policing Act 2026 contains further measures concerning intimate images, “nudification” tools and AI-related child sexual abuse material, although commencement and transitional provisions should be checked in any live case.
For smart glasses, the practical point is that criminal exposure may arise not only from filming nudity or sexual activity, but also from using ordinary captured images as source material for AI-generated intimate images or from threatening publication.
Children and indecent images
Where children are involved, the legal risk is acute. CPS guidance states that, for indecent image offences, a child is a person under 18; that the test of indecency is objective; and that “making” an indecent image can include opening, downloading, storing or receiving images through social media or livestreaming. The law also covers pseudo-photographs, including high-quality computer-generated or AI-generated images.
Smart glasses used in schools, sports clubs, swimming pools, bedrooms, changing areas or youth settings therefore require particular caution. If the resulting image is indecent, or if ordinary footage is used to generate an indecent pseudo-photograph, the wearer and downstream recipients may face criminal liability.
Harassment, stalking and coercive control
Repeated recording can become harassment or stalking. The Protection from Harassment Act 1997 creates offences including harassment, stalking, putting a person in fear of violence, and stalking involving serious alarm or distress. CPS guidance identifies stalking behaviours such as following, watching, spying, publishing material relating to a person and monitoring online activity.
A neighbour who repeatedly films another resident, a former partner who records encounters, a campaigner who follows and livestreams an individual, or a content creator who repeatedly targets the same person may stray beyond a one-off privacy problem into harassment or stalking.
Coercive control may arise in intimate or family relationships. CPS guidance gives examples of technology-enabled control, including monitoring through online tools, spyware, smart devices or social media. A partner who uses glasses to monitor movements, conversations, friendships or daily conduct, or who threatens to publish footage, may therefore fall within the statutory framework if the other elements are made out.
Communications offences, threats, blackmail and fraud
The recording itself may be only the beginning. Posting footage with threatening, false or abusive accompanying material may engage communications offences. Sending intimate material, threatening to share it, or using footage to humiliate or alarm may fall within the Sexual Offences Act and Online Safety Act framework described above.
If a person demands money to remove a video, prosecutors may consider blackmail under section 21 of the Theft Act 1968, depending on the facts: a demand with menaces made with a view to gain or intent to cause loss. If the glasses are used to capture PINs, passwords, payment cards, confidential documents or trade secrets for dishonest gain, the Fraud Act 2006 may also be relevant.
Unlawful data access, hacking and interception
Third parties may also be liable. Section 170 of the Data Protection Act 2018 creates offences concerning knowingly or recklessly obtaining, disclosing, procuring, retaining, selling or offering to sell personal data without the controller’s consent, subject to statutory defences. A rogue reviewer, recipient, data broker or hacker who obtains smart glasses footage without authority may therefore face separate exposure.
The Computer Misuse Act 1990 may apply where a person hacks the glasses, the connected phone, the Meta account, the app or cloud storage. The Investigatory Powers Act 2016 may be relevant in more technical scenarios involving intentional unlawful interception of communications in the course of transmission.
Courts, tribunals and other restricted spaces
In the wake of the events described in the UAB Business Enterprise case above, there may be heightened awareness of the legal issues posed by smart glasses in courts or tribunals. Section 41 of the Criminal Justice Act 1925 restricts photography in court. Section 9 of the Contempt of Court Act 1981 concerns unauthorised sound recordings. Section 85B of the Courts Act 2003 addresses unauthorised recording or transmission of certain court proceedings. The decision in UAB Business Enterprise also illustrates that smart glasses may undermine the integrity of oral evidence even without publication of a recording.
Practical scenarios
1. Personal holiday use
A wearer takes a photograph of friends who know they are being photographed and stores it privately. This appears low risk and likely to fall within the domestic or household exemption, subject to common sense about children, private spaces and publication.
2. Public "pick-up" filming
A content creator approaches strangers, records them on smart glasses, asks for consent only after recording, then posts the footage online. Privacy, harassment and data protection issues may arise. Criminal liability will depend on the content, repetition, and any threats.
3. Recording in a clinic, school, place of worship or support group
Even if no one is undressed or doing anything obviously private, the footage may reveal health, religion, disability, sexual orientation, political opinion, children’s data or other sensitive matters. UK GDPR, special category data and misuse of private information may be engaged.
4. Toilets, changing rooms, bedrooms and sexual settings
This is a high-risk category for voyeurism, intimate-image offences, misuse of private information and data protection breaches. A capture LED or terms of service will not resolve non-consensual intimate recording.
5. Workplace deployment
A delivery company, landlord, security contractor, hospital, school or retailer asks staff to wear smart glasses. The organisation will need a lawful basis, DPIA, privacy notices, limits on audio and AI processing, training, retention rules and a policy for sensitive spaces.
6. Neighbour disputes
A neighbour repeatedly records another resident or their visitors using smart glasses. Depending on the facts, this may generate data protection, harassment and privacy claims, particularly if audio is captured or footage is shared.
7. Domestic abuse and coercive control
A partner uses the glasses to record conversations, check movements, monitor social contacts, or threaten publication. That may be evidence of coercive control, harassment, stalking or intimate image abuse.
8. AI deepfakes
A wearer captures a person's face or body and uses an AI tool to generate a purported intimate image. The criminal law now addresses creation and requesting of purported intimate images in specified circumstances, and further provisions in the Crime and Policing Act 2026 point to continuing expansion of liability.
9. Court or tribunal use
A party, witness, journalist or observer wears smart glasses during proceedings. Unauthorised recording may engage statutory restrictions and contempt.
Conclusion
The glasses are lawful technology. Some uses of them will not be. Whether that is so will depend on the facts: who is recording, what is captured, where it is captured, why it is captured, what the subject reasonably expects, and what happens to the data afterwards. The law in this area is also moving. The statutory provisions governing intimate images, AI-generated content and data use have changed materially in the past two years. Anyone advising on, using or regulating this technology should treat the current position as a starting point, not a settled answer.
Frederick Powell specialises in the law of data protection and civil liberties. He is due to appear before the Supreme Court in XGY v Chief Constable of Sussex Police and The Crown Prosecution Service UKSC/2025/0194, a data protection / confidentiality case. For enquiries about Frederick, please contact Grace Walton, at Doughty Street Chambers.
