Video conferencing apps can leak location data through audio channels despite privacy controls

Image: SMU determined an attacker can probe users' physical surroundings by injecting malicious sounds and analyzing the location-specific audio feedback, or echoes. Credit: Southern Methodist University, Chen Wang

Since the COVID-19 pandemic, video conferencing platforms like Zoom and Microsoft Teams have become essential for work, education, and social connections. While these platforms offer controls such as disabling cameras and muting microphones to safeguard user privacy, a new study suggests that video conferencing may not be as secure as many assume.

SMU computer scientists have discovered that even with cameras turned off and virtual backgrounds in use, attackers can actively and covertly probe a user's physical location by exploiting the two-way audio channels of apps.

The mechanism works through "remote acoustic sensing," allowing an attacker to probe users' physical surroundings by injecting malicious sounds and analyzing the location-specific audio feedback, or echoes.
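To make the idea concrete, the following is a minimal sketch (not the researchers' code) of how this kind of acoustic sensing generally works: a short probe sound is injected into the call, and the audio that returns over the channel is cross-correlated with the known probe, so that peaks in the correlation reveal location-specific echo delays. The sample rate, chirp parameters, and function names here are illustrative assumptions.

```python
# Minimal sketch of the general acoustic-sensing idea (illustrative only).
import numpy as np

FS = 16000  # assumed sample rate of the call audio (Hz)

def make_probe(duration_s=0.1, f0=1000.0, f1=7000.0):
    """Generate a 100 ms linear chirp to use as the probing sound."""
    t = np.linspace(0.0, duration_s, int(FS * duration_s), endpoint=False)
    k = (f1 - f0) / duration_s  # chirp rate
    return np.sin(2 * np.pi * (f0 * t + 0.5 * k * t ** 2))

def echo_profile(returned_audio, probe):
    """Cross-correlate the audio returned over the call with the injected
    probe; peaks in the result reveal location-specific echo delays."""
    corr = np.correlate(returned_audio, probe, mode="full")
    return corr / (np.linalg.norm(probe) ** 2 + 1e-12)

# Toy usage with synthetic data standing in for real call audio:
probe = make_probe()
delay = np.zeros(800)                      # ~50 ms before the first reflection
returned = np.concatenate([delay, 0.3 * probe])
returned = returned + 0.01 * np.random.randn(returned.size)
profile = echo_profile(returned, probe)
print("strongest echo at sample", int(np.argmax(np.abs(profile))) - (probe.size - 1))
```

In a real attack the returned audio would be whatever the conferencing app plays back to the attacker, but the principle of matching it against a known probe is the same.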

In a study published at the 2025 IEEE Symposium on Security and Privacy, the research team tested popular apps such as Zoom and found that the proposed attacks could recognize users' locations or location contexts with 88% accuracy, whether the user was in a place they had visited before or somewhere entirely new.

"The results raise a severe privacy concern since any video conferencing participant could invade each other's location privacy easily without malware installation," said SMU principal investigator Chen Wang, O'Donnell Foundation Endowed Professor of computer science at SMU Lyle School of Engineering.

This type of attack, known as "sniffing location privacy," is particularly alarming because there is very little users can do themselves to secure video conferencing, Wang said.

"Even a vigilant user who carefully unmutes the microphone only when speaking remains vulnerable: an adversary can exploit the few silent seconds between unmuting and muting, since people naturally leave margins to ensure their speech is fully heard," he noted. "Furthermore, we find that when a user speaks, sounds return with higher energy, because video conferencing systems apply acoustic suppression to silent user ends to eliminate meaningless feedback."

As a result, the user's speech effectively amplifies the malicious signal feedback.

Image: SMU computer science professor Chen Wang found that even with cameras turned off and virtual backgrounds in use, video meeting participants can still be vulnerable to privacy intrusions. Credit: Southern Methodist University, Jeffrey McWhorter

Another issue is that the probing sounds can be as short as 100 milliseconds, allowing attackers to gather the information they need before a victim has time to notice.

Wang and his team are currently developing defense algorithms that could be deployed on video conferencing servers to detect and remove suspicious probing sounds before audio is forwarded to participants, along with other ways to prevent an adversary from sensing users' surroundings, or "seeing where we are."
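The team's defense has not been published in detail. As a rough illustration of the general approach described above, the hypothetical sketch below silences short audio frames whose normalized correlation with a known probe template exceeds a threshold before the audio is forwarded on. The threshold, frame length, and function names are assumptions, not the SMU team's algorithm.

```python
# Hypothetical server-side filter sketch (not the SMU team's algorithm).
import numpy as np

def filter_probes(audio, templates, frame_len=1600, threshold=0.6):
    """Return a copy of `audio` in which frames resembling any known probe
    template are silenced before being forwarded to participants."""
    cleaned = np.asarray(audio, dtype=float).copy()
    for start in range(0, len(cleaned) - frame_len + 1, frame_len):
        frame = cleaned[start:start + frame_len]
        frame_norm = np.linalg.norm(frame) + 1e-12
        for template in templates:
            t = template[:frame_len]
            # Alignment-agnostic normalized correlation between frame and template
            score = np.max(np.abs(np.correlate(frame, t, mode="full")))
            score /= frame_norm * (np.linalg.norm(t) + 1e-12)
            if score > threshold:
                cleaned[start:start + frame_len] = 0.0  # drop the suspicious frame
                break
    return cleaned

# Toy usage: a chirp-like probe template hidden in an otherwise quiet stream
rng = np.random.default_rng(0)
t = np.linspace(0, 0.1, 1600, endpoint=False)
chirp = np.sin(2 * np.pi * (1000 * t + 30000 * t ** 2))
stream = 0.01 * rng.standard_normal(16000)
stream[3200:4800] += chirp                 # attacker-injected probe
clean = filter_probes(stream, [chirp])
print("probe energy after filtering:", float(np.sum(clean[3200:4800] ** 2)))
```

A practical defense would also need to handle probes it has never seen before, which is presumably part of what the team is working on.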

Why your conference call may not be as secure as you think

SMU researchers identified two types of echo attacks that are noninvasive enough to go unnoticed by the victim: the in-channel echo attack, which uses carefully crafted signals to bypass echo cancellation, and the off-channel echo attack, which hijacks everyday sounds like email notifications to slip past defenses undetected.

These methods could allow a thief or spy, for instance, to learn whether a user is at home. An adversary could also determine where a user is each time they meet online, even if a virtual background is in use.

The research team's findings are based on six months of experiments at 12 different locations, ranging from homes and offices to vehicles and hotels.

"We all know that video conferencing systems utilize echo cancellation functions to suppress audio feedback and ensure call quality," Wang said. "However, we find that an adversary can leverage generative AI encoders to counteract such echo cancellation mechanisms and extract stable location embeddings from severely suppressed echo signals, even though they are nearly imperceptible to human listeners."

More information: Long Huang et al, Sniffing Location Privacy of Video Conference Users Using Free Audio Channels, 2025 IEEE Symposium on Security and Privacy (SP) (2025). DOI: 10.1109/sp61157.2025.00260

Citation: Video conferencing apps can leak location data through audio channels despite privacy controls (2025, October 28) retrieved 2 November 2025 from https://techxplore.com/news/2025-10-video-conferencing-apps-leak-audio.html
