Several sources speaking to Vice say that Microsoft hired contractors to listen in on audio picked up by Xbox One consoles.
The goal of bringing human listeners into the mix was ultimately to improve how the Xbox One understood voice commands, but former contractors say the work also involved listening to personal conversations in cases where the console's virtual assistant had been triggered unintentionally.
Microsoft has used the practice in the past for products like Skype and Cortana, and in both cases it was criticized as a breach of privacy once it became public knowledge. Apple and Google have likewise come under fire for similar practices.
In the case of Xbox, contractors speaking to Vice say the voice command system has used human oversight since at least 2014 to verify that audio was being transcribed correctly and that the system was taking the correct action.
The practice continued once the Kinect-driven voice command feature was transferred to Microsoft's Cortana virtual assistant on Xbox in 2016. Cortana has since been removed from the console itself, and can instead be used to issue Xbox commands through apps for Android and iOS.
"Most of the Xbox related stuff I can recall doing was obviously unintentional activations with people telling Cortana 'No' as they were obviously in the middle of a game and doing normal game chat," a current contractor tells Vice. Others, as detailed in the full article, say the majority of the activations they listened in on were triggered by children.