Think Siri doesn't listen? It does at times, so Apple has told contractors to stop reviewing recordings made through its voice assistant, the Guardian reports. This comes after a whistleblower report in the Guardian found that contractors, hired to maintain quality control, were hearing private recordings made unintentionally by Siri, including snippets of sex, business deals, doctor's visits, and illegal drug sales. "While we conduct a thorough review, we are suspending Siri grading globally," says Apple. "Additionally, as part of a future software update, users will have the ability to choose to participate in grading." Most of the recordings were apparently triggered by accident, either by a word or sound resembling the "Hey Siri" wake phrase or by the Apple Watch activating Siri when it's raised.
Sometimes "you can definitely hear a doctor and patient, talking about the medical history of the patient," says the whistleblower. "And you'd hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch." The story echoes earlier reports by Bloomberg and the Belgian station VRT about contractors evaluating Amazon and Google voice assistants. A lot of it is humdrum stuff—like confirming whether a user wanted a Taylor Swift song—but workers say they also heard bits of intimate conversations and possible crimes. "Too often we see that so-called 'smart assistants' are in fact eavesdropping," says a British activist. "We also see that they often collect and use people's personal information in ways that people do not know about and cannot control." (More Apple stories.)