Alexa, Cortana, Google, Siri user? Watch out for these inaudible command attacks

The researchers have also developed proof-of-concept attacks to illustrate how an attacker could exploit inaudible voice commands, including silently instructing Siri to make a FaceTime call on an iPhone, telling Google Assistant to switch the phone into airplane mode, and even manipulating the navigation system in an Audi.
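The description doesn't spell out how the commands are made inaudible, but this line of research typically amplitude-modulates ordinary speech onto an ultrasonic carrier above the range of human hearing, which a phone's microphone then demodulates back into the audible band. The sketch below illustrates that modulation step only; the file names, carrier frequency, sample rate, and modulation depth are illustrative assumptions, not details from the video.

```python
# Minimal sketch: amplitude-modulate a recorded voice command onto an
# ultrasonic carrier so the speech content sits above ~20 kHz.
# All parameters and file names here are assumptions for illustration.
import numpy as np
from scipy.io import wavfile

CARRIER_HZ = 25_000    # assumed ultrasonic carrier frequency
OUTPUT_RATE = 96_000   # high sample rate needed to represent the carrier

# Load a baseband voice command (assumed mono WAV file)
rate, voice = wavfile.read("voice_command.wav")
voice = voice.astype(np.float64)
voice /= np.max(np.abs(voice))            # normalize to [-1, 1]

# Resample the command up to the output rate (simple linear interpolation)
duration = len(voice) / rate
t_out = np.arange(0, duration, 1.0 / OUTPUT_RATE)
t_in = np.arange(len(voice)) / rate
baseband = np.interp(t_out, t_in, voice)

# Classic AM: carrier * (1 + m * baseband). Nonlinearity in the receiving
# microphone can shift this back down into the audible band.
m = 0.8                                   # assumed modulation depth
carrier = np.cos(2 * np.pi * CARRIER_HZ * t_out)
modulated = carrier * (1.0 + m * baseband)
modulated /= np.max(np.abs(modulated))

wavfile.write("ultrasonic_command.wav", OUTPUT_RATE,
              (modulated * 32767).astype(np.int16))
```

Played through a transducer capable of ultrasonic output, a signal like this would be inaudible to people nearby while still carrying the original command's envelope.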
