An interesting paper was published earlier this year by a team from the University of Illinois at Urbana-Champaign, the University of South Carolina, and Zhejiang University. In short, it says that the accelerometer in your phone could give you away even if you’ve locked all your privacy settings down tight.
The idea is based on the fact that each accelerometer is unique at the lowest level, having minor but detectable differences in waveform or harmonic content. To the extent that the characteristic resonance of an accelerometer can identify it uniquely (or nearly so), it acts as a signature.
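To make that concrete, here’s a rough sketch of my own (not code from the paper) of how a fingerprint might be computed from a raw accelerometer trace. The specific features and the sampling rate are assumptions for illustration; the paper’s actual feature set is larger.

```python
# Illustrative only: a toy fingerprint built from a few time- and
# frequency-domain features of an accelerometer trace. The paper's
# feature set is larger; these are assumed stand-ins.
import numpy as np
from scipy import stats

def accel_fingerprint(samples, fs=100.0):
    """samples: 1-D array of accelerometer readings; fs: sample rate in Hz."""
    x = np.asarray(samples, dtype=float)

    # Time-domain features: basic shape statistics of the waveform.
    time_feats = [
        np.mean(x),
        np.std(x),
        stats.skew(x),
        stats.kurtosis(x),
        np.sqrt(np.mean(x ** 2)),  # RMS
    ]

    # Frequency-domain features: where the energy sits in the spectrum.
    power = np.abs(np.fft.rfft(x - np.mean(x))) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    p = power / (power.sum() + 1e-12)
    freq_feats = [
        freqs[np.argmax(power)],          # dominant frequency
        np.sum(freqs * p),                # spectral centroid
        -np.sum(p * np.log2(p + 1e-12)),  # spectral entropy
    ]

    return np.array(time_feats + freq_feats)
```

A vector of numbers like this, collected under some repeatable condition, is the kind of signature that would get stored and later searched.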
This means that an app can “record” a phone’s accelerometer and then store that signature in the cloud for future reference. Some other app can later sample the accelerometer and send its sample to the cloud, where a search engine can match it against the stored signatures and identify the phone. (This is the way music is identified these days, so there is clear precedent that the search side is doable.)
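Again just as a sketch of my own (the threshold and per-feature normalization are invented for illustration, not taken from the paper), the server-side match could be as simple as a nearest-neighbor search over stored fingerprint vectors:

```python
# Toy matcher: find the stored fingerprint closest to a freshly
# sampled one. Threshold and normalization are illustrative guesses.
import numpy as np

def match_fingerprint(query, database, threshold=2.0):
    """database: dict mapping device_id -> fingerprint vector."""
    ids = list(database)
    vecs = np.array([database[i] for i in ids], dtype=float)

    # Normalize each feature across the database so no single
    # feature dominates the distance.
    mu = vecs.mean(axis=0)
    sigma = vecs.std(axis=0) + 1e-12
    q = (np.asarray(query, dtype=float) - mu) / sigma
    v = (vecs - mu) / sigma

    dists = np.linalg.norm(v - q, axis=1)
    best = int(np.argmin(dists))
    if dists[best] < threshold:
        return ids[best], dists[best]
    return None, dists[best]
```

A real system would have to be far more robust than this, but the point is that nothing exotic is required on the search side.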
“Unique” may actually be an overstatement from a purely scientific standpoint. As they point out, they haven’t measured enough devices to prove uniqueness statistically across the many millions of phones out there, and they have no theoretical model that would predict uniqueness. But they measured 36 different time- and frequency-domain features across 80 accelerometer chips, 25 phones, and 2 tablets and came away pretty convinced that there is something to pay attention to here.
They discuss the possibility of “scrubbing” the measurements by adding white noise or by filtering, but each of the things they tried was either ineffective (the fingerprint survived) or too effective (it degraded the data enough to affect how an application operated).
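To see why that trade-off is awkward, here’s a rough sketch of the two scrubbing approaches they mention, as they might be applied to readings before an app ever sees them. The noise amplitude and filter cutoff are made-up knobs, not values from the paper.

```python
# Illustrative scrubbing of accelerometer readings before they are
# exposed to applications. Noise level and cutoff are assumed values.
import numpy as np
from scipy.signal import butter, filtfilt

def scrub_with_noise(samples, sigma=0.01):
    # Too little noise leaves the fingerprint intact; too much
    # swamps the motion signal that legitimate apps depend on.
    x = np.asarray(samples, dtype=float)
    return x + np.random.normal(0.0, sigma, size=x.shape)

def scrub_with_lowpass(samples, fs=100.0, cutoff=20.0, order=4):
    # A gentle cutoff passes the high-frequency quirks through; an
    # aggressive one also strips detail that, say, a step counter needs.
    x = np.asarray(samples, dtype=float)
    b, a = butter(order, cutoff / (fs / 2.0), btype="low")
    return filtfilt(b, a, x)
```

There’s no obvious setting of those knobs that kills the signature without also hurting legitimate apps, which is essentially the problem the authors ran into.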
To me, it seems like there’s an abstraction problem here. A phone has a raw accelerometer followed by a conditioning circuit and a digitizer. Eventually a value is placed in a register for retrieval by an application. In a perfect world, all distortions and anomalies would be “filtered” out by the conditioning and the digitization so that what lands in the register has been purged of errors, making all accelerometers look alike. That’s a pretty high bar to set, but you’d think that, even if not perfect, it would at least remove enough of the device-specific character to make a uniqueness determination infeasible.
Then again, as they point out, (a) it took 36 features to get uniqueness, and (b) if you couldn’t quite get there using just the accelerometer, you could also bring the gyroscope (et al.) into the picture, effectively adding more features to the signature. So any “cleanup” policy applied before the final value is registered would have to be applicable (and actually applied) consistently across a number of sensors. In other words, some fortuitous fix tied to how accelerometers happen to be built would be insufficient, since it couldn’t be used on a gyro as well.
The only other obvious solution would be policy-based. You could restrict low-level access, but that would rule out apps that need high-precision readings. The OS could flag apps that request low-level access and ask the user for permission, although presenting that request to a non-technical phone user could be a challenge. And the OS would have to actually inspect the program code to see whether it does low-level access; relying on declarations wouldn’t work, since the concern here is specifically sneakware, whose authors are not likely to volunteer what they’re about.
I’m curious about your thoughts on this. Are there other solutions? Is this much ado about nothing? You can read much more detail in the original paper, and then share your reactions.