New frontiers in self-incrimination
Jun. 9th, 2014 04:25 pm
Just read a recent brief in Scientific American, which serves as another reminder of how powerful Big Data is getting:
Apparently, a characteristic of digital cameras (not anything intentionally built in, just a consequence of how they are manufactured) is that each one winds up with subtle "noise" in the images it generates. That is, if your camera and mine shoot the exact same picture, the results might *look* the same, but at the fine-grained digital level there are slight differences. Those differences amount to a "fingerprint" for that camera, and can be extracted to serve as a fairly accurate identifier.
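For the curious, the extraction works along roughly these lines. This is just a toy sketch, not the researchers' actual method: it uses a simple Gaussian denoiser to pull out each photo's noise residual, averages residuals over several photos so the scene content cancels out and the camera's pattern remains, and then correlates a new photo's residual against that pattern. All of the function names and parameters here are illustrative, and the images are assumed to be same-size grayscale arrays.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def noise_residual(image):
    """Estimate the sensor-noise residual: the image minus a denoised copy of itself."""
    img = image.astype(float)
    return img - gaussian_filter(img, sigma=1.0)

def camera_fingerprint(images):
    """Average residuals over many photos so scene content cancels out,
    leaving (roughly) the camera's own noise pattern."""
    return np.mean([noise_residual(img) for img in images], axis=0)

def match_score(image, fingerprint):
    """Normalized correlation between a new photo's residual and a fingerprint;
    a higher score suggests the photo came from the same camera."""
    r = noise_residual(image).ravel()
    f = fingerprint.ravel()
    r = (r - r.mean()) / (r.std() + 1e-9)
    f = (f - f.mean()) / (f.std() + 1e-9)
    return float(np.dot(r, f) / r.size)
```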
The implication? In principle, at least, this means that your photographs are all effectively signed by your camera. On the positive side, this may prove lovely for law enforcement: if you have used your camera for nefarious purposes (e.g., child porn, terrorist activities, etc.) and also use the same camera to take innocuous pictures to post on Facebook, those pictures can be correlated at least well enough to make you a suspect. (The brief says that the accuracy rate is about 90%, with a 2% false-positive rate: not enough to hold up in court, but enough to define an initial suspect pool, especially when correlated with other data.)
The downside, of course, is that this is yet another difficulty in trying to maintain distinct and private identities online. You might have very well-separated Facebook identities for your work life and your private life, but if you are posting pictures on both using the same camera, that may someday wind up giving away your identity. Take due notice thereof...
(no subject)
Date: 2014-06-09 09:19 pm (UTC)
I have often wondered how sensitive this pattern is to, say, a 3x3 Gaussian blur, possibly with a tiny bit of noise thrown in before the blur to mess things up even more. Doesn't even have to be a 100% strength blur, so you don't lose too much image quality.
FB could offer this as a service when you upload your images! It sort of already does by re-encoding your image as you upload...
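A rough sketch of that blur-plus-noise idea, purely illustrative and with guessed parameter values (no claim that this is what Facebook's re-encoding does, or that it actually defeats the fingerprinting):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def scrub_fingerprint(image, noise_strength=2.0, blur_sigma=0.8):
    """Add a touch of random noise, then a light blur, to disturb the
    sensor-noise pattern without visibly degrading the picture.
    Parameter values are guesses, not tested against real fingerprinting."""
    noisy = image.astype(float) + np.random.normal(0.0, noise_strength, image.shape)
    blurred = gaussian_filter(noisy, sigma=blur_sigma)
    return np.clip(blurred, 0, 255).astype(np.uint8)
```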