Research at Intel Day showed off a lot of cool technologies, including one for manipulating encrypted images. That may not sound like a big deal; similar things happen all the time. The difference is that Intel is trying to do it without ever decrypting the image.
The idea is simple: companies like Facebook have a business model based on abusing your privacy, your natural tendencies, and your inherent trust in them. Since they have repeatedly been proven to abuse that trust, why give them any more info than you have to? Easy, they force you to. If you know how much any online service has to play with the uploaded data, you realize that there are all sorts of things they tweak in order to provide you with the services that you want. A server also can’t search an encrypted block of who knows what without decrypting it first.
If you send Facebook an encrypted post, they have to be able to decrypt it to know who you are mentioning so they can pester them with incessant emails, and to sell your innermost secrets to the highest bidder. While the first is ostensibly what you want, the second is what they require in order to make money. To make matters more complicated, both ‘services’ require that Facebook can decrypt whatever you send them. If you don’t let them do so, you will lose out on the core functionality of Facebook, and they will lose out on their core source of income.
This is the long way of saying encryption makes data unreadable to those without the key. Duh, that is the entire point of encrypting it in the first place. If you provide Facebook the keys, they can do what you want them to, and also what they want to. These things are by their very nature inseparable, or are they?
That is where Intel comes in, with a security-oriented demo of an encryption technology they are working on. It has two parts. The first is using the GPU to encrypt your images. This is somewhat old hat by now, and is done in OpenCL, which means Ivy Bridge or newer CPUs. The second is a bit more of a head-scratcher: it involves an algorithm that allows the underlying data, in this case images, to be manipulated without decrypting it first.
On the surface, this is a really good idea, and quite simple to explain to a layman. If you ask a non-technical person about it, they will often assume that is the way the Internet actually works. Technical users will immediately understand that doing this violates the very purpose of encryption, and is probably impossible. Why? Because it is encrypted.
Intel is not only attempting to do this, they have done it. The demo at Research at Intel Day took an image, encrypted it, and sent it to a server. So far, so normal. The server was then able to manipulate that image without decrypting it, specifically to perform resize and crop operations. This is not exactly magic; almost any image manipulation program can do both of these jobs on encrypted pictures. Data is data, after all, and as long as the file format is something it can parse, the manipulation algorithms are completely ignorant of what the picture is. Aunt Matilda crops and resizes just as well as completely random numbers wrapped in a .jpg header.
The magic comes in when you try to decrypt that data after manipulation. With normal algorithms, you will likely get a parsing error spat back at you, or if you are lucky, another stream of total gibberish. Intel’s algorithm does things differently: instead of errors, decryption gets you, wait for it, the picture, cropped and resized correctly. In technical terms, you can do DCT- and Huffman-type operations on the data without decrypting it.
There are of course caveats. The server has to use transform algorithms that are compatible with the cipher, and the manipulation of images is currently limited to two operations. Luckily these are the two most common things you do to a picture, so they probably have the majority of the market covered. Think 80/20 rule. Far more important is that there is a loss of fidelity in the result, something you might expect to happen. For the technically minded, it is in the 30dB range.
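For context, fidelity figures like that are usually quoted as PSNR (peak signal-to-noise ratio), and the formula is standard even if Intel didn’t spell out exactly how they measured. A quick sketch of the math, with made-up sample pixels chosen to land near 30dB:

```python
import math

def psnr(original, processed, max_val=255):
    """Peak signal-to-noise ratio in dB between two equal-length
    sequences of pixel values. Higher is better; around 30dB is
    visibly lossy but fine for casual viewing."""
    mse = sum((a - b) ** 2 for a, b in zip(original, processed)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * math.log10(max_val ** 2 / mse)

# Toy example: 8-bit pixels each off by 8 levels land right around 30dB.
orig = [100, 150, 200, 50]
proc = [108, 142, 208, 42]
print(round(psnr(orig, proc), 1))  # prints 30.1
```

In other words, a 30dB result means the decrypted picture differs from the original by roughly a few levels per 8-bit channel on average, consistent with “acceptable for casual viewing.”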
The results in the demo were quite acceptable for casual viewing, but I would not recommend uploading medical x-ray images for your radiologist to base treatments on. Then again, we are talking about Facebook, so I wouldn’t recommend uploading anything, but that is a different debate.
Intel obviously wasn’t saying exactly how this works, but if SemiAccurate had to make a semi-accurate guess, we would think that the symmetric key algorithm chunks things up into discrete blocks of regular data which are presented as ‘pixels’. If you have a 32-bit color depth, a 4 x 4 block would result in 512-bit chunks to manipulate, enough to give granularity while keeping things somewhat random looking. That said, we have no idea what is actually going on under the hood, but it does work.
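To illustrate that guess, and only the guess, here is a toy sketch of why block-wise encryption would let a crop commute with decryption. The XOR ‘cipher’ and every name below are purely illustrative stand-ins, not Intel’s method; the point is just that if each pixel is enciphered independently of its position, then crop(decrypt(x)) equals decrypt(crop(x)):

```python
KEY = 0xA7  # illustrative 8-bit key, not a real cipher

def encrypt(img):
    # Encipher each pixel independently (stand-in for a real
    # position-independent symmetric cipher).
    return [[p ^ KEY for p in row] for row in img]

decrypt = encrypt  # XOR is its own inverse

def crop(img, top, left, h, w):
    # The "server" side: crops pixels it cannot interpret.
    return [row[left:left + w] for row in img[top:top + h]]

# A 4x4 "image" of 8-bit pixels.
img = [[r * 4 + c for c in range(4)] for r in range(4)]

# Crop the encrypted image on the server, decrypt on the client.
server_side = crop(encrypt(img), 1, 1, 2, 2)
assert decrypt(server_side) == crop(img, 1, 1, 2, 2)
```

A real scheme would be far harder: resizing and JPEG-style DCT/Huffman operations mix neighboring values together, so the cipher has to survive arithmetic on the data, not just selection of it, which is presumably where the fidelity loss creeps in.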
With a bit of polish, this algorithm can support the best of both worlds: it keeps your pictures private and out of the hands of companies with the moral fiber that mass child murderers wince at, Facebook for example, but still lets said companies do what they need to do to keep storage and transmission costs down. Unfortunately for the whole paradigm, the first of those worlds is quite at odds with Facebook’s profit generation schemes.
Since the entire concept relies on the host buying in to the plan, anything that even potentially crimps their ability to abuse your data is not going to get far. For this decidedly non-technical reason, the idea of keeping private data in the cloud actually private is unlikely to see widespread adoption. This is unfortunate because Intel looks to have come up with something that not only uses a server’s GPU, but does it in a way that is beneficial to the end user. Cropping and resizing encrypted images is possible now; let’s hope some prescient company actually implements it. S|A