INTEL’S LAST KEYNOTE of IDF focused on TVs and how PCs could integrate with them. All it managed to do was convince me that the future is darker than I had feared, the wrong forces are in control, and Intel doesn’t understand this market.
The depressing bits came from two questions, one from me and one from Janet Ramkisoon of Quadra Capital. Mine was about privacy. Intel is making tools for TV and Internet convergence – overlays, search, and chat – but it isn’t being proactive about enforcing privacy. Intel did say it has people looking at the issue, and that it will abide by industry standards, but the problem is the industry that sets those standards.
You probably don’t realize this, but Blu-ray times out encryption keys so that you have to update the system, whether there is a reason to or not. This is a pain in the butt unless your Blu-ray player is connected to the net. At that point, the player sends the name of every disc you watch, for how long, what buttons you pressed, and what you skipped back to the mothership. You can’t stop it. If you do, your Blu-ray player stops working after a few months. Think about that next time you drop Naughty Nurses VI in – they are watching you.
Another company deeply involved in this convergence is Comcast, and Intel brought its CTO up on stage. Remember, Comcast is one of the pioneers of deep packet inspection (DPI), a technology that is not only counter to the intent of the Internet, but also allows the cable companies to screw you over in all sorts of hard-to-detect, antisocial ways.
Comcast is also at the forefront of the net neutrality debate, only it is on the wrong side. It actively wants to distort, block, and cap what you do on the net, and does not in any way have your best interests at heart. (Ironically, you are reading this from a server hosted on a Comcast connection.)
What discussion of how the industry lovingly cares for its customers would be complete without mentioning the Sony rootkit fiasco? How about black screens on HD content if one company doesn’t like its competitor? This is for your own good, not their margins. Honestly, just ask their PR people.
So, Intel is willing to work with the industry and industry standards on privacy and user rights. The problem is that said industry is Comcast, Sony, the makers of Blu-Ray, and others that haven’t been publicly outed yet. They make the standards, and sometimes even go as far as to make them sound palatable to the average consumer, in classic political doublespeak. In the end, these ‘protections’ are about as effective as tissue paper handcuffs. Wet single ply tissue.
These people, and I use the term loosely, have no intention of doing anything that slows down their ability to extract money from their victims, I mean customers. The rules that come out of these companies are laughable and wrong-headed, and that is exactly as intended. Why should they cost themselves money if they can repeatedly get away with abusing your rights for profit?
Intel is doing the wrong thing here. Rather than make tough standards that must be observed and enforced by anyone using its products, it waffles and passes the buck. This is the moral equivalent of selling illegal guns in gang territory, and when a grieving mother comes up and asks why her kid was shot, the answer is, “I didn’t shoot anyone, I just sold the guns. I am only trying to make a dollar.” Intel really needs to step up and set standards at the lowest level and work up from there. Real standards with teeth.
The other question, from Janet Ramkisoon, was about replacement cycles. Intel needs to make money selling chips that go into TVs, but TVs have a long replacement cycle, 10 years or more. PCs, on the other hand, have about a three-year replacement cycle. Selling one $10 chip every 10 years is not nearly as good a deal as selling one $100 to $400 chip every three years.
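The gap is starker when you annualize it. A quick sketch, using the ballpark prices and cycle lengths from the paragraph above (these are the article's rough figures, not Intel's actual pricing):

```python
# Rough annualized revenue per device socket, using the article's
# ballpark figures -- not Intel's actual prices or cycle lengths.

def annual_revenue(chip_price, replacement_years):
    """Average revenue per year from one device socket."""
    return chip_price / replacement_years

tv = annual_revenue(10, 10)        # $10 TV chip, ~10-year cycle
pc_low = annual_revenue(100, 3)    # low-end PC chip, ~3-year cycle
pc_high = annual_revenue(400, 3)   # high-end PC chip, ~3-year cycle

print(f"TV socket: ${tv:.2f}/year")                          # $1.00/year
print(f"PC socket: ${pc_low:.2f} to ${pc_high:.2f}/year")    # $33.33 to $133.33/year
```

On those numbers, a PC socket is worth 30 to 130 times as much per year as a TV socket, which is the heart of the question.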
To Intel’s credit, Eric Kim answered by pointing out that the market for these chips is huge and largely untapped. Intel can expand its presence from zero to a large number of chips sold, and market saturation won’t become a problem for years.
This brings up the next problem, obsolescence. With the convergence of TVs, PCs, and the Internet, one of the presenters pointed out that TV functionality is now on the Moore’s Law curve. The catch is that if you improve performance rapidly, you obsolete the TVs rapidly. Products based on the CE3100 (Canmore) chip are going to be obsoleted by the newer and faster CE4100 (Sodaville) part. That is what Moore’s Law does.
If you are an early adopter, your TV is now slower than that of your neighbor who waited a year. Things designed to work on Sodaville won’t work all that well on Canmore systems, and the same will be true for next year’s model. Consumers don’t want to replace their TVs every few years to make sure they meet the minimum CPU power for the hot widget of the moment. This paradigm will not play well in the consumer space, and we haven’t even touched on DRM infections and black screens from incompatibilities yet.
That is the sad part, but all is not lost. There were some good things in the morning keynotes, all technology based. The first and most important was GameTree.TV. You may know about Transgaming and its development efforts on Wine, or its Cedega and Cider products for bringing PC games to Linux and Macs respectively. GameTree.TV uses the same basic idea to bring PC games to Linux-based TV/Internet converged devices. You didn’t think these things ran Windows, did you?
GameTree.TV is a service that lets people browse, buy, and play games through a TV-oriented UI with wireless motion-sensing controllers a la the Wii. There are several models for getting the games – outright purchase, one-time use, rental, or ad supported – and the publisher decides which to choose. From that point, the game is downloaded to the TV, and the OS doesn’t matter, it just runs. This could be big.
Then there was 3D, lots of it, using vastly different technologies. The first big one was from HDI, using a laser light source and a Liquid Crystal on Silicon (LCoS) image chip. The light source is very bright, and the LCoS chip can switch fast enough to support 1000FPS.
HDI laser light sources and LCoS chips
The light source, shining through styrofoam cups, is so bright it washes out the colors leaking from the fiber optic cables, and those alone were almost painful to look at. The combined light was pure white and astoundingly bright. Together with multiple LCoS chips, you can have high-frame-rate 3D images with very wide color gamuts. If you recall, Intel tried to make LCoS chips a few years ago; that effort ended up as a disco ball made from prototype chips, a warning for future generations of Intel employees.
Moving from display to capture, we have 3ality, a company that makes the camera rigs for 3D content creation. The problem with 3D image capture is that the cameras need to be precisely aligned, or the image will have artifacts that the human eye is very good at detecting. For a static camera this is easily done, but if you move the camera, focus in and out, or worse yet walk around with it, the alignment will suffer.
3ality camera platform
To solve this, the big advance from 3ality is a camera platform that takes two existing Sony broadcast-quality cameras and keeps them in alignment. The platform itself looks to have piezoelectric stages on multiple axes that can keep the cameras precisely aligned. 3ality said the stages adjust the cameras every 2ms, essentially constantly.
Each camera dumps a 3Gbps stream of data to a central server, or 6Gbps for the pair. Add in 5Gbps of metadata from the pair, and you have a huge bandwidth problem on your hands as well. The pairs also have to stay in alignment with each other, so for something like a sporting event, you may have up to 40 camera pairs that have to keep in contact with each other, forming a global model of the arena in 3-space. Talk about compute intensive….
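The aggregate numbers above add up quickly. A back-of-the-envelope tally using the per-camera figures quoted by 3ality (the 40-pair count is the article's upper bound for a big venue):

```python
# Back-of-the-envelope aggregate bandwidth for a 3D sports broadcast,
# using the per-camera figures quoted in the article.

GBPS_PER_CAMERA = 3        # raw video stream per camera
METADATA_GBPS_PER_PAIR = 5 # alignment/position metadata per camera pair
CAMERA_PAIRS = 40          # upper bound for a large sporting event

per_pair = 2 * GBPS_PER_CAMERA + METADATA_GBPS_PER_PAIR  # 11 Gbps per rig
total = per_pair * CAMERA_PAIRS                          # 440 Gbps venue-wide

print(f"{per_pair} Gbps per camera pair, {total} Gbps for the venue")
```

That is 11Gbps per rig and on the order of 440Gbps for a fully kitted-out arena, before you even start building the 3-space model.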
In the end though, it just works. Justin Rattner stood in front of a live 3D screen, and even from the sharp angle I was sitting at, his interactions with the screen looked very convincing. We will refrain from making a joke about Intel upgrading its executives to appear 17% more lifelike this year; that is too obvious. [No you didn’t. Ed]
This 3D technology will be invading your house in a year or so, but wait until a few of the format details are hashed out before you buy.
One thought to leave you with, and it is about technology trends. Nvidia is pushing a 3D scheme that uses active glasses and active transmitters; it is expensive and simply doesn’t scale. The rest of the consumer electronics industry is going with polarized glasses or Dolby’s color-slicing scheme. Guess who is going to win – Samsung, Sony, Intel, the movie theaters, and just about everyone else out there, or Nvidia? S|A
Latest posts by Charlie Demerjian