It is an easy enough concept to grasp: take a carrier Ethernet switch and slap storage on it so you don't have to send the same bits over the long-haul wires again and again. This saves precious bandwidth to and from the remote sites, a scarce commodity in most places. The devil, however, is in the details, and those details are what ADVA's Cachejack is said to have worked out.
This little card is said to make a big difference.
Cachejack is the little card with three Ethernet and two optical ports on it. Like many of its ilk designed for carrier Ethernet, the FSP 150 chassis it plugs into is heavy on the control, management, and reliability side of things. From the user perspective, it just provides Ethernet. The Cachejack module puts two SD slots on the card for up to 64GB of storage, plus an external interface if you need more. That is the easy part though; making it work right without breaking TCP/IP is the hard part.
Caching switches, even video caching switches, have been around for a while, but doing it on a cell network is a lot more finicky. If you are caching movies like Netflix or a cable company would, you have a limited number of titles, users are at a set location that doesn't vary, and you can be pretty sure they will watch the entire movie, or at least cache it all. With internet users, especially mobile ones, this all goes out the window. Short clips rule, the content available is nearly infinite, and a speeding car can move between cells at a scary rate. How do you deal with this?
One big problem: if you don't control the content source, how do you deal with the TCP/IP ACKs and higher-level networking protocols? How do you tell YouTube that it doesn't actually need to resend the entire clip? What Cachejack does is strip the packets at the gateway to the carrier's network and send just a token to the Cachejack, with a similar flow back to the edge gateway from the cache. This can save massive amounts of bandwidth; you are effectively sending a subset of the video packet headers back and forth, but no data.
The sender, say YouTube, just does its normal streaming job. On the carrier side, packets are sent on to the destination until the cache has seen enough to determine whether a video is in the cache or not. If it is, the gateway is signaled, and any further packets are stripped down so only tokens go back and forth. This can save an order of magnitude or more of bandwidth.
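To make the token idea concrete, here is a minimal sketch of content-addressed deduplication, one plausible way to do what is described above. ADVA hasn't published the actual scheme; the chunk size, token length, and class names here are all assumptions for illustration.

```python
import hashlib

CHUNK_SIZE = 1400  # assumed: roughly one Ethernet payload per chunk


class EdgeCache:
    """Stands in for the storage on the Cachejack card at the edge."""

    def __init__(self):
        self.store = {}  # token -> chunk

    def put(self, chunk: bytes) -> bytes:
        token = hashlib.sha256(chunk).digest()[:8]
        self.store[token] = chunk
        return token

    def get(self, token: bytes) -> bytes:
        return self.store[token]


def gateway_send(stream: bytes, cache: EdgeCache, known: set) -> list:
    """Return what actually crosses the long-haul link: raw chunks on a
    cache miss, 8-byte tokens once the edge cache is known to hold the
    chunk. `known` tracks which tokens the gateway believes are cached."""
    wire = []
    for i in range(0, len(stream), CHUNK_SIZE):
        chunk = stream[i:i + CHUNK_SIZE]
        token = hashlib.sha256(chunk).digest()[:8]
        if token in known:
            wire.append(("token", token))   # data stays off the wire
        else:
            cache.put(chunk)
            known.add(token)
            wire.append(("data", chunk))    # first sighting pays full price
    return wire
```

The first playback of a clip crosses the link in full; a second viewer on the same cell costs only 8 bytes per 1400-byte chunk, which is where the order-of-magnitude savings would come from.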
Another trick the Cachejack can do is DNS caching; the mechanism is much the same as video caching. The DNS query is still sent along to the appropriate server, but like the video, things can be stripped, replaced, or otherwise twiddled to save bandwidth or give a more appropriate result. If done right, after the first request, subsequent DNS inquiries can see massively reduced latency from this kind of cache. ADVA didn't go into much detail on exactly how this works, but the possible mechanisms aren't that numerous.
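One of those possible mechanisms is an ordinary TTL-respecting cache at the edge, sketched below. This is a guess at the approach, not ADVA's implementation; the resolver callback and record format are illustrative only.

```python
import time

class DnsCache:
    """Edge DNS cache: the first query goes upstream over the long haul,
    repeats are answered locally until the record's TTL expires."""

    def __init__(self, resolver):
        self.resolver = resolver   # upstream lookup: name -> (address, ttl)
        self.entries = {}          # name -> (address, expires_at)

    def lookup(self, name: str) -> str:
        hit = self.entries.get(name)
        if hit is not None and hit[1] > time.monotonic():
            return hit[0]          # answered at the edge, no round trip
        address, ttl = self.resolver(name)
        self.entries[name] = (address, time.monotonic() + ttl)
        return address
```

The latency win is simply that the second and later lookups never leave the cell site, which matches the "massively reduced latency after the first request" claim above.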
One last technology the Cachejack enables is a combination of the previous two capabilities, seen by the user as location-aware advertising. This allows the Cachejack itself to give a more appropriate result to the end user than the one sent over the net, mainly because the Cachejack is location aware; the web server queried might not be. This raised a lot of red flags with the author; the propensity for evil here is immense. Worse yet, advertisers are generally not known for their restraint, much less troubled by ethical concerns like the rest of us.
ADVA made it clear that this is not Comcast and Time Warner style content hijacking, but an opt-in approach. Since this type of switch is almost always owned by a carrier, they can cut deals with advertisers to say, "We know you have ads going across our network, want to make them location aware?" If you search for McDonald's, instead of seeing an ad for the main corporate site, you can see one for the nearest location to you. The carrier can make more money by targeting the ad, and since it is cached, it can be done at the network edge without adding to the bandwidth load. For a carrier, less bandwidth used for more money taken in is the absolute best case scenario.
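The opt-in swap described above can be sketched in a few lines. This is a hypothetical model of the flow, not ADVA's code; every name below is made up for illustration.

```python
class AdEdge:
    """Edge-side ad substitution: swap in a cached, location-specific
    creative only for advertisers who have opted in; pass everything
    else through untouched."""

    def __init__(self):
        self.opted_in = set()   # advertisers who cut a deal with the carrier
        self.creatives = {}     # (advertiser, cell) -> locally cached ad

    def add_creative(self, advertiser: str, cell: str, ad: str):
        self.creatives[(advertiser, cell)] = ad

    def serve(self, advertiser: str, cell: str, upstream_ad: str) -> str:
        if advertiser in self.opted_in:
            local = self.creatives.get((advertiser, cell))
            if local is not None:
                return local      # served from the edge, no extra bandwidth
        return upstream_ad        # non-participants are never touched
```

The key property is the last line: unless an advertiser has opted in and a local creative exists for that cell, the ad from over the net goes through unchanged, which is what separates this from content hijacking.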
All in all, Cachejack seems like an interesting idea. From a layman's point of view, 64GB doesn't seem anywhere near enough to cache video in a useful way. How many people on the same cell as you are watching that same dumb cat video you are showing your friend? For that reason, it is hard to see the video side being effective, but the DNS and advertising sides have much clearer benefits. That said, if it didn't pay off, Cachejack probably wouldn't have made it to this stage of product development. It will be interesting to see if it takes off in the market.S|A
Latest posts by Charlie Demerjian (see all)