amazon’s deepracer is still missing in action

You’re looking at Amazon’s DeepRacer, advertised as “the fastest way to get rolling with machine learning, literally.” DeepRacer is an interesting integration of a 1/18 scale electric race car chassis and an Intel dual-core Atom processor running Ubuntu. DeepRacer appears to be based on Amazon’s DeepLens, an AI training system Amazon released in 2017 (see “Exploring AWS DeepLens” for interesting details). DeepLens uses an Intel SoC, a dual-core Atom E3930 with an Intel HD Graphics 500 GPU. The GPU has 12 execution units, each capable of running 7 threads (a SIMD architecture), the equivalent of 84 cores. A SoC with this GPU is capable of doing interesting image processing on an “edge” device, with the Atom cores handling the basic integration and control of the entire device. And all of this ties back into Amazon Web Services (AWS) for further machine-based inference. It is thus a very interesting device, and all for the paltry sum of $250.
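That “84 cores” figure is just the execution-unit count multiplied by the number of threads each EU can keep in flight; it’s a loose equivalence, not a claim of 84 x86-class cores. A trivial sketch of the arithmetic, using the figures quoted above:

```python
# Back-of-the-envelope compute for the Intel HD Graphics 500,
# using the figures quoted above; "cores" here means hardware
# threads, not x86 cores.
execution_units = 12     # EUs in the HD Graphics 500
threads_per_eu = 7       # threads each EU can keep in flight
equivalent_cores = execution_units * threads_per_eu
print(equivalent_cores)  # 84
```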

A great tech bundle, if you can get one. DeepRacer was announced towards the end of 2018, and I placed my order for a unit on 1 January 2019 after it became available again (as of this writing it’s no longer available, again). The expected release date was 6 March, a mere three days from the date of this post. Yesterday I got an order update email from Amazon telling me that the delivery date for my unit was being pushed back six weeks, to 19 April at the earliest, with the option to cancel the order if I so wished.

At this point I’m not sure what to do. The DeepLens kit is $250 by itself and is also available on Amazon. If I were to build my own DeepRacer clone starting with a DeepLens, the cost would easily be double that of a real DeepRacer. A second alternative is to go full maker and build everything from scratch, substituting cheaper hardware all around, especially the processor. That might get me something approaching DeepRacer in functionality, but hacking the software to match even roughly isn’t something I want to take on. I’ve followed that lone-gunslinger, penny-wise-and-pound-foolish engineering path in the past because I had no other options, but this time around I really don’t want to go there. The inference engine that runs locally is coded around the on-chip HD Graphics 500 GPU, which isn’t something to sneeze at. It might not be an Nvidia or Apple class GPU, but it’s no slouch either. I would be hard pressed to find an equivalent GPU on a SoC, find that SoC on a cost-effective SBC, and find software to bootstrap on that mythical SBC, if it even exists.

I’m still not sure if I should cancel the order or just wait it out. My inclination is to keep the order in place and begin to look at possible alternatives. The AWS platform is free for experimentation, and it has a simulator, but the simulator is specifically for DeepRacer (obviously). Decisions, decisions…

the arcane part of the science lab

I’m not a professional photographer. I am, at best, a sporadic enthusiast, with periods where my cameras lie fallow, as it were. I have many other interests, so many that a few of them see even less activity than my photography. One of them is the practical application of the ubiquitous computers that have washed over us like a cybernetic tsunami.

I purchased a pair of Raspberry Pi Model B’s around the time I was laid off in 2013. With all that was happening in my life at the time, I left them sitting in the box they arrived in while I was scrambling to re-establish myself at another job. I kept the box out in plain view so that my little Raspberries wouldn’t wind up hidden away and eventually forgotten.

I ordered the Raspberries with the express purpose of using them for tasks far different from what was originally envisioned for them. Specifically, intelligent machines, or more romantically, robotics. I’ve been a robot hacker since the mid-1980s, when I used Intel 8051 single-chip microcontrollers with steppers, power FETs, and extremely simple photo sensors (diodes and transistors) to build rolling machines that could follow tracks and avoid obstacles. I had great plans to build up a small robotics business here in Orlando, but life and family got in the way, and in an odd turn of events I satisfied part of that itch when I worked for Birket Engineering as a sub to Universal Studios. When I left Birket I kept moving into ever more esoteric projects, such as Time Warner’s Full Service Network (FSN), the Theater Telemedicine Prototype Project, and finally modeling and simulation for training purposes using HLA federations. All along the way I kept an eye on robotics and embedded computing, waiting for that magic ‘tipping point’ moment when hardware, software, and serendipity would all combine.

No single moment really occurred, but a whole series of them did. That, combined with never quite having enough time, stretched out into years, until I finally hit 60 this past December. I’m no fool. Only by the wildest stroke of luck would I ever be able to contribute anything of significance to robotics, and it would take even wilder luck to make a decent living at it. Robotics, from hobby to full industrial, has exploded across the technological landscape. And it isn’t as if I tried to keep up across the years. Back when Radio Shack carried VEX Robotics kits, they decided they couldn’t sell enough of them and put all their remaining VEX stock on sale. I bought enough VEX gear for four complete bots and even picked up extra equipment for tank treads and special sensors that didn’t come with the starter kits. All this back in 2007, while I was still working for what used to be SPARTA in Orlando. It all eventually went into special containers, with a promise to myself to find out how to externally communicate with, and ultimately control, the VEX controller. That turned out to be fruitless, as VEX Robotics had (and still has, from what I can tell) locked their controller down pretty tight.

It isn’t as if the Raspberries are the first embedded ARM-based boards I’ve ever bought. Before the Raspberries I purchased a complete Gumstix kit (that was going to be the brains riding on the VEX machines). I got the Gumstix up and running Linux, using my bigger (openSUSE-powered) Linux notebook as the console, even upgrading the kernel on the Gumstix. I had built embedded Linux before; one of my tasks at SPARTA was to take small mini-ITX x86-based computers and turn them into very specific appliances hosting Linux. That’s when I really dug down into building a complete embedded Linux distribution (nothing graphical) from the kernel on up. Writing drivers, integrating hardware, building a WiFi network to mesh them all together: it was an enjoyable if intense part of my life. Unfortunately I couldn’t go much farther than getting the mini-ITX systems working as a reliable distributed platform. It had solid potential; it just needed some extra spark to give it a purpose and push it over the top. That spark never arrived. At the end of 2007 I put it away on a storage shelf back at the SPARTA office and left for yet another esoteric project at yet another company.

So here I am, with dirt cheap computing hardware, some new wild ideas, and a lot of dusty notebooks with old ideas. My problem is I’m not into wearable tech like Google Glass and Samsung’s Gear. If I had my way I’d be building adaptive mesh sensor networks roaming through space or sitting on the surface of the moon, gathering data and shipping it back to Earth. It’s my very firm belief that we put too many resources into too few missions (Curiosity, a technological tour-de-force to be sure, cost $1 billion for the rover alone, compared to about a tenth of that for either Spirit (MER-A) or Opportunity (MER-B) before it). Such networks would be a variation on micro-sats, an established technology that started with sending what were essentially re-purposed Android smartphones into space. I have a different vision for pervasive computing; my vision looks outward, to tools that amplify our abilities to learn more about the universe, rather than drive us inward to more trivial and irrelevant pursuits, such as how much of a Glasshole we can become to one another.

We shall see. Perhaps if I make more of this public it will become part of the overall motivation I need to finally get off my ass before I become so old I can’t do squat.

Photography Note

The photo of one of my two Raspberries was taken with the Samsung Galaxy S4 smartphone camera and post-processed with VSCO Cam on the phone. Lighting was off to the left with a simple Fotodiox LED panel. The backdrop was my Samsung Series 7 Chronos notebook. This is the last time I’ll deliberately choose to use this smartphone camera. I’m going back to one of my many µFourThirds cameras with an Eye-Fi card and the Android app on the smartphone if I need that kind of immediacy. I’ll only use the Galaxy S4 as a last resort.