basic performance test of two different solid state drives on the nvidia jetson xavier nx

I ran another set of tests today on the Nvidia Jetson Xavier NX with the latest L4T, version 4.5, based on Ubuntu 18.04.5. I have two drives: a Western Digital Black NVMe 250 GB drive mounted in the NVMe slot on the bottom of the Xavier, and an SK Hynix 500 GB SSD attached to one of the Xavier’s USB 3.1 connectors with a StarTech USB3S2SAT3CB adapter.

The two tests are at the top. I used dd to write a solid 1 GB to each drive. Both are pretty fast for an SBC, with the NVMe coming in almost twice as fast as the USB SSD. I should note that the Jetson’s USB write speed is three times faster than the same experiment run on the Raspberry Pi 4 with a USB-mounted SSD.
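For the curious, the write test was nothing fancier than a single large dd. A minimal sketch (TARGET is a placeholder here; on the Xavier it would be a directory on the NVMe or USB SSD mount point):

```shell
# Sequential-write sketch: push 1 GB of zeros through dd, with
# conv=fsync so the reported rate includes flushing to the drive
# rather than just filling the page cache.
# TARGET is a placeholder; point it at a directory on the drive
# under test.
TARGET=${TARGET:-/tmp}
dd if=/dev/zero of="$TARGET/ddtest.bin" bs=1M count=1024 conv=fsync
```

dd prints the elapsed time and throughput on its last line of output; delete the test file afterwards.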

These are not meant to be definitive performance tests. All I’m looking for is a simple measure of merit. Let somebody else who has that much time to waste go off and do the exhaustive nit-picky performance tests…

As for the SSDs, I’m something of a digital pack rat. When interesting technology falls well below $100, I usually pick up at least one to experiment with. I picked up the Western Digital SSD when it dropped to $60 on Amazon last year, and the SK Hynix drive when it dropped to $40. I use the Xavier for many things besides TensorFlow, such as an AARCH64 build machine, and I move the build products onto the USB SSD to copy onto other machines, such as the Raspberry Pi 3s and 4s I have ‘lying around.’

Performant technology is getting so cheap.

nVidia Jetson Xavier NX showing its four USB 3.1 ports

reinstalling jetpack 4.4 official on the jetson xavier nx

I reinstalled JetPack 4.4 on my NX. It wasn’t like I had a choice in the matter. A couple of nights back I picked up the latest Ubuntu 18.04/L4T updates, and when the NX rebooted, it refused to come back up. After all that work playing with it, and with no backup, I pulled the current image from the Nvidia website and started over.

Which was, in hindsight, actually a Good Thing. The Developer’s Preview had picked up a bit of cruft from all my experimentation. Being forced to recreate the boot image wasn’t all that horrible; it just cost me a detour and a chunk of time I really didn’t want to give up.

It wasn’t all that bad, actually. After all, you flash a micro SDXC card, poke it into the NX, and apply power. Fortunately for me, the bits I cared about were on my blog, and I used my other Linux notebook to copy my home directory to a 64 GB thumb drive, from which I cherry-picked the bits I wanted to move over to the new install.
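The salvage step amounts to a recursive copy out, then a cherry-pick back in. A minimal sketch with stand-in paths (in practice SRC would be your home directory and DEST the thumb drive’s mount point):

```shell
# Copy a directory tree onto removable media, preserving permissions
# and timestamps, then pick files back out on the new install.
# Both paths are stand-ins: SRC here is /etc/skel rather than a real
# home directory, and DEST is under /tmp rather than a thumb-drive
# mount such as /media/$USER/<drive>.
SRC=${SRC:-/etc/skel}
DEST=${DEST:-/tmp/home-backup}
mkdir -p "$DEST"
cp -a "$SRC/." "$DEST/"
ls -A "$DEST"
```

From there it is just cp in the other direction for whichever files you decide to keep.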

One of the oddball problems I ran into was reinstalling Deno. On the 4.4 Developer’s Preview I’d installed Deno via Rust’s cargo and had version 1.0.2 up and running. This time, Deno had moved up to version 1.1.3 and failed to build via cargo. I got Deno back by pulling the source tarball off Deno’s GitHub site. I discovered the cause of the cargo build failure by watching the source tarball build: cargo pulls in deno_lint 0.1.16, which fails all over the place with unresolved calls, while the source tarball carries deno_lint 0.1.15, which builds. After the build, I pushed the binary over to ~/.local/bin (which is in my path) and carried on using Deno.
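The last step of that recovery is just getting the freshly built binary onto the path. A sketch of the mechanics, using a stub file in place of the real binary that a release build drops into target/release/deno:

```shell
# Stand-in for the build output; the real file would be the deno
# binary at target/release/deno inside the unpacked source tree.
mkdir -p target/release
printf '#!/bin/sh\necho deno-stub\n' > target/release/deno
chmod +x target/release/deno

# Install into ~/.local/bin and make sure it resolves on $PATH.
mkdir -p "$HOME/.local/bin"
cp target/release/deno "$HOME/.local/bin/deno"
export PATH="$HOME/.local/bin:$PATH"
command -v deno
```

On Ubuntu 18.04, ~/.local/bin is already added to $PATH by the default ~/.profile when the directory exists, so the export is only needed for the current session.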

Oh, and I pulled the latest Emacs from their git site, and I’m now running with version 28.0.50. Some things are actually a smidge better with this release.