
Recent Nushell/Rust Work

SQLite, file watcher, windows-rs

I’ve joined the Nushell core team. This doesn’t really change what I’m doing day-to-day, but it makes my work on Nu feel a little more official 🙂.

SQLite Support

This is the biggest feature I’ve implemented so far.

I’m pretty proud of how this turned out; it’s very convenient to be able to browse SQLite databases in your shell and interact with them the same way you would any other data source. Nu is often (but not always) smart enough to avoid unnecessary work when loading things from the database; there’s still some work to do here, and it will probably involve rearchitecting how Nushell queries data.
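To give a flavor of the interaction (the file and table names here are hypothetical, and the exact output has shifted a bit between Nu versions):

open chinook.db                          # lists the tables in the database
open chinook.db | get albums | first 5   # read a table and work with it like any other Nu data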

File watcher

I also implemented a watch command that runs arbitrary Nu code in response to file changes. Nothing groundbreaking, but I find myself needing this kind of low-key automation all the time: run tests when code changes, restart a development server, log changes in a directory, etc. I think the ability to respond to file changes should be a more widely available primitive, and now it is.
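For example, something along these lines (the exact flags have shifted a little between releases) re-runs the test suite whenever a Rust file changes:

watch . --glob=**/*.rs { cargo test }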

Rust for Windows

Against all odds, I somehow got sucked back into Windows development. I spent a solid week helping one of Nushell’s dependencies do a big upgrade of their Windows functionality. This required a deep dive into the current state of calling Windows APIs from Rust, and… it’s a mixed bag.

I used the windows crate, which is maintained by Microsoft. It’s an automatically generated set of Rust bindings for Windows APIs, which is both good (very comprehensive, always kept up to date) and bad (some rough edges that might be solved in a handmade solution like winapi). The crate is under active development and frequently has breaking changes; this means documentation is a little scarce and often out of date. Overall I was impressed and I think the crate has a bright future. But until it settles down a bit, expect some growing pains.

I’ve been using Rust full time for the last month and a bit while contributing to Nushell (more on that later). A lot has changed since I first tried Rust in 2019 and this is my first time working on a big Rust project. Here are some thoughts on the language while they’re still fresh in my head.

Compile times and feedback loops

Rust’s compile times are notoriously slow. Rust development was slow enough on my laptop that I finally gave up on mobile computing and bought a desktop with a top-of-the-line CPU (12900K). Along the way I switched from Windows to Linux (more on that later) and started using the mold linker, and now… things are OK!

I’m able to do incremental builds of Nushell (a huge project) in a second or two, and a full debug build takes 25s. For smaller projects, incremental builds are nearly instant. There’s certainly room to improve here, and the development experience is not great on average hardware, but… this works for me.

Another thing to consider is that the typical Rust feedback loop is tighter than the slow compile times suggest. The compiler catches a lot of bugs before you ever need a full build, which reduces how often you have to build and actually try things out.

Complexity + monotony

Rust is not a simple language. In total I’ve spent nearly 3 months working mostly in Rust, and the language still has a lot of corners that I don’t have a solid grasp on. To improve on this I’m going to need to branch out from Nushell and write a lot of little tools for myself.

Despite the complexity, I’ve found that writing Rust is sometimes a bit… braindead? The type system is very expressive and the compiler catches a ton of errors, so I spend 25% of my time thinking real hard and 75% painting by numbers to make the compiler happy. I can’t quite decide how I feel about this style of development; it can be a little tedious, but it also makes for a better end product.

I (sometimes) want a higher-level Rust

Rust has a lot of great things going for it; the tooling, community, package ecosystem, compiler, and syntax are all fantastic. But the focus on systems programming does mean that Rust isn’t quite as ergonomic as it could be for many use cases.

Sometimes I just want a garbage collector! Sometimes I’d be perfectly happy for Rust to implicitly allocate memory if it makes my code work (for example: converting from a &str to a String)! I don’t know if that will ever be possible in standard Rust, but… maybe there’s room for a Rust variant intended for higher-level use cases.
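To make the &str-to-String example concrete, here’s a tiny sketch (the function is made up) of the friction I mean; the call only compiles once you spell the allocation out yourself:

fn greet(name: String) {
    println!("Hello, {name}!");
}

fn main() {
    // greet("world");             // error[E0308]: expected `String`, found `&str`
    greet("world".to_string());    // the allocation has to be explicit
}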

On the other hand, the ability to go as low as you want is great. It’s nice to work in a language with a very “high ceiling”; no matter where your Rust project goes, you won’t have to switch to C or C++.

I recently spent a few days tuning Nushell’s GitHub Actions CI pipelines and it paid off: CI used to take about 30 minutes, and now it’s closer to 10. This is not pleasant or glamorous work, but it has a big payoff; every Nu change going forward will spend a lot less time waiting for essential feedback. Here’s how you can do the same.

Use rust-cache

Seriously, it’s really good! GitHub build runners are slow. But GitHub gives every repo 10GB of cache space, and rust-cache takes advantage of that. It caches temporary files for your build dependencies across CI runs, so if you have a lot of dependencies you’ll likely see a big performance boost.
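Using it is a single step in your workflow before the build; a minimal sketch (the action versions and build command here are just illustrative):

- uses: actions/checkout@v3
- uses: Swatinem/rust-cache@v2
- run: cargo build --workspace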

One gotcha to be aware of: GitHub Actions has slightly unintuitive behavior across PRs. PR X is unable to see cache data from PR Y, but they can both see cache data from the base branch (usually main or master). This makes sense from an isolation perspective, but it’s not especially well-documented; I ended up adding an extra CI trigger on main just to fill caches properly.

Split your build and test jobs

Previously we were running cargo build then cargo test in a single job. This was suboptimal for a few reasons:

  1. cargo test often needed to recompile crates that had just been built for cargo build. #[cfg(test)] is the most likely culprit here; it makes sense that build output might be different in “test mode”. This has implications for caching too!
  2. It’s faster to run build and test in parallel; GitHub gives us 20 build runners for free, and we might as well use them (see the sketch below).
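Here’s roughly what the split looks like in a workflow file (job names and cargo flags are illustrative, not the exact Nushell setup):

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: Swatinem/rust-cache@v2
      - run: cargo build --workspace

  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: Swatinem/rust-cache@v2
      - run: cargo test --workspace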

Run Clippy after cargo build

Previously we were running Clippy before cargo build. Just switching their order shaved about 5 minutes off every test run! It seems like Clippy can reuse build artifacts from cargo build, but not vice versa.
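The ordering that helped at the time was simply (illustrative steps):

- run: cargo build --workspace
- run: cargo clippy --workspace -- -D warnings   # could reuse artifacts from the build step above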

(Dec 2024: I’ve been told that this no longer works; it’s possible that something has changed in Cargo/Rust.)

Use cargo nextest

cargo nextest is “a next-generation test runner for Rust projects.” It’s dead simple to install in CI, and it’s often faster than cargo test. We didn’t see a huge benefit from this (maybe 30-40s faster?), but that’s because our CI time is dominated by compilation; YMMV depending on your code base and test suite.
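In a workflow, that looks something like this (taiki-e/install-action is one common way to grab a prebuilt nextest binary; adjust to taste):

- uses: taiki-e/install-action@nextest
- run: cargo nextest run --workspace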

Conclusion

If you’d like to see the actual changes, they’re all here. Like anything involving GitHub Actions, this took a lot of tries to get right; those 5 PRs are just the tip of the iceberg, and there were a lot more experimental changes in my private fork. I’m hopeful that someday we’ll be able to stop programming in YAML files, but we’re not there yet!

Nushell

Nushell v0.60 is out, and it’s fantastic. This is the first Nu release where I made meaningful contributions (mostly to the website+documentation) and it feels like a good use of my sabbatical time. It’s been interesting figuring out how to sell+explain Nu succinctly; writing good public-facing documentation is hard!

If you haven’t tried Nu, this is a great time to do so; Nu’s not stable yet, but I think you’ll be very pleasantly surprised by the level of polish. I’ve finally made it my default shell on both Windows+Linux.

Most of the work I’m doing for Nushell has a selfish motivation: I want to live in a world where POSIX shells are a thing of the past, and Nushell seems like the most promising way to get there.

Learning

I’ve started working through Crafting Interpreters by Bob Nystrom. My first exposure to Nystrom’s work was Game Programming Patterns, one of the best programming books I’ve ever read. The title’s a little unfortunate because it covers design patterns that are useful in any field of programming; I genuinely think GPP is much more useful to today’s programmer than the book that inspired it.

Crafting Interpreters walks you through building a scripting language from the ground up, first as an interpreter in Java and then in C; I’m doing the Java version in C# (personal preference and experience).

Other

I’ve started rekindling some old friendships with people I haven’t seen in person in 2 years, and that’s been really great.

Spring is finally arriving here in Vancouver, so I’ve been finding lots of excuses to be outdoors. My patio’s never been cleaner and I’m looking forward to a lot of spring gardening. I’d like to get some more trellises set up this year; I have a fairly small urban patio so it’s important to make good use of vertical space. “Green to the eye, not green on the ground.”

.NET has been advancing by leaps and bounds over the last few years, and not everyone is aware of it! We’re now able to build small, fast, single-file .NET applications; this is arguably Go’s biggest strength, and now you can do it in C# and F# too.

Meanwhile, the .NET community has been making excellent libraries to make console applications slick, polished, and easy to write.

The conditions seem right for a .NET renaissance of sorts; all the pieces are in place for us to build great CLI-first software. Here are some useful+easy things you can do in .NET today:

Publish Small Zero-Dependency Executables

In .NET 6 it’s easy to build your application as a single file with no external dependencies (not even the .NET runtime!). Applications that include their own runtime are called self-contained. Self-contained apps can be trimmed to remove unused code; trimming reduces a Hello World application from about 60MB to about 12MB.

Trimming is well-supported by nearly the entire standard library. Ecosystem support varies; if a library makes heavy use of reflection, C++/CLI, or COM interop then it might not work with trimming yet.

The easiest way to build a trimmed self-contained single-file executable is dotnet publish with the args --self-contained=true -p:PublishSingleFile=true -p:PublishTrimmed=true.
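Put together, that’s something like the following (the runtime identifier is just an example; pick the one for your target platform):

dotnet publish -c Release -r linux-x64 --self-contained=true -p:PublishSingleFile=true -p:PublishTrimmed=true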

dotnet-releaser makes publishing your app a breeze; it can build for multiple platforms+architectures, then publish the resulting artifacts in a GitHub release.

Console UI

Spectre.Console is a great replacement for Console.WriteLine() and friends. Coloring text with Spectre is as simple as:

AnsiConsole.MarkupLine("[blue]Hello[/] [yellow]World![/]");

It can also do much more; if you need to print tables or prompt users for input, Spectre.Console has you covered. But if you need to build a full-blown terminal UI, check out gui.cs.
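To give a sense of the table support, here’s a sketch using Spectre’s Table API (the contents are made up):

using Spectre.Console;

var table = new Table();
table.AddColumn("Tool");
table.AddColumn("Purpose");
table.AddRow("Spectre.Console", "console output, tables, prompts");
table.AddRow("CliWrap", "running external processes");
AnsiConsole.Write(table);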

Running External Processes

The System.Diagnostics.Process APIs built into .NET are clumsy, to say the least. Thankfully we have much better options these days! Here’s how to run git status with the excellent SimpleExec:

var (stdout, stderr) = await ReadAsync("git", "status");

For a more powerful solution, check out CliWrap; anything you can do in a Bash script, you can do in CliWrap.
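For example, a buffered one-off command looks roughly like this (a sketch; error handling omitted):

using CliWrap;
using CliWrap.Buffered;

// Run `git status` and buffer stdout/stderr so we can inspect them afterwards
var result = await Cli.Wrap("git")
    .WithArguments("status")
    .ExecuteBufferedAsync();

Console.WriteLine(result.StandardOutput);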
