Hmm, isn't this the old Unix idea of “layers”, where each layer is built to be as close to non-leaky as possible, so that you never need to investigate the layers below the one you're working with?
I think I learned this in school in like year 2000.
And I’ve strived to do that in software ever since - whatever I build - build the abstraction that makes solving the problem easy and then solve the problem with it.
That way when the inevitable change request comes, you don’t need to dive down the layers, just utilize the current tools.
And people gush when that happens, throwing around terms like 10x developer, ninja, etc. It's just consistently investing time to build those layers and then utilizing them.
But every single advance we’ve had in software engineering is thanks to abstraction. The issue isn’t that abstraction is bad, it’s that bad abstractions are bad. Finding good ones is hard, and people have begun to assume it’s impossible (or at least not worth the effort).
IMO this is a symptom of the exponential growth in our field. Experienced engineers who know how to do this well are outnumbered by well-meaning novices who output ten times as many LOC with half the value.
In my mind we shouldn’t be avoiding abstraction, we should be learning what makes one good and learning to emulate that.
Doug McIlroy once said something to the effect that the wonderful thing about his team was that he'd come into the lunch room, and instead of talking about new features requiring adding a bunch of code, they'd be talking about new abstractions allowing them to delete a bunch of code.
Back when Paul Graham was a cool programmer, he wrote in On Lisp, that the first step to creating a program is to use Lisp to create a language to write the program in.
The observation that so many people are incidentally dependent on software to do their jobs, and that said software is often broken, unreliable, or actively hostile, really strikes a chord with me. Humans have always had a tendency to develop a deep understanding of their tools, and even to refine them to better suit their needs. But when the tool is software, they are left totally at the mercy of whatever opaque GUI the distributor has cooked up. They have no opportunity to familiarize themselves with the tools they depend on or to sand down unpleasant irritations. It's anti-human, and it results in a learned helplessness about technology.

Certainly engineers have a role to play in improving their tools to prevent quality backsliding, but to really have transformative change for the everyday worker, the systems they use need to be open to modification, and basic education needs to cover at least the starting point for how to go about refining one's tools for comfortable day-to-day use. Engineering teams can't anticipate every need and preference of every non-software-developer in the world, regardless of how much great Rust code they write.
> Humans have always had a tendency to develop a deep understanding of their tools, and even to refine them to better suit their needs.
Many people in North America use a microwave every day. Does that mean they have or want a deep understanding of magnetrons? Likely not. Of course, this is because microwaves generally just work as expected. But even if they were unreliable, I can't imagine people being okay with home microwave repair becoming a thing.
If my job or hobbies required me to make frequent use of a microwave oven, to the point that its clunky interface became an issue for me, I imagine I would develop an operational understanding of magnetrons. That is, what they are good at heating, the shape of their heating effect, how best to place and prepare items for heating, and what parameters can be set to control their behavior, though I may lack a detailed knowledge of the physics involved. At this point it would be reasonable for me to want to configure my microwave, alter default settings, or add button behaviors to speed up my frequent use. I don't expect everyone to become proficient with every tool they interact with in any capacity. I do expect them to be able to grow proficient with tools that they use frequently and that matter to them on a professional or personal level.
I used to think this too, but I'm coming around to the idea that it was useful only in the period when software was special and used by a small number of people, rather than extremely ordinary and ubiquitous.
There's a different kind of high stakes involved when your software is used by (say) scientists in the defense industry to when your software is used by hundreds of thousands of fast food workers at mcdonalds. In the latter case, proficiency is useful but modifiability is not. So then the discussion needs to move to more human factors like discoverability, optimization of the fast path through the users' most frequent operations, and speed.
There's a corresponding objection to the idea of fast-food software, though, which is that the world doesn't have to be one-size-fits-all, and in fact we may be near or past a peak era for that approach to software: while everyone still wants and uses manufactured goods, the value chains are increasingly able to incorporate some kind of customization element and deliver services to meet precise definitions.
We know that it's more important to get right answers than fast answers, but the question of whether rightness needs to be shrinkwrapped is inconclusive and subject to the "bundling vs unbundling" innovation dynamic: there are a lot of scenarios where local agency is prioritized, e.g., US ground forces rely on being able to coordinate comms and intel well to achieve objectives within broad parameters, in a bottom-up fashion. That kind of approach does mean there's heavy standardization of specific techniques and technologies, but it's done to enable flexibility in other respects.
The problem though is the vastly different scale. Even small software components have thousands or tens of thousands of lines of code. Millions of lines of code is not uncommon for many commonly used tools.
Imagine a mechanical thing with a million interconnected gears. Would anyone really expect to understand it and be able to replace gears to make it fit their workflow better?
Even for developers of that software, it is often inscrutable, and it is often hard to estimate how a change in one feature might impact other features.
And that's an important aspect that engineers need to take into account: serviceability. We already kinda do this for our own sake; we do need to onboard developers and change functionality over time, after all. But there's an art to designing a complex system that is nonetheless scrutable and modifiable. There have been experiments in designing such decoupled and live-modifiable systems, like Smalltalk images and Lisp machines, but the closest thing we have to that in broad use is the web browser, with its inspection window and console. Of course, webpages are usually deployed as compiled blobs of script these days, but without an insane amount of effort one could build a webapp that was well suited to modification. Then the user agent would just need some way to save user modifications and apply them consistently. Basically like low-friction userscripts or extensions.
Somehow, I haven't seen the juxtaposition of ratchet vs lever before and that lens feels really powerful. I recently had a mechanical design problem where this view might even be literally helpful.
At the risk of losing all my meaningless internet points, I have to confess that Rust just doesn't feel like a ratchet or a lever to me. It feels a lot more like a large, iron shackle.
The problem with all ratchets is that they are difficult to unwind, typically because there is a dangerous amount of tension that the ratchet has accumulated over time as it's been wound up. The C language probably felt like a ratchet in 1970. Ditto OO. Today's ratchet is tomorrow's lethal deadweight. Doesn't mean we shouldn't keep trying.
100% agree that we should always be trying new approaches, ideas and perspectives. I'm all for trying new things and this Cambrian explosion of programming languages we're living through is a wonderful time to be alive.
I just do not happen to agree with the crowd that seems to believe the only value that matters is safety. I think there are many other values that are in tension with safety and that these should be more reasonably balanced against one another. I sincerely hope that Rust does not represent the end of history for programming language evolution.
That said, I think you raise a really useful historical perspective. C did indeed feel like dead weight to many people. There were many detractors who felt computers weren't (and would never be) fast enough for such a high-level language to be practical, and as languages have become more and more high-level, very similar concerns have been raised at every rung.
We have recently entered a period where natural human language is the input that resolves at least some subset of software creation problems.
Author here, and I agree that Rust shouldn't represent the end of history for programming language evolution. Vehemently. I have been saying for the better part of a decade now that I hope very much Rust is the first of a new breed of memory-safe systems-level languages, and that I also believe that we can do better with learnings from Rust as a stepping stone. Safety is really important along some of the axes I called out in the piece (“can you hand this to an unsupervised junior, or for that matter me on a bad day?”) but it is not the only important axis, and indeed Rust’s success has been in part because of many other very important things it got more right than predecessors or competitors in the same space (Cargo!).
But I am immensely encouraged to see Swift and Hylo and Vale all taking swings at the same problem space with very different emphases and approaches, and while I differ with Zig on some fundamentals I can totally imagine a language that grabs many of its good ideas along many of those from Rust which Zig drops, and goes somewhere better than either has managed so far.
I don't, though, agree at all about natural language solving problems here—rather the opposite, in fact. I think that in many cases, things like memory safety (and other kinds of safety!) are going to be more important to solve at language and framework level in a world where there is massively more code generated by prompting LLMs.
I seem to stumble into so many of the religious zealots, that it's really encouraging to see this from a Rust proponent.
I completely agree with you about natural language not solving the problem. That was a sentence out of a much larger paragraph that I deleted and then mistakenly submitted on my way out the door.
> The real wins, then, are tools which do not require everyone to be at their best at every moment: ratchets, not levers. Levers let you move things, but if you are holding something up with a lever, you have to keep holding it — forever. A ratchet lets you drive forward motion without slipping back as soon as you let up on the pressure.
I love love love this idea. I feel like there is always a curse of immediacy & visible progress that works against figuring out how to create relief & space & de-tension; few see the tradeoffs, & most of the org just wants features now, tomorrow, & always, & often, from a position of not knowing (ignorance), will argue for an express path that they think gives leverage.
This is such an interesting framing for what is, in our humble opinion, the real quest of software. The idea of the ratchet as allowing progress while forever acting as a backstop implies a paradigm that espouses the frontiersmen, the radicals, but always offers fallbacks, redoubts, safeties, and viewpoints to come back to.

There's an interesting nexus point of software & observability that I think has grown enormously, but is still viewed largely as ops, as site reliability. The submarine point, what's below the surface, is that the running of software slowly becomes more legible, easier to clearly see. Making good damned choices in your architecture that don't suck & are long-term good picks is like 1/3rd of the effort. 2/3rds+ of what enhances us, what positions us not to fail, is better total software visibility, is how we see our runtimes run the time; that ability to augment ourselves with an understanding of what the runtime is up to is what keeps us from falling backwards, is what keeps us on top of situations.
Our languages & libraries have made great strides, but the real computer revolution that's afoot is being able to see & understand our systems, and only a minority part of that is the arbitrary cobbling of architecture we do and a far greater part of that is systematics, is growing into these tools to see spans & traces, to see time run. This is a sense of computing humans are only just starting to evolve again, and it's IMHO how we ratchet ourselves up onwards & forwards.
The part at the beginning about how critical software is to our world reminds me of the time (or so I've heard, it was before my time) when there was a push to make software engineering a licensed field. A "real" engineering. I'm very happy that didn't happen.
Not to sound too much like an evangelist, but Nix is the first ratchet I've found for software deployment / OS management. Everything else has been a lever. Once you get something working, it works forever.
Since you are evangelizing a little bit, I have a question about nix. I'm a debian user, and I see debian as a base layer. It takes care of security updates and miscellaneous system configurations so I don't have to. In a way, it shifts under my feet, and I like that because those shifts are necessary for security and progress. How are you able to use nix in a way that allows those shifts to silently occur while at the same time maintain full control over your system's configuration? I imagine there must be some kind of tension between fully specifying what you want vs allowing experts to take care of configurations as they see fit.
You don't really fully specify most things. A complete working NixOS config with a few core utilities is like 50 lines without comments. Here is my main workstation's full definition: https://git.sr.ht/~chiefnoah/nixos/tree
It's split up into multiple files, but even the total combination is not that much. If you let Nix take over your system (i.e., NixOS), it moves out from underneath you in a completely reproducible and revertible way, so you can almost always just... run the equivalent of a dist-upgrade daily and get on with your life, because you're 1 command away from undoing it all.
You'll obviously have more the more you customize things, but for the most part it's services.<service>.enable = true;. The defaults are usually good enough IME.
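To give a sense of the scale involved, here's a minimal sketch of what such a NixOS configuration can look like. This is not the linked repo's actual config; the hostname, user, and service choices below are hypothetical, but the services.<name>.enable pattern is the real one:

    # Minimal NixOS configuration sketch (hypothetical machine and user).
    { config, pkgs, ... }:
    {
      imports = [ ./hardware-configuration.nix ];  # generated by the installer

      boot.loader.systemd-boot.enable = true;
      networking.hostName = "workstation";         # hypothetical name
      time.timeZone = "UTC";

      # Most services really are one-liners; the defaults cover the rest.
      services.openssh.enable = true;

      users.users.alice = {                        # hypothetical user
        isNormalUser = true;
        extraGroups = [ "wheel" ];
      };

      environment.systemPackages = with pkgs; [ git vim htop ];

      system.stateVersion = "24.05";
    }

You apply it with nixos-rebuild switch, and the "1 command away from undoing it all" part is nixos-rebuild switch --rollback (or picking an older generation from the boot menu).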
Your repo is very helpful. Really great comments. It's too bad that I'm so busy. The Nix sirens are certainly calling, but I'll have to plug my ears for now.
I love the idea of Nix, but every time I go back to it I've forgotten the syntax. So it feels much more of a lever. I think something built on the same ideas but with a more standard configuration language (which I guess is what Guix was meant to be?) could be a great ratchet.
For me it's not the language; the language seems quite small, in fact. It's the damn magic functions. They're everywhere, and how hard it is to find docs for a given function varies drastically.
It would be a lot better if you could effectively trace your config like a program with debug breakpoints, but I’ve not found that to exist.
I look forward to Nickel integration and never touching Nix again tbh.
Sure, it's basically JSON with semicolons... that's used to define a graph of nested lambdas which are evaluated recursively until a fixpoint representing the stable configuration is reached. Also, you can add comments, which makes it nicer than JSON!
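To make that slightly less hand-wavy, here's a toy sketch in plain Nix of what "lambdas evaluated to a fixpoint" means. This is not the actual NixOS module system (which layers option merging and type checking on top); it's just the underlying trick, with hypothetical values:

    let
      # The fixpoint combinator; laziness is what makes this terminate.
      fix = f: let x = f x; in x;

      config = fix (self: {
        hostName = "workstation";               # hypothetical value
        motd = "welcome to ${self.hostName}";   # refers to the *final* config
      });
    in
      config.motd
    # evaluates to "welcome to workstation"

Attribute sets are the "JSON with semicolons and comments" part; the fixpoint is what lets one option's value depend on the final, fully merged configuration.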
> Levers let you move things, but if you are holding something up with a lever, you have to keep holding it — forever. A ratchet lets you drive forward motion without slipping back as soon as you let up on the pressure.
The ratchet is the promise of literally every tool. The lever is the cold hard reality of every tool. Human beings are required to maintain, fund, update, and use the systems in question. Which means letting up on the pressure will quickly land you in the dustbin of history. Remember Google Wave? Yeah, nobody else does either.