A Rant from the Old Forge: On the State of Software Development

02 Jun 2025 - tsp
Last update 02 Jun 2025
Reading time 6 mins

“Not every innovation is progress. Not every dependency is help. Not every platform string is a limitation.”

There was a time, not so long ago, when building software meant just that - building software. Not downloading half the internet. Not praying for some server halfway across the world to be up so your build script doesn’t die. Not wondering which of the 73 versions of leftpad got pulled into your tree. No, software development used to be straightforward.

As someone who has coded in Assembly, ANSI C, C++, Java, Erlang, Forth, Fortran, C#, QBasic, Visual Basic, JavaScript and Python, and has at least touched Rust - I remember when one had distinct, minimal toolchains. Visual Basic or Visual C++ for Windows in the old days, gcc (or later clang) and make for Linux and the BSDs, and later of course on Windows too. Except on Windows, everything built from source, and if it didn’t, you wrote the tool that did.

The Age of Simplicity

Makefiles. They just worked. Clone a repo from SourceForge via cvs or svn, type make, and within minutes your software was built - on Linux, on FreeBSD, on Solaris, sometimes even on Windows. Sure, you had to write a few ifdefs, but the underlying principle was: source goes in, binary comes out. If you needed helper tools, they were built as part of the process - also from source.

Java had it even better: javac worked everywhere. Java’s portability was real. One jar to rule them all. Even Maven, when used sparingly, brought order - not dependency hell.

And yes, DLL hell on Windows existed. But it got solved. Mostly. Over time.

Then Came the Flood

Then the hellfire came.

Package managers. First came pip, then npm, then yarn, then cargo, then gradle (with artifactories), then the N-th flavor of “build orchestrator with plugins that download more plugins to build plugins that fetch binaries from the cloud”.

Want to build a simple application? Now you need to install 32 packages, 5 virtual environments, and 3 different versions of a JavaScript bundler - each one fetching its own version of the world.

Even worse, exact-version dependency pinning became the norm. You’re not declaring compatibility anymore. You’re saying: “This version and only this version will work, everything else will break.” So now every project drags along its own sealed time capsule of the entire dependency tree. It’s as if every application’s developers believed theirs was the only application ever running on the system - forever.
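To make the distinction concrete, here is a minimal sketch of pinning versus declaring compatibility, written with Python’s packaging library; the version numbers are made up for illustration:

    # Exact pinning versus a declared compatibility range (illustrative values).
    from packaging.specifiers import SpecifierSet
    from packaging.version import Version

    pinned     = SpecifierSet("==2.31.0")   # "this version and only this version"
    compatible = SpecifierSet(">=2.28,<3")  # a range the project actually supports

    for candidate in ("2.28.0", "2.31.0", "2.32.1"):
        v = Version(candidate)
        print(candidate, "pinned:", v in pinned, "compatible:", v in compatible)

Only the second form leaves the system any room to share, upgrade or patch a dependency without rebuilding the sealed time capsule.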

The Rise of Binary Artifacts

It wasn’t enough to fetch gigabytes of source. We started pulling binaries. Precompiled toolchains. Obscure shared libraries. Platform-specific artifacts. And surprise - if you’re not on Ubuntu LTS, good luck.

Trying to build an Android app on FreeBSD? Forget it. Not because it’s impossible - but because Gradle insists on downloading precompiled tools, and none are built for your platform. Even if you have already built the whole toolchain on your platform as native executables, Gradle ignores them. The same applies to a huge number of Node, .NET, and even Rust projects - despite the fact that the code could run just fine if built locally.

And don’t get me started on string-matching OS names in build scripts. “Darwin”, “Linux”, “Windows” - as if this crude taxonomy were what decides whether your code compiles or not. We’re back to “This web page works in Internet Explorer 6” levels of idiocy.
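The pattern usually looks something like this - a purely hypothetical fragment, but representative of countless real build scripts:

    # Hypothetical build-script fragment: an allowlist of OS name strings.
    import platform
    import sys

    system = platform.system()   # "Linux", "Darwin", "Windows", "FreeBSD", ...

    if system == "Linux":
        toolchain = "prebuilt-linux-x86_64"
    elif system == "Darwin":
        toolchain = "prebuilt-macos"
    elif system == "Windows":
        toolchain = "prebuilt-windows"
    else:
        # FreeBSD, OpenBSD, Solaris and friends all land here - even when a
        # locally built toolchain is sitting right there on the PATH.
        sys.exit(f"Unsupported platform: {system}")

    print("Using", toolchain)

Nothing about the code requires a particular kernel; only the string comparison does.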

What We’ve Lost

Today, people marvel at how quickly they can get a prototype running - and don’t get me started on vibe-coding “results”. But they don’t see the rot under the surface.

Where We Need to Go

If we want a sane future for software development, we need to steer the ship back to stability. Here are some commandments to consider:

Final Thoughts

We are building brittle sandcastles of software on shaky shores of dependency hell and version madness. The illusion of rapid development hides a future of painful maintenance, platform fragility, and security nightmares.

Let’s not forget the lessons of the past. Good software isn’t just fast to write - it’s also stable, portable, maintainable, compilable and usable in 5, 10, 20, 30 years. We should aim for that.

Not just for ourselves, but for the poor souls who will come after us - be it colleagues in the same company who have to keep the software alive, or people who have to work with our software years after we have stopped programming altogether.
