When I wrote up the previous post, there were a couple of points I forgot to cover, and since then a few more things have come up on the same subject, so I want to go over them here as a follow-up.
I’ve been carrying on with package building since the previous article, keeping to the same theme: old computer, old OS, old compiler, what can I build? (A PowerPC Mac running OS X 10.4, with GCC 4.0.1.) I resort to GCC 5 if I need C11 support and can go up to GCC 7 if needed. I have not yet attempted to package a newer version of GCC, but at a certain point beyond GCC 7, the currently supported versions of GCC (11 and above?) require interim toolchain builds to get from a system with only GCC 4.x to 10+. Since GCC 5 alone takes around 24 hours to build on the hardware I have at hand, I’m putting off toolchain-related changes as much as possible.
One thing I seem to spend a lot of time on is digging up the requirements for software, since the current trend is, at most, to list the names of required components without any version info. There are a couple of scenarios where the requirements are skewed indirectly.
The software I want to build has modest requirements, but it depends on a critical third-party component which needs newer tooling, thus setting a higher requirement; sometimes it’s just the build infrastructure which has a far higher system requirement.
Of course, if I were working on a modern system with current hardware, this wouldn’t be a problem: either binaries would be readily available or building wouldn’t take much time. But if you’re going to strive for minimalism, perhaps avoid requiring build tools which themselves need the latest and greatest language support.
No matter how much time the shiny new build infra would save, it would take me many days just to get the tooling in place before I could even attempt to build your project.
I now find myself appreciating autoconf in these scenarios, and I think projects should carry the infrastructure for both autotools and the fancier tooling.
Autoconf may be old, odd, and slow, but it has been dragged across many scenarios and systems and has built up knowledge of edge cases. In the world of old machines and uniprocessors, something that reduces to a shell script and makefiles is just fine. A PowerBook would appreciate it; an iBook definitely would.
Hence, in cases where projects have moved on to different build infrastructure, I’ve been holding back versions, pinning things to the last release that supports autotools, for now. In doing this, I discovered that sometimes no mirrors are kept for the source files: the source archives are published on the project’s repo, but they are archives generated from the source tag, not the bootstrapped release archives which would have been published originally. That means a detour to bootstrap before attempting to build, plus additional build dependencies to pull in autotools. I guess I could patch the generated infrastructure in, having bootstrapped separately, but that would be a hefty diff.
Whilst making progress with packages, I discovered the source of my troubles with pthread support detection, which I wrote about in the previous post: the common failure is to check only for the presence of the header file, without checking the implementation details. It turns out this is largely due to gnulib, the GNU portability library that is used widely across the GNU ecosystem. The issue there is that detection of pthread support is split across several modules covering different aspects, such as “is there a pthread.h?” and “is there support for rwlock?”, and in this fragmented way it is possible to misdetect functionality, since projects are selective in what they import from the library. One issue, now fixed, was the misdetection of support for rwlock, which OS X Tiger and prior lack. If a project imports all of the pthread components from gnulib, detection of pthread support works correctly, the gnulib build process makes the correct substitutions, and its test suite passes OK. If only some parts have been imported, things fail when running the test suite since, for example, the test for pthread support tries to make use of rwlock in its testcase.
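To illustrate the difference, here is a rough sketch of the two kinds of configure-time probe involved; this is not gnulib’s actual macro code, and the file names and variable names are illustrative:

```shell
#!/bin/sh
# Sketch of a header-presence check vs. an implementation check,
# in the style of a configure script. On OS X Tiger the first
# succeeds while the second fails, which is exactly the gap that
# caused the misdetection described above.

cc=${CC:-cc}
workdir=$(mktemp -d)
trap 'rm -rf "$workdir"' EXIT

# Weak check: does pthread.h exist and compile at all?
printf '#include <pthread.h>\nint main(void){return 0;}\n' > "$workdir/hdr.c"
if "$cc" -o "$workdir/hdr" "$workdir/hdr.c" 2>/dev/null; then
    have_header=yes
else
    have_header=no
fi

# Stronger check: compile and link code that actually uses
# pthread_rwlock_t, the functionality OS X Tiger and prior lack.
cat > "$workdir/rwlock.c" <<'EOF'
#include <pthread.h>
int main(void) {
    pthread_rwlock_t lock;
    if (pthread_rwlock_init(&lock, 0) != 0) return 1;
    pthread_rwlock_destroy(&lock);
    return 0;
}
EOF
if "$cc" -o "$workdir/rwlock" "$workdir/rwlock.c" -pthread 2>/dev/null; then
    have_rwlock=yes
else
    have_rwlock=no
fi

echo "pthread.h usable:  $have_header"
echo "pthread_rwlock:    $have_rwlock"
```

A check shaped like the first one reports “pthread support” on Tiger; only the second catches the missing rwlock implementation, which is why importing the full set of gnulib pthread modules (rather than a fragment) gets the substitutions right.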
support in the testcase. There are many other issues with gnulib and its test suite broadly, completely separate from pthread support that for now I’ve skipped on running the gnulib tests. I am unaware of the scope of breakage in the library since attempting to generate a library comprised of all components in gnulib take some time and disk space to generate (documentation states 512MB disk space but I crossed a couple of gigabytes before aborting). I have been working my way through the GNU software projects to see what’s broken in the current versions related to gnulib. I’m sad to say GNU Hello is a casualty and currently unbuildable on OS X Tiger. There is a lot more work related to gnulib that I haven’t even started on.
Technical aspects of building software aside, one thing that keeps recurring is people’s responses to legacy systems. On the plus side, some folks work on restoring functionality on old systems; others question the age of a system and whether it is time to rip out support for it. I can’t help but wonder if this is conditioning from the Apple side, since support for other long-dead operating systems from extinct companies and products often remains in a project’s codebase. I was once asked, for a vacancy, whether I try new software and technology as soon as it becomes available. As a user of an iPod Classic, a PowerBook G4 running OS X Tiger, and a 2007 17″ MacBook Pro, with a resume mentioning packaging for legacy OS X, I answered yes and awaited hearing back from them.