September 19th, 2022 — Josh
Photo by my uncle :)
Although I was never around to experience it, I miss the time when there existed a notion of computing for yourself. Computing for fun. In a way we are all personally computing constantly, but that's not what I mean.
Computers have magic in them, and with the help of programs we can hope to command that magic momentarily. Using a graphical interface isn't granular enough; it's like using other people's spells rather than writing your own. So now we're at this weird place where computer use is either (1) this half-baked consumerist practice, or (2) engineering. There's no room for software as a soap bubble; it's a nonexistent category.
Basically the point I'm making is that as programmers there's no line between professional and amateur. We force ourselves to only write the most robust and scalable programs, even if it's just for us. Best practices! We use industrial tools at home, the same infrastructure that we use at work and that powers the world.
I don't remember where I read this, but somebody pointed out the huge disparity between professional and amateur in arts like film, particularly in the tools available to each. Professional cameras are huge machines; no amateur uses them at home to make videos (not to mention other tools like high-grade microphones and lighting equipment). In a way it's a similar thing with music studios (all the mixing gear and synthesizers they have...), but that's sort of been disrupted by software (DAWs and VSTs). Regardless, there's a tangible difference both in the experience of creating these media (e.g. a band of folks in a garage vs. a graybeard audio engineer in a studio) and in the feeling of the result (e.g. funny cat videos vs. a Wong Kar Wai film). The People (amateur) and the Guild (professional) have traditionally been almost entirely different things; only recently has computing started disrupting this arrangement (is this what our post-scarcity world will look like?). But back to computer science specifically...
Is this a good or bad thing? In a way we can paint it as a victory for computing: the little guy has the same means of production as the big guys, we're all on a level playing field, etc. But on the other hand the little guy is now corporate. We have the same constraints as industry. This isn't for fun anymore; we're constantly being surveilled, and God forbid we don't follow best practices.
I'm not saying we should just be writing spaghetti code at home like nothing matters; not letting yourself lapse into writing BadCode™ is probably part of a healthy breakfast. Rather, I'm saying the priorities at home, for fun, for magic, for personal computing, are different. You don't want robust static silos, cathedrals, pyramids. You want dynamic silk: a spider building up its beautiful web at night and packing it up in the morning, only to do it again the next day. You see, it can't be too difficult. (Excuse my sad attempt at describing this poetically.)
I always see people say that as an industry we've sort of collectively agreed that metaprogramming is a bad idea. Okay, after spelunking in some legacy codebases I can understand why you'd want code to be as straightforward as possible. That's optimizing for reading, reliability, blah. But why should the technology I use at home, programming by myself for myself, for making my computer do magic for me, be the same as what I use on a team full of people who have to maintain the thing for years to come? For personal computing, metaprogramming makes perfect sense. This kind of programming should be fun, it could be witty, it could be whatever we want. IT SHOULD BE FREE. The capitalist ethos of "anything you do should be productive in some way" has seeped in.
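To show the flavor of thing I mean, here's a throwaway sketch in Python (all the names are mine, invented for illustration): a tiny object that conjures shell commands out of attribute access. It's exactly the kind of magic a team style guide would ban, and exactly the kind that's delightful at home.

```python
import subprocess

class Sh:
    """Any attribute becomes a shell command: sh.ls("-la"), sh.git("status")."""
    def __getattr__(self, name):
        def run(*args):
            # Shell out and hand back stdout as a string -- no error handling,
            # no escaping, no robustness. It's a soap bubble, not a cathedral.
            return subprocess.run([name, *args], capture_output=True, text=True).stdout
        return run

sh = Sh()
print(sh.echo("hello from my web of silk"))
print(sh.whoami())
```

A typo'd attribute blows up at runtime with FileNotFoundError, and that's fine; nobody has to maintain this but me, and tomorrow the spider can spin it differently.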
Software can be a soap bubble. It can be just for us. It can be just for fun. Write one to throw away. Nobody has to see it!! Oh the sigh of relief I'll breathe when I do something for absolutely no reason for the very first time. Freedom!!!!
I liked the attitude of G. Branden Robinson on the groff mailing list:
> Do you have the git repo in a public remote that I can check out?
Sorry, no--it just lives in my home directory with no replication to the Internet.
I know some people put their personal dotfile repos on the public Web but I'm neither that much of an exhibitionist, nor that proud of some of the kludges I come up with, nor willing to provide support services for such things.
I guess in a way the rise of open source has contributed to the death of personal computing: as we converge on the infrastructure of our society (the bedrock open source programs that everybody uses, industry and individual alike), the gap narrows between the professional and the amateur. We're probably timesharing the same machines, for Pete's sake; the cloud doesn't discriminate.
I'm not sure what the takeaway from these thoughts should be. On the one hand, cheap computing is helping us make progress in the class war and putting power at the fingertips of the everyman; every year we have more means of production built into our phones or available online, libre and gratis. But as a wise stock proverb once said, with great power comes great responsibility, and we have the responsibility to not forget how to be alive. Meaning, we should use this technology as a conduit for our soul, not as a replacement for it. Working hard to make something beautiful in a constrained environment can't be fully replaced by sufficiently advanced technology.
I'm somewhat genuinely worried about the human race becoming like the people from WALL-E. We already get food almost on demand, and entertainment to consume too. As of right now somebody alive has to produce that entertainment, but once we've figured out how to automate that (ComingSoonToATheatreNearYou) we'll just be amusing ourselves to death.
I want to read Computer Lib by Ted Nelson; I think it should answer some of my questions and give me new ones.
I couldn't help turning this into another existential tangent, but the original seed of this post was me wishing for a way to actually use my computer. The ongoing dialogue about static vs. dynamic languages is relevant here:
The difference between Lisp and Java, as Paul Graham has pointed out, is that Lisp is for working with computational ideas and expression, whereas Java is for expressing completed programs. As James says, Java requires you to pin down decisions early on. And once pinned down, the system which is the set of type declarations, the compiler, and the runtime system make it as hard as it can for you to change those assumptions, on the assumption that all such changes are mistakes you're inadvertently making.
[...] The screwed-up way we approach software development is because of what we have done with programming languages. With some exceptions, we have opted for optimizing the description of programs for compilers, computers, .... Interestingly, the results of this optimization are well-described by "Premature optimization is the root of all evil in programming.". [Knuth] was referring to the practice of worrying about performance of an algorithm before worrying about correctness, but the dictum can be taken to refer to any design problem where optimization is an eventual concern. In this case, the design problem was to design a usable programming medium that excels at enabling developers and designers to explore and discover, and to continue to enable discovery and exploration once well into the perfecting stage. Instead of waiting until we understood the ramifications of large system design and implementation using computer programming media, we decided to prematurely optimize for performance and optimization. And we got program description (or programming) languages instead—the root of all evil.
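To make that concrete (my sketch, not the quote's author's): in a dynamic language you can change your mind mid-flight, rebinding pieces of a running program instead of stopping the world to re-pin every decision. A toy Python session:

```python
# Exploratory mode: sketch a function, poke at it, swap it out live.
def score(word):
    return len(word)  # first guess at what makes a word "interesting"

words = ["soap", "bubble", "cathedral", "silk"]
print(sorted(words, key=score))

# Changed my mind -- no recompilation, no type declarations to appease.
# Just rebind the name and keep exploring.
def score(word):
    return sum(1 for c in word if c in "aeiou")

print(sorted(words, key=score))
```

Nothing profound, but it's the mode of working with computational ideas the quote is gesturing at, and it's the mode personal computing wants.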