

Corduroy's Link Hoarding Cupboard





What is this page?

      Welcome to my humble abode. I've become something of an online trashman over the past few years -- I often stumble upon information that I don't want to lose, so I organize it here for my convenience (and maybe yours too; dig around and maybe you'll find something cool). This is also a place to commemorate people I admire. I like to humanize the gurus and legendary personas that I read and hear about. I find that birthdays help with this -- did you know that Ken Thompson and I almost share a birthday!? Have a good stay.



The history and people of computing

      Much like everything else in life, computer science didn't appear out of thin air. I seldom come across anyone interested in the culture and history of computing and its subgroups, which is sad and kind of strange. The public view of CS seems to paint it as an ultra-practical skillset and a pathway to economic success. Perhaps this is a symptom of the divide between STEM and the humanities, or the inevitable side effect of Late Capitalism. Either way, to the layman, this rich history is virtually unknown, and computer science seems to be boiled down to the technical sum of its parts.


With that in mind, here's a list of people worth knowing (in no particular order):

Richard Stallman (rms), Eric S. Raymond (esr), Linus Torvalds, Jon Hall (maddog), Dennis M. Ritchie (dmr), Ken L. Thompson (ken), Brian W. Kernighan (bwk), Donald Knuth (dek), John von Neumann, John G. Kemeny, Alan Turing, Ada Lovelace, Charles Babbage, Tim Berners-Lee, Doug Engelbart, Edsger Dijkstra, John McCarthy, Aaron Swartz, Terry Davis, Joe Armstrong, Francesco Vianello, Richard Feynman, Leonard Susskind, Isaac Newton, Gottfried Wilhelm Leibniz, Leonhard Euler, George Boole, Paul Erdős, Vint Cerf

This list is severely incomplete.



Great reads

* In the Beginning was the Command Line

* A history of the Amiga (parts 1 - 12) (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12)

* Everything on catb.org

* Learn about Free Software from Stallman, here

* Why Erlang is the only true computer language

* The Lesson of Grace in Teaching

* Gödel, Escher, Bach: an Eternal Golden Braid

* Ken Thompson's Password

* Four Days of Go

* Hackers: Heroes of the Computer Revolution

* Diligence, Patience, and Humility by Larry Wall

* The Humble Programmer by Edsger W. Dijkstra

* Breakable Toys

* Thoughts on Glitch Art by Nick Briz

* A series on parsing by Jeffrey Kegler (1) (2)

* Scribe: A Document Specification Language and its Compiler

* The Emperor's New Clothes

* The Meme Hustler (an essay on so many things, some of which I disagree with)

* Hackers and Painters by Paul Graham

* What Makes History? by Michael S. Mahoney

* The Eternal Mainframe by Rudolf Winestock

* The Web of Alexandria (1) and (2) by Bret Victor

    + Related: 'Prints' by Rob Pike

* The Rise of "Worse is Better" by Richard Gabriel

* The future of software, the end of apps, and why UX designers should care about type theory by Paul Chiusano

* The Emperor's Old Clothes by C.A.R. Hoare

* Semantic Compression by Casey Muratori



Favorite videos

* Python as C++’s Limiting Case, by Brandon Rhodes

* The Mess We're In, by Joe Armstrong

* Terry Davis's McDonald's Interview

* Brian Kernighan Interviews Ken Thompson

* DNA: The Code of Life by Bert Hubert

* Light Years Ahead | The 1969 Apollo Guidance Computer

* Escape from the ivory tower: the Haskell journey

* Steve Wozniak at "Intertwingled" Festival, 2014

* Ted Nelson's Eulogy for Douglas Engelbart

* Ted Nelson Understands Computers

* Jaron Lanier at "Intertwingled" festival, 2014

* Network Literacy by Howard Rheingold

* What is Cybernetics? Conference by Stafford Beer

* 200 Points of Light Demonstration


Birthdays!

Saint Ignucius aka Richard Stallman

March 16th, 1953

Ken Thompson

February 4th, 1943

John von Neumann

December 28th, 1903

Charles Babbage

December 26th, 1791

Kurt Gödel

April 28th, 1906

Slavoj Žižek

March 21st, 1949

Eric S. Raymond

December 4th, 1957

Dennis Ritchie

September 9th, 1941

John G. Kemeny

May 31st, 1926

Ada Lovelace (Augusta Ada King, Countess of Lovelace)

December 10th, 1815

Maurits Cornelis Escher

June 17th, 1898

Richard Feynman

May 11th, 1918

Jon 'maddog' Hall

August 7th, 1950

Brian Kernighan

January 1st, 1942

Alan Turing

June 23rd, 1912

Joseph Marie Jacquard

July 7th, 1752

Johann Sebastian Bach

March 31st, 1685

Larry Wall

September 27th, 1954



Computer-related

* Javidx9, a friendly computer wizard

* The Cherno Project's C++ Tutorials

* Harvard CS50 Intro To Computer Science Videos

* Stanford Computer Science Nifty Assignments

* Bisqwit's YouTube Channel

* Peter Norvig's Website

* Nand to Tetris

* Functional Programming in OCaml

* Computer Science from the Bottom Up

* +ORC

* Advent of Code

* Computerphile

* Live Overflow

* DaedTech (Existentialist Programmer Thoughts)

* Reflections on Trusting Trust

* DIY Raspberry Pi Case

* Nils M Holm's Bits and Pieces

* Handmade Hero (click this for some context)

* Moss (Measure Of Software Similarity)

* FreeBSD Journals

* Bit Twiddling Hacks

* Railroad diagram generator

* Find LaTeX Symbol's Code by Drawing It

* Kode Vicious

* Recursive Drawing

* The Whitney Museum Portal to Net Art

* WikiWikiWeb, the original Wiki

* Making Homebrew PS1 Games in C

* RackAFX7, a DAW plug-in designer

* Advice from Joe Armstrong

* Michael Abrash’s Graphics Programming Black Book

* Carbon, create and share beautiful images of your source code

* What every computer science major should know

* Real Programmer: The Story of Mel

* Racism is Socially Engineered Injustice

* pjw's face

* bikeshed

* How to Report Bugs Effectively

* Richard Feynman and The Connection Machine

* The Graphing Calculator Story

* Suricrasia Online's Online Library (Meme)

* The Cursed Computer Iceberg Meme

* The Tilde Club

* Kragen, Software Security Holes

Weblogs

* Drew DeVault's blog

* Norman Yarvin's blog

* Chris Wellons' blog

* Julia Evans' Blog

* Paul Graham's essays

* Steve Yegge's blog and his other blog

* Jeff Atwood's blog

* Joel Spolsky's blog

* Dan Luu's Site

* gwern.net

* Matt Might's blog

* Joe Armstrong's blog

* Geoff Greer's blog

Miscellaneous

* Professor Leonard's Math Videos

* Agadmator's Chess Channel

* Andrew Huang

* Food Wishes

* Go 2 Random Site

* Whitechapel: Documents of Contemporary Art

* Lil Ugly Mane

* Textures

* Freakshow Industries, VSTs

* Ascii Art

* Brian Raiter Makes a Unique Keyboard For His Daughter


Quotes, terms, an oxford comma, and etcetera

Playing over these moves is an eerie experience. They are not human; a grandmaster does not understand them any better than someone who has learned chess yesterday. The knights jump, the kings orbit, the sun goes down, and every move is the truth. It's like being revealed the Meaning of Life, but it's in Estonian.

Today science is biased towards rejecting any theory involving God. It is called naturalism. It proposes that any and all observable phenomena must have a natural cause. It also has the side-effect of barring God from having any influence in the world. If it appears that God has an influence in the world, a different explanation must be found. And men are masters of creating alternative explanations. It is called skepticism.

Education is an extension of the human instinct to reproduce. One of the things we should be doing is to purposely invent better ways to think, and then figure out how to teach them to children to create much more able adults than we are.

Anyone can cook, but only the fearless can be great

Hedonic treadmill

From soup to nuts

Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.

Most Non-Unix managers conclude that VI is either extraterrestrial in origin or was devised by the original Unix developers as part of a secret communications code to reach another dimension.

xyzmy@pantsexample.com -- To email me, first you have to remove my pants.

Benevolent Dictator For Life

The moral is obvious. You can't trust code that you did not totally create yourself.

im 86 years old, and, uh, im in the bathtub with two crocodiles, they're like pet crocodiles, cause real crocodiles have gone extinct, so they're like small and pink, and they're crawling all over me, im drinking nice, moroccan coffee, and im listening to old chief keef demos, and just like, beautiful, women everywhere, exotic pets, and i don't know i go down and check the mail

chrestomathy

In the 80s and 90s, engineers built complex systems by combining simple and well-understood parts. The goal of SICP was to provide the abstraction language for reasoning about such systems. Today, this is no longer the case. Engineers now routinely write code for complicated hardware that they don’t fully understand (and often can’t understand because of trade secrecy.) The same is true at the software level, since programming environments consist of gigantic libraries with enormous functionality. Programming today is “More like science. You grab this piece of library and you poke at it. You write programs that poke it and see what it does. And you say, ‘Can I tweak it to do the thing I want?'”. The “analysis-by-synthesis” view of SICP — where you build a larger system out of smaller, simple parts — became irrelevant. Nowadays, we do programming by poking.

I have no idea if other fields have this same problem — my guess is that physicists are particularly prone to it, since we are trained early on to think that physicists are simply smarter than chemists or biologists. Those other fields are for the hard workers. We don’t put mathematicians on this scale, because we secretly believe they’re smarter than us. (Note to the biologist lynch mob: tongue is in cheek.)

Enjoy your work

ex nihilo nihil fit, "nothing comes from nothing"

one should base one’s work satisfaction on realistic achievements, such as advancing the state of knowledge in one’s specialty, improving one’s understanding of a field, or communicating this understanding successfully to others, rather than basing it on exceptionally rare events, such as spectacularly solving a major open problem, or achieving major recognition from one’s peers.

There is no royal road to geometry

we should be getting as much baloney as possible out of our own sandwiches.

after cultivating the right habits around doing nothing, you can actually obtain animal consciousness where you do not desire to do things.

Everybody writes a screen editor. It's easy to do and makes them feel important. Tell them to work on something useful.

Dotfile Madness. We are no longer in control of our home directories.

What you do need is some amount of time spent on the idea that computer programs are mathematical objects which can be reasoned about mathematically. This is the part that the vast majority of people are missing nowadays, and it can be a little tricky to wrap your brain around at first.

of course it runs NetBSD

Ken Thompson was once asked what he would do differently if he were redesigning the UNIX system. His reply: "I'd spell creat with an e."

Radio on TV

C is a Spartan language, and so should your naming be. Unlike Modula-2 and Pascal programmers, C programmers do not use cute names like ThisVariableIsATemporaryCounter. A C programmer would call that variable tmp, which is much easier to write, and not the least more difficult to understand.

free software without any warranty but with best wishes

Why don't you tell us what you are attempting to accomplish and perhaps we can suggest an easier way to "skin this cat".

I find it bizarre that people use the term "coding" to mean programming. For decades, we used the word "coding" for the work of low-level staff in a business programming team. The designer would write a detailed flow chart, then the "coders" would write code to implement the flow chart. This is quite different from what we did and do in the hacker community -- with us, one person designs the program and writes its code as a single activity. When I developed GNU programs, that was programming, but it was definitely not coding. Since I don't think the recent fad for "coding" is an improvement, I have decided not to adopt it. I don't use the term "coding", except if I were talking about a business programming team which has coders.

anything that’s truly real can stand up to scrutiny

If you are anything like me, then you are not an astrophysicist.

nota bene, "note well"

Leaning toothpick syndrome

A notable group of exceptions to all the previous systems are Interactive LISP [...] and TRAC [...] Their only great drawback is that programs written in them look like King Burniburiach's letter to the Sumerians done in Babylonian cuneiform!

Neo-Luddism

I never metacharacter I didn't like

Sneakernet

lots of cream, lots of sugar

People often miss this, or even deny it, but there are many examples of object-oriented programming in the kernel. Although the kernel developers may shun C++ and other explicitly object-oriented languages, thinking in terms of objects is often useful. The VFS [Virtual File System] is a good example of how to do clean and efficient OOP in C, which is a language that lacks any OOP constructs.
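(For the curious, here's a minimal sketch of the pattern that quote is describing -- my own toy, with made-up names rather than the kernel's real ops structs. Each "object" carries a pointer to a table of function pointers, and generic code dispatches through that table rather than on any concrete type.)

#include <stdio.h>

struct file;

/* the "vtable": a struct of function pointers, in the spirit of the VFS ops tables */
struct file_ops {
    long (*read)(struct file *f, char *buf, long n);
};

/* the "object": state plus a pointer to its operations */
struct file {
    const struct file_ops *ops;
};

static long null_read(struct file *f, char *buf, long n)
{
    (void)f; (void)buf; (void)n;
    return 0;                     /* /dev/null-ish: always EOF */
}

static const struct file_ops null_ops = { .read = null_read };

/* generic code never looks at the concrete type, only the ops table */
static long do_read(struct file *f, char *buf, long n)
{
    return f->ops->read(f, buf, n);
}

int main(void)
{
    struct file f = { .ops = &null_ops };
    char buf[16];
    printf("read %ld bytes\n", do_read(&f, buf, (long)sizeof buf));
    return 0;
}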

Thompson: One day I got this idea -- pipes, essentially exactly as they are today. I put them in the operating system in an hour -- they're trivial, they really are super trivial when you've got redirecting IO like UNIX already had. The idea was just mind blowing to us. Dennis and I came in and rewrote everything in the world -- our world, in one night. We converted everything. Mostly what we did was throw out extraneous messages. Like, sort would never say "hey I'm sorting, I'm merging, I'm doing this, I'm working on this file". All that garbage is gone -- sort would read, sort, then write. And suddenly sort was what we'd call a "filter" in that day. Then we converted everything that processed something into filters. It was massive, and just exciting.
Kernighan: And the world changed, essentially overnight.
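(The "filter" shape he's describing is worth seeing in miniature. A toy sketch of my own, not any historical tool: read the stream, transform it, write it, and print nothing else -- the silence is what lets it sit in a pipeline between anything and anything.)

#include <ctype.h>
#include <stdio.h>

int main(void)
{
    int c;

    /* the whole program: stdin -> transform -> stdout, no chatter */
    while ((c = getchar()) != EOF)
        putchar(tolower(c));
    return 0;                     /* no "hey I'm working on this file" noise */
}

Compile it as, say, lower, and it composes for free: lower < draft.txt | sort | uniq.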

Thompson: I've always been interested in chess, I played it when I was in 7th grade because that's when Bobby Fischer was right at his height. Bobby Fischer and I are like 10 years apart in age, except that he's dead. So, I would come home and on the cover of Life Magazine there'd be "Bobby Fischer". And here I am, exactly the same age, and what do I do? So I felt very very, I don't know, worthless. I joined the chess club and played chess in school, and was good at it, but I didn't like it. I didn't like to win -- because you feel sorry for somebody who'd take it seriously, and I didn't like to lose of course. And that cut down on my options.

In order to do evil things like convert raw bytes to floats, I chose to use the “unsafe” package, which made me feel manly, powerful, and highly supportive of private gun ownership.

A mathematician’s distance from the center of his universe is often measured by Erdős number -- how many degrees of coauthorship separate him and the legendary Paul Erdős. It has been my good fortune to snag a Ritchie-Thompson number of one.

Java: the elegant simplicity of C++ and the blazing speed of Smalltalk.

Lehrer went on to describe his official response to the request to use his song: "As sole copyright owner of 'The Old Dope Peddler', I grant you motherfuckers permission to do this. Please give my regards to Mr. Chainz, or may I call him 2?"

To be “good at math” is a direct reference to the K to 12 math curriculum

0xDEADBEEF

Tautological cat is tautological

Erdős number

The majority of ordinary people want to live a peaceful life. In fact, the people who are the most determined to live a peaceful and calm life are constantly running into capitalism as an insurmountable obstacle to living it. Every single day we're pushed aside, pushed at the job, we're put under all kinds of pressure in the family and society -- anywhere we go in capitalist society, which impedes on our very modest wish to live a perfectly ordinary, non-eventful and harmonious life. Because of the pressures the crisis of capitalism puts on these people, you see that the people who are most determined to live a normal and tranquil life are pushed towards drawing more and more radical and revolutionary ideas, contrary to their wishes. And that in fact is the driving force of all revolutions.

To be a Hegelian is to have a big stomach. You've got the whole world in thought. That's how we understand someone like Zizek -- he's got a very big philosophical stomach... there's nothing he can't digest.

You are Not Expected to Understand This

‘You are not expected to understand this’ was intended as a remark in the spirit of ‘This won’t be on the exam,’ rather than as an impudent challenge.

What Are You Afraid Of? Another way of looking at it is that you’re picking a license based on what you are afraid of. All of these licenses assume you’re afraid of being sued. The MIT license is if you’re afraid no one will use your code; you’re making the licensing as short and non-intimidating as possible. The Apache License you are somewhat afraid of no one using your code, but you are also afraid of legal ambiguity and patent trolls. With the GPL licenses, you are afraid of someone else profiting from your work (and ambiguity, and patent trolls). This is a radical simplification, but if nothing else it can be a helpful framework in discussing with your attorney what license makes sense for your software.

Andy giveth, and Bill taketh away.

int risultato;
risultato = addizione(5,6);

Lorinda Cherry told me that RTM (senior) used to test people's programs by feeding them to themselves as input, a.out < a.out. It helped cure people of the assumption that a program would only see "reasonable" inputs.

There's More Than One Way To Do It

Richard Stallman - The Last of the Hackers, he vowed to defend the principles of hackerism to the bitter end. Remained at MIT until there was no one to eat Chinese food with.

Shaving off an instruction or two was almost an obsession with them. McCarthy compared these students to ski bums. They got the same kind of primal thrill from “maximizing code” as fanatic skiers got from swooshing frantically down a hill. So the practice of taking a computer program and trying to cut off instructions without affecting the outcome came to be called “program bumming”

To err is human; to forgive, divine.

Nothing further

The Useless Use of Cat Award

Helen: As you know, god hates brute force

The sonavabich didn't study astronomy

Tesla doesn't have a cafeteria

Unix herder

enough _ to be dangerous

Enjoy every sandwich

The hacker ethic that played such a large part in advancing computer science, building gcc, building Linux, indeed building the world's computer systems and engineering the biggest peaceful economic boom in history, is more than just a thirst for knowledge about computers. It's the obsessive belief that knowledge exists to be shared, that helping someone by making their computer run better (or their air conditioner) is one of life's joys, and that the rules that prevent sharing and helping exist to be broken. [...] The hero is the one who knows how to fix things, and fixes them -- despite not being "authorized." The evil is the paperwork we construct around ourselves, the forms and regulations that take the place of people freely helping each other.

Next time your boss comes to you and asks "Can't you just...?" Stop. Think about what he just asked. Your boss is managing complexity and he doesn't even know it, and he's just described the interface he wants. Before you dismiss him as asking for the impossible, at least consider whether or not you could arrange things so that it looks like you're doing the really simple thing he's asking for, rather than making it obvious to all your users that you're doing the really complex thing that you have to do to achieve what he asked for. You know that's what you're doing, but you don't have to share your pain with people who don't know or care about the underlying complexity.

The fact is, your brain is built to do Perl programming. You have a deep desire to turn the complex into the simple, and Perl is just another tool to help you do that--just as I am using English right now to try to simplify reality. I can use English for that because English is a mess. This is important, and a little hard to understand. English is useful because it's a mess. Since English is a mess, it maps well onto the problem space, which is also a mess, which we call reality. Similarly, Perl was designed to be a mess (though in the nicest of possible ways).

Our variable and function naming convention is to match the surrounding code. For example, if you see that variables use a CamelCase style, match that. If they use underscores, or are lowercase, match that. Readability and consistency within a section of code is of greater importance than universal consistency.

Home is a place where you are without having to justify why you're there.

If you are writing a script that is more than 100 lines long, you should probably be writing it in Python instead. Bear in mind that scripts grow. Rewrite your script in another language early to avoid a time-consuming rewrite at a later date.

This code is MIT Licensed. Do stuff with it.

Just some dotfiles son

** auuuuugggghhhhhh **
Oh bother and blast, I am mere version 3 compiler and cannot see into the future.
You have given me a version 5 program. This means my time on earth has come.
You will have to kill me. You will uninstall me, and install a version five compiler. I will be no more. I will cease to exist.
Goodbye old friend.
I have a headache. I'm going to have a rest...
**

Prison mellowed him wonderfully...Suffering either embitters you or, mercifully, ennobles you

Zettelkasten

не болтай ногами, "don't swing your legs"

I couldn't really learn Erlang, 'cos it didn't exist, so I invented it

I hate complexity. I tolerate it when there is no other way, but as my math kids say, if you have the right answer, it is beautiful and simple. Complex is reserved for when you haven't figured it out yet.

You must speak the native language of the community you want to be part of. If you deride their local language and proselytize Esperanto, the natives may not take kindly to you!

Finally, although the subject is not a pleasant one, I must mention PL/1, a programming language for which the defining documentation is of a frightening size and complexity. Using PL/1 must be like flying a plane with 7000 buttons, switches and handles to manipulate in the cockpit. I absolutely fail to see how we can keep our growing programs firmly within our intellectual grip when by its sheer baroqueness the programming language —our basic tool, mind you!— already escapes our intellectual control.

Argument three is based on the constructive approach to the problem of program correctness. Today a usual technique is to make a program and then to test it. But: program testing can be a very effective way to show the presence of bugs, but is hopelessly inadequate for showing their absence. The only effective way to raise the confidence level of a program significantly is to give a convincing proof of its correctness. But one should not first make the program and then prove its correctness, because then the requirement of providing the proof would only increase the poor programmer’s burden. On the contrary: the programmer should let correctness proof and program grow hand in hand. Argument three is essentially based on the following observation. If one first asks oneself what the structure of a convincing proof would be and, having found this, then constructs a program satisfying this proof’s requirements, then these correctness concerns turn out to be a very effective heuristic guidance. By definition this approach is only applicable when we restrict ourselves to intellectually manageable programs, but it provides us with effective means for finding a satisfactory one among these.

We all know that the only mental tool by means of which a very finite piece of reasoning can cover a myriad cases is called “abstraction”; as a result the effective exploitation of his powers of abstraction must be regarded as one of the most vital activities of a competent programmer. In this connection it might be worth-while to point out that the purpose of abstracting is not to be vague, but to create a new semantic level in which one can be absolutely precise.

The competent programmer is fully aware of the strictly limited size of his own skull; therefore he approaches the programming task in full humility, and among other things he avoids clever tricks like the plague.

Remember when, on the Internet, nobody cared that you were a dog?

In theory, you eschew ornamentation

C++ is several different languages in one compiler. You can use it as a stricter C, as a C with some syntactic support of ADTs, as C++98-style OO, or as a C++17 style meta-programming system. And I’ve probably missed a few. The resulting complexity requires a lot of discipline to use successfully, especially in a large team.

RIPJSB

“understanding hardware” was akin to fathoming the Tao of physical nature.

Kotok was not the only one preparing for the arrival of the PDP-1. Like a motley collection of expectant parents, other hackers were busily weaving software booties and blankets for the new baby coming into the family, so this heralded heir to the computing throne would be welcome as soon as it was delivered in late September.

weapons or tools that aren’t very trustworthy are held in very low esteem -- people really like to be able to trust their tools and weapons.

The man of the future. Hands on a keyboard, eyes on a CRT, in touch with the body of information and thought that the world had been storing since history began. It would all be accessible to Computational Man.

During the 1970s, when structured programming was introduced, Harlan Mills pointed out that the programming team should be organized like a surgical team--one surgeon and his or her assistants, not like a hog butchering team--give everybody an axe and let them chop away.

I am as proud as a mother hen

Open Source Won the Battle, But Lost the War... a larger share of computing is performed by fixed function products, where the ability to modify the product is not part of the value proposition for the customer. Now ironically, open source has flourished in these fixed function devices, but more often than not, the benefits of those freedoms being realized more by those making the products rather than end users (which actually was true of the software market even back then: Microsoft was a big consumer of open source software, but their customers were not). Similarly, one could argue that open source has struggled more in the general purpose desktop space than anywhere else, but as the web and cloud computing has grown, desktop computing has increasingly been used for a narrower purpose (primarily running a browser), with the remaining functions running in the cloud (ironically, primarily on open source platforms). In short: open source does really own the general purpose computing space, but the market has become more sophisticated.

Работайте, братья ("Work, brothers")

standard hackerese pejorative

But to Gosper, LIFE was much more than your normal hack. He saw it as a way to “basically do science in a new universe where all the smart guys haven’t already nixed you out two or three hundred years ago. It’s your life story if you’re a mathematician: every time you discover something neat, you discover that Gauss or Newton knew it in his crib. With LIFE you’re the first guy there, and there’s always fun stuff going on.”

"The lyf so short, the craft so long to lerne." - Chaucer

The original used the EBCDIC cent sign character to start and another cent sign to end the comment (i.e. programmer's two cents).

Bourne to Program, Type with Joy

PICNIC: Problem In Chair Not In Computer

ID10T

This might not seem like something that noteworthy, but I see it as the exact type of thing one would think of while taking a shower, which is why I love it.

I like to think of a cybernetic forest filled with pines and electronics where deer stroll peacefully past computers as if they were flowers with spinning blossoms.

It seems that many modern programmers prefer lasagna code to spaghetti code. I'm still debating which is worse. - Bill

Meritocracy was meant to be a satirical concept so clearly wrongheaded that nobody could think it was a good idea. There is no true meritocracy - there's only an arbitrarily chosen set of metrics that any given organisation considers to be of merit, and people who either benefit or lose out as a result of that specific choice. We don't understand community dynamics and the process of software development well enough to say with absolute certainty that a given set of metrics is objectively the correct measure, and as a result we cannot provide a meaningful definition of merit.

'Ken Thompson has an automobile which he helped design. Unlike most automobiles, it has neither speedometer, nor gas gauge, nor any of the other numerous idiot lights which plague the modern driver. Rather, if the driver makes a mistake, a giant “?” lights up in the center of the dashboard. “The experienced driver,” says Thompson, “will usually know what’s wrong.”'

Gods of BSD

Meanwhile, the GPL has legally enforced a consortium on major commercial companies. Red Hat, Novell, IBM, and many others are all contributing, and feel safe in doing so because the others are legally required to do the same. It's basically created a "safe" zone of cooperation, without anyone having to sign complicated legal documents. A company can't feel safe contributing code to the BSDs, because its competitors might simply copy it without reciprocating. There's much more corporate cooperation in the GPL'ed kernel code than with the BSD'd kernel code. Which means that in practice, it's actually been the GPL that's most "business-friendly". So while the BSDs have lost energy every time a company gets involved, the GPL'ed programs gain almost every time a company gets involved. And that explains it all.

No time for shoes; we sped here in our winged car.

ego death arms race

Dennis told me he was going to a class reunion at Harvard.
Me: "I guess you're the most famous member of your class."
dmr: "No, the Unabomber is."

give me a persian rug where the center looks like galaga

When "history" overtakes some new chunk of the recent past, it always comes as a relief--one thing that history does...is to fumigate experience, making it safe and sterile.... Experience undergoes eternal gentrification; the past, all parts of it that are dirty and exciting and dangerous and uncomfortable and real, turns gradually into the East Village.

The "Who wrote the Bourne shell?" question kind of reminds me of the old Bugs Bunny bit where he'd be on some radio gameshow and the host would ask, "Who's buried in Grant's Tomb?" and no one would get it right. (Except totally different because, of course, no one is buried in Grant's Tomb: Grant and his wife are entombed in sarcophagi above ground, not buried below.)

Prometheus: Get going then. Keep to the course you're on.
Ocean: You speak to one already under way. My four-foot bird is skimming with his wings the level paths of air. He will be glad to bend his knee, I think, in his own stall. (Ocean exits as he had entered, flying on his winged beast.)

Not everyone at the labs had a three-letter login. Bjarne Stroustrup had the login bs, despite several gentle suggestions from myself and others that he add a middle initial.

Before I was shot, I always thought that I was more half-there than all-there—I always suspected that I was watching TV instead of living life. People sometimes say that the way things happen in movies is unreal, but actually it's the way things happen in life that's unreal. The movies make emotions look so strong and real, whereas when things really do happen to you, it's like watching television—you don't feel anything. Right when I was being shot and ever since, I knew that I was watching television. The channels switch, but it's all television.

г на п (гавно на палке, "shit on a stick")

Under the siren song of affluence, we began offshoring critical production capacity in the 1960s for geopolitical reasons. In 1971, economist Nicholas Kaldor noted that American financial policies were turning "a nation of creative producers into a community of rentiers increasingly living on others, seeking gratification in ever more useless consumption, with all the debilitating effects of the bread and circuses of imperial Rome."

McIlroy implies that the problem is that people didn't think hard enough, the old school UNIX mavens would have sat down in the same room and thought longer and harder until they came up with a set of consistent tools that has "unusual simplicity". But that was never going to scale, the philosophy made the mess we're in inevitable. It's not a matter of not thinking longer or harder; it's a matter of having a philosophy that cannot scale unless you have a relatively small team with a shared cultural understanding, able to sit down in the same room.

p/q2-q4!

When port wine is passed around at British meals, one tradition dictates that a diner passes the decanter to the left immediately after pouring a glass for his or her neighbour on the right; the decanter should not stop its clockwise progress around the table until it is finished. If someone is seen to have failed to follow tradition, the breach is brought to their attention by asking "Do you know the Bishop of Norwich?"; those aware of the tradition treat the question as a reminder, while those who don't are told "He's a terribly good chap, but he always forgets to pass the port."

It is possible to make man suid to a user man. Then, if a cat directory has owner man and mode 0755 (only writable by man), and the cat files have owner man and mode 0644 or 0444 (only writable by man, or not writable at all), no ordinary user can change the cat pages or put other files in the cat directory. If man is not made suid, then a cat directory should have mode 0777 if all users should be able to leave cat pages there. - $ man man

The Collatz Conjecture was once called "a Russian plot to stagnate American mathematics"

# reverse every word, sort (so words get grouped by their endings), reverse back
rev < /usr/share/dict/web2 | sort | rev > ~/rhyming_dictionary.txt

If anyone even hints at breaking the tradition honoured since FØRTRAN of using i, j, and k for indexing variables, namely replacing them with ii, jj and kk, warn them about what the Spanish Inquisition did to heretics.

I remember a really interesting quote I've heard (forgive me if it's off, it's been a while): "originally a movie looks like it was made by 100 people, then you edit it to look like it was made by 1 person, and then the real magic happens when it looks like it was made by no one". I feel like spacemacs is currently on that 100 ppl choppiness atm, doom is at 1, and no one has figured out how to make it seem natural as it differs per person. I feel like the magic happens when I code my own config, as I know myself best. Idk if that makes sense.

The construction of software should be an engineering discipline. However, this doesn’t preclude individual craftsmanship. Think about the large cathedrals built in Europe during the Middle Ages. Each took thousands of person-years of effort, spread over many decades. Lessons learned were passed down to the next set of builders, who advanced the state of structural engineering with their accomplishments. But the carpenters, stonecutters, carvers, and glass workers were all craftspeople, interpreting the engineering requirements to produce a whole that transcended the purely mechanical side of the construction. It was their belief in their individual contributions that sustained the projects: We who cut mere stones must always be envisioning cathedrals. (Quarry worker’s creed).

Sometimes you want work computers to be sea monsters, not cattle.

Breakable Toys. You work in an environment that does not allow for failure. Yet failure is often the best way to learn anything. Only by attempting to do bold things, failing, learning from that failure, and trying again do we grow into the kind of people who can succeed when faced with difficult problems. Solution: Budget for failure by designing and building toy systems that are similar in toolset, but not in scope to the systems you build at work.

Lopatin’s own interests lay in filmmaking, and later, music journalism, but his parents’ experience with musical and creative censorship while behind the Iron Curtain would help inform his contextual, resourced approach to creating electronic music as OPN later on. “Music was rare and very special to them,” he remembers. “My dad has absolutely crazy stories about trading vodka for the microphone that a trolley driver uses to announce stops and then converting it for band practice, or listening to bootleg records pressed on X-rays. When you don’t have shit, you really learn to actually appreciate it. It’s so fundamental and yet so easy to overlook.”

Clifford Stoll: The first time you do something, it's science. The second time, it's engineering. The third time it's just being a technician. I'm a scientist -- once I do something I want to do something else.

Clifford Stoll: It's out of problems that you can't understand that you make progress. Doing science means bumping into something that makes you wonder, that you don't know the answer to, and in getting to the answer brings you to an understanding of the bigger world around us

The debugging strategy has been "Shake the tree, fix what falls out." That works really good at first, but now you have to shake the tree really hard to get anything to fall out.

Consider also this; all the time you spend studying is time you don't spend helping others, or keeping fit, or being a good friend. You need to decide what the balance of your priorities and time is, but I'd argue someone who spends more time seeking knowledge than being virtuous because seeking knowledge is virtuous is likely missing something very important. As for overspecialisation, people underestimate the importance of intersectionality. You cannot understand physics without a strong grasp of maths. You cannot understand literature without being aware of the historical context in which it was written. Study multiple areas to understand your own better would be my view, as it gives you that all-important tool: context.

Suppose that ah ken aw the pros and cons, know that ah’m gaunnae huv a short life, am ay sound mind etcetera, etcetera, but still want tae use smack? They won’t let ye dae it. They won’t let ye dae it, because it’s seen as a sign of thir ain failure. The fact that ye jist simply choose tae reject whit they huv tae offer. Choose us. Choose life. Choose mortgage payments; choose washing machines; choose cars; choose sitting oan a couch watching mind-numbing and spirit-crushing game shows, stuffing fuckin junk food intae yir mooth. Choose rotting away, pishing and shiteing yersel in a home, a total fuckin embarrassment tae the selfish, fucked-up brats ye’ve produced. Choose life. Well, ah choose not tae choose life. If the cunts cannae handle that, it’s thair fuckin problem. As Harry Lauder sais, ah jist intend tae keep right on to the end of the road.

Years ago, anthropologist Margaret Mead was asked by a student what she considered to be the first sign of civilization in a culture. The student expected Mead to talk about fishhooks or clay pots or grinding stones. But no. Mead said that the first sign of civilization in an ancient culture was a femur (thighbone) that had been broken and then healed. Mead explained that in the animal kingdom, if you break your leg, you die. You cannot run from danger, get to the river for a drink or hunt for food. You are meat for prowling beasts. No animal survives a broken leg long enough for the bone to heal. A broken femur that has healed is evidence that someone has taken time to stay with the one who fell, has bound up the wound, has carried the person to safety and has tended the person through recovery. Helping someone else through difficulty is where civilization starts, Mead said. We are at our best when we serve others. Be civilized.

Does what it says on the tin.

I have the right ideas, but my words are too complicated. I need to simplify them, so that people won't get lost in the dark when they see and hear them. I want them to shine like beacons of light in a world of overly complicated darkness. One day I will find the right words, and they will be simple.

In modern parlance, every single instruction was followed by a GO TO! Put that in Pascal's pipe and smoke it.

I have often felt that programming is an art form, whose real value can only be appreciated by another versed in the same arcane art;

Quiche Eaters

Most present-day inventions, it seems to me, do not differ much from the way humanity has been picturing them in imagination for countless generations. The submarine and aeroplane must have been foreseen by many of our species from the time they first observed the ways of the fish and the bird. All tele-instruments are probably what men always hoped they might be; and the destructive power of our hydrogen bombs is no more terrifying than the Biblical prediction of their impact. But I doubt whether even the most fertile imagination possessed by a mathematician a short century ago could have foreseen the wondrous features of our high-speed electronic computer -

A good programmer was concise and elegant and never wasted a word. They were poets of bits.

Blurbs are hilarious to me. Even modest and prudent writers suddenly become the most fiery and flowery spokespeople when asked to write a blurb. It’s amazing. It’s this whole register of language people would never use otherwise, part self-help and part advertising—you need this book, this book will help you live—written by “writers” who are all trying to outdo themselves. So it seemed like perfect source material for a poem.

We say most aptly that the Analytical Engine weaves algebraical patterns just as the Jacquard-loom weaves flowers and leaves

This is computer science. There aren't restrictions. I can do anything I want. It's just bits. You don't own me.

conceptual integrity

Scheme is an artisan’s little hammer

Dogfood. It is good practice to use the software one is developing on a daily basis, even when one is not actively working on developing or testing the product. This is known as "eating one's dogfood". Many bugs, usually user interface (UI) issues but also occasionally web standards bugs, are found by people using daily builds of the product while not actively looking for failures. A bug found using this technique which prevents the use of the product on a regular basis is called a "dogfood" bug and is usually given a very high priority.

The code base's age makes it likely that its authors by now either have advanced to management positions where reading books such as this one is frowned upon or have an eyesight unable to deal with this book's fonts. These changes conveniently provide me with a free license to criticize code without fear of nasty retributions.

A quick way to judge a language implementation is by inspecting its string concatenation function. If concat is implemented as a realloc and memcpy, well, the upstairs lights probably aren’t set to full brightness.
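(One reading of that jab, sketched as a toy in C -- hypothetical code, not from any particular language runtime: if concat is just realloc and memcpy, nothing remembers string lengths or reserves spare capacity, so every append re-walks and re-copies the destination, and building a string out of n appends costs O(n^2).)

#include <stdlib.h>
#include <string.h>

/* the dim-bulb concat: re-measure dst, grow it, copy src onto the end */
char *concat(char *dst, const char *src)
{
    size_t dlen = dst ? strlen(dst) : 0;   /* O(len) just to find the end */
    size_t slen = strlen(src);
    char *p = realloc(dst, dlen + slen + 1);
    if (p == NULL)
        return NULL;
    memcpy(p + dlen, src, slen + 1);       /* copy src and its NUL */
    return p;
}

int main(void)
{
    char *s = concat(NULL, "hello, ");     /* realloc(NULL, ...) acts as malloc */
    if (s)
        s = concat(s, "world");
    free(s);
    return 0;
}

A brighter implementation tracks lengths and over-allocates, so appends come out amortized O(1).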

Read broadly (there are a few books recommended in my earlier columns). Always take jobs that expand what you might do (I rarely take a job in the same industry twice). Have your own projects that you're interested in, and they don't need to be as large as a FreeBSD. Code for fun and not always for profit if you can afford it. I started by "playing with computers" as it was called when I was young and I continue to "play" as well as work. Undirected learning has a place.

Multics Emacs proved to be a great success — programming new editing commands was so convenient that even the secretaries in his office started learning how to use it. They used a manual someone had written which showed how to extend Emacs, but didn't say it was programming. So the secretaries, who believed they couldn't do programming, weren't scared off. They read the manual, discovered they could do useful things and they learned to program. So Bernie saw that an application — a program that does something useful for you — which has Lisp inside it and which you could extend by rewriting the Lisp programs, is actually a very good way for people to learn programming. It gives them a chance to write small programs that are useful for them, which in most arenas you can't possibly do. They can get encouragement for their own practical use — at the stage where it's the hardest — where they don't believe they can program, until they get to the point where they are programmers.

Excellent testing can make you unpopular with almost everyone.

But I was never a debugger-first programmer. None of us in the lab were, and that's probably why the debugging setup in Unix is to this day so weak compared to what other systems provide.

Rob's Rule - what a program presents as output it should also accept as input.

> "... But truth be known, I'm sort of a printf() debugger."
So am I, and ISTR Brian Kernighan and Larry Wall saying that they are, as well.

Who shall be the arbiters of good taste?

Yes, Virginia, it had better be unsigned

An étude (a French word meaning study) is an instrumental musical composition, usually short, of considerable difficulty, and designed to provide practice material for perfecting a particular musical skill.

Kabelsalat, German for "cable salad" -- a tangle of cables

If you want to abstract away the machine from the end user, it's wise to not propagate its internal word size to users trying to build user interfaces for scientific applications!

Finally there came in the mail an invitation from the Institute for Advanced Study: Einstein…von Neumann…Wyl…all these great minds! They write to me, and invite me to be a professor there! And not just a regular professor. Somehow they knew my feelings about the Institute: how it’s too theoretical; how there’s not enough real activity and challenge. So they write, “We appreciate that you have a considerable interest in experiments and in teaching, so we have made arrangements to create a special type of professorship, if you wish: half professor at Princeton University, and half at the Institute.”
Institute for Advanced Study! Special exception! A position better than Einstein, even! It was ideal; it was perfect; it was absurd!
It was absurd. The other offers had made me feel worse, up to a point. They were expecting me to accomplish something. But this offer was so ridiculous, so impossible for me ever to live up to, so ridiculously out of proportion. The other ones were just mistakes; this was an absurdity! I laughed at it while I was shaving, thinking about it.
And then I thought to myself, “You know, what they think of you is so fantastic, it’s impossible to live up to it. You have no responsibility to live up to it!”
It was a brilliant idea: You have no responsibility to live up to what other people think you ought to accomplish. I have no responsibility to be like they expect me to be. It’s their mistake, not my failing.
It wasn’t a failure on my part that the Institute for Advanced Study expected me to be that good; it was impossible. It was clearly a mistake—and the moment I appreciated the possibility that they might be wrong, I realized that it was also true of all the other places, including my own university. I am what I am, and if they expected me to be good and they’re offering me some money for it, it’s their hard luck.

capitalism in molecular life happens in plasma

I like to think of RPN as the computing equivalent of mise en place; before you start cooking, you get all your ingredients lined up.

Parse, don’t validate

Europeans tend to pronounce his name properly, as Nih-klaus Virt, while Americans usually mangle it into something like Nickles Worth. That has led to the programmer joke saying Europeans call him by name while Americans call him by value.

One thing to keep in mind when looking at GNU programs is that they're often intentionally written in an odd style to remove all questions of Unix copyright infringement at the time that they were written. The long-standing advice when writing GNU utilities used to be that if the program you were replacing was optimized for minimizing CPU use, write yours to minimize memory use, or vice-versa. Or in this case, if the program was optimized for simplicity, optimize for throughput.

an ornamental text

quintessentially collegiate in the sense that it involved ordering pizza.

If the Author of the Software (the "Author") needs a place to crash and you have a sofa available, you should maybe give the Author a break and let him sleep on your couch. If you are caught in a dire situation wherein you only have enough time to save one person out of a group, and the Author is a member of that group, you must save the Author.

>>> I wish it was as easy for others to have such satisfaction these days.
>>Amen to that.  I think we all lived, you Bell Labs people especially, in a simpler time.  We were trying to fit into 64K, split I/D 128K, my Z80 was 64K but some extra for graphics, then the VAX came and we were trying for 1MB, Suns with 4MB. So small mattered a lot and that meant the Unix philosophy of do one thing and do it well worked quite nicely. What that also meant, to people coming on a little bit after, was that it was relatively easy to modify stuff, the stuff was not that complex. Even I had an easy time, my prime was back at Sun when SunOS was a uniprocessor OS.  That is dramatically simpler than a fully threaded SMP OS that has support for TCP offloading, NUMA, etc, etc. I really don't know how systems people do it these days, it is a much more complex world. So I'm with Norm, it was fun back in the day to be able to come in and have a big impact.  I too wish that it was as easy for young people to come in and have that impact.  I've done that and it was awesome, these days, I have no idea how I'd make a difference.
>By doing something besides systems. There are tons of open source projects at the user level that make a difference. Consider something like AsciiDoc.  Less than 10 years old (methinks), and in use for production by at least one major technical publisher that I know of. (And it sure beats the pants off of DocBook XML.) Or the work I do on gawk, Chet on Bash, other GNU bits. There's a whole world out there besides just the kernel.

Rules, guidelines, and principles are gems of distilled experience that should be studied and respected. But they’re never a substitute for thinking critically about your work.

The other way, which is actually a really meaningful thing to me, and it’s super super subtle, is that I included a fast-forward button, so you can fast forward the music, you can skip to the next track, et cetera. But the subtle part is that, when you press the fast-forward button, you have changed the playback speed of the piece of music, but the random-number generator is moving ahead at the same rate. And so, the result of that is that you’ve now offset the music a little bit from the random sequence that it would have gotten had you not pressed that button. And, so you are suddenly getting an entirely new version of the piece. It sounds pretty much identical, because I use randomness in this textural level in the sound synthesis. It’s not like it has any control at the actual musical level. But the philosophical side of it is that if you never press the fast-forward button, then it’s this closed computational system, and it’s basically output-only. You know, you’re turning on the chip, the code starts running and it’s this closed system that mirrors some basic number theory. And the moment you press the button, then you’ve kind of interjected in this pure closed system, and you’ve brought in all of the messiness of the real world — like whether or not our time and space is grid-based on the lowest level, or whether or not we have free will, or where randomness comes in in quantum mechanics. All of these messy real world things suddenly come into the system and it’s no longer this clean abstract thing. And so, for me, what that also means is that the work of mathematicians, like Alan Turing and Kurt Gödel, these people who really explored the limits of computation, and the limits of mathematics, really — their work is applicable until you press the fast-forward button. It’s applicable so long as this is a deterministic system, and you kind of kill that determinism when you press the fast-forward button. So I think of that as, like, playing the piece.

And like the wooden boat, Morse code gives us a link with the past. The old salts of radio are out there, still pounding the brass, and willing to chat with anyone who makes a credible effort to enter their world.
I travel with a small radio that fits into an old laptop computer bag. I'll throw a wire out the hotel window, slip on my headphones, dim the room lights, and see what's coming in. Sometimes I'll hear weak signals with exotic distortions picked up in the ionosphere as the waves crossed the equator or bounced over the pole. I feel lightning bolts jump from my fingers when I squeeze the key to send my own thoughts back the other way.
Sometimes I'll hear the crisp clean signal of an A-1 operator, probably a retired operator, probably living in Phoenix and having trouble sleeping. Once for them code unlocked adventure. They traveled the world keeping in touch with dits and dahs. They make it sound like poetry. I like to meet these guys by radio. In code they don't seem so old.

KMS UXA DRM OMG WTF BBQ

Incidentally, adjusting the Q on a random EQ band with no cut or boost is a surefire way of convincing most clients that something sounds better...

What's interesting is that at the same time that we were doing AWK, there was a project at Xerox PARC called ... Poplar ... which had fairly similar goals. That is, they were going to take these files consisting of a lot of characters and you break them apart in various ways and you process them using some language and you pass them on. And they put a lot of work into clever ideas. It was a functional programming language. Which was ... especially then a big deal, now it's less of a big deal. And it was supposed to be user friendly ... They had all sorts of stuff. And AWK lived and Poplar died. And I don't know, you know, this has affected my view about people who talk about user interfaces fairly severely. I'm no longer convinced anybody knows anything about user interface. It's clear some things are easier to use and some things are harder to use. That's okay. But it's also clear that people, people learn a lot. ... although AWK s language is both C-like and not real elegant, a lot of AWK programming isn't done by getting the reference manual and writing code. What is done is by finding some AWK program that's very similar to what you want to do and changing it, okay? And then the fact that it's a little weird isn't so bad. It's, you know what you're just changing. AWK lives partly, I think, because of its programming by example.

computerology

I think I was here two years before I understood anything Ken said. Because if you ask Ken a question, he sort of gives you a one line answer to the question he thinks you ought to have asked approximately. I don't know what he thinks he's doing, but that's certainly what it seemed to be like he was doing. And you have to know a lot to understand a one line answer to these things. And after a while the answers become extremely informative. But it really took a long time before I just understood anything Ken was saying.

I’d love to see Jack Tramiel and Richard Stallman in a debate. God, that would be just great.

I think almost every experienced programmer has gone through three stages and some go through four:
Cowboy coders or nuggets know little to nothing about design and view it as an unnecessary formality. If working on small projects for non-technical stakeholders, this attitude may serve them well for a while; it Gets Things Done, it impresses the boss, makes the programmer feel good about himself and confirms the idea that he knows what he's doing (even though he doesn't).
Architecture Astronauts have witnessed the failures of their first ball-of-yarn projects to adapt to changing circumstances. Everything must be rewritten and to prevent the need for another rewrite in the future, they create inner platforms, and end up spending 4 hours a day on support because nobody else understands how to use them properly.
Quasi-engineers often mistake themselves for actual, trained engineers because they are genuinely competent and understand some engineering principles. They're aware of the underlying engineering and business concepts: Risk, ROI, UX, performance, maintainability, and so on. These people see design and documentation as a continuum and are usually able to adapt the level of architecture/design to the project requirements.
At this point, many fall in love with methodologies, whether they be Agile, Waterfall, RUP, etc. They start believing in the absolute infallibility and even necessity of these methodologies without realizing that in the actual software engineering field, they're merely tools, not religions. And unfortunately, it prevents them from ever getting to the final stage, which is:
Duct tape programmers AKA gurus or highly-paid consultants know what architecture and design they're going to use within five minutes after hearing the project requirements. All of the architecture and design work is still happening, but it's on an intuitive level and happening so fast that an untrained observer would mistake it for cowboy coding - and many do.
Generally these people are all about creating a product that's "good enough" and so their works may be a little under-engineered but they are miles away from the spaghetti code produced by cowboy coders. Nuggets cannot even identify these people when they're told about them, because to them, everything that is happening in the background just doesn't exist.
Some of you will probably be thinking to yourselves at this point that I haven't answered the question. That's because the question itself is flawed. Cowboy coding isn't a choice, it's a skill level, and you can't choose to be a cowboy coder any more than you can choose to be illiterate.
If you are a cowboy coder, then you know no other way.
If you've become an architecture astronaut, you are physically and psychologically incapable of producing software with no design.
If you are a quasi-engineer (or a professional engineer), then completing a project with little or no up-front design effort is a conscious choice (usually due to absurd deadlines) that has to be weighed against the obvious risks, and undertaken only after the stakeholders have agreed to them (usually in writing).
And if you are a duct-tape programmer, then there is never any reason to "cowboy code" because you can build a quality product just as quickly.

One thing you have to be careful about, though, is that duct tape programmers are the software world equivalent of pretty boys… those breathtakingly good-looking young men who can roll out of bed, without shaving, without combing their hair, and without brushing their teeth, and get on the subway in yesterday’s dirty clothes and look beautiful, because that’s who they are. You, my friend, cannot go out in public without combing your hair. It will frighten the children. Because you’re just not that pretty.

Hackers don't write documentation, or plan out their programming, instead they write Manifestos.

"Hypergrunge" was more specific, because a lot of the peripheral ideas about this record are like very directly dealing with observations in the music industry, and grunge is like, an insane thing, because it's a fallacy, it's totally fabricated. All grunge is hypergrunge. It's been synthesized by all of these marketing factors. You know about grunge speak? Or the story of grunge speak? There was a secretary at the Sub Pop office in the '90s, she was always getting interviewed because journalists would just come in and be like "So, what's grunge?" And she got sick of it, and she started tricking them or lying to them, like "Yeah, there's a whole language to grunge, dude." And she made up all these words that were like, grunge speak. [The New York Times] went and published this thing that she tricked them with, and I was so inspired by that.

How does one patch KDE2 under FreeBSD?

pete burkeet
this is a brilliant exercise. we need to talk.

trips around the sun

These compromises mean that we can best think about typeface design as the creation of a wonderful collection of letters but not as a collection of wonderful letters.

the medium is the massage

Piratical Practices

Ever since I heard this story, I can't help but see that we live in a world brimming with "electric" meat. Learn to find it. Then get out there and grab it.

I have not reached the burrito point yet

These digits that Samson had jammed into the computer were a universal language that could produce anything—a Bach fugue or an antiaircraft system.

Yes, formerly they wanted your blood, now they want your ink.

der Herr Warum ("Mr. Why")

Ever since the 50's, the fashion in language design has been minimalism. Languages were designed around paradigms, and paradigms were judged on elegance. If your choice of paradigm was lists, the result was LISP. If you decided on strings, you invented SNOBOL. Arrays, you wound up with APL. If your reasoning led you to logic, you ended up with Prolog. If you thought that descriptions of algorithms must be the basis of programming, then your fate was Algol. If assembly language stayed at the core of your world view, what you created would be an imperative language and might look like any of C, BASIC or FORTRAN. If stacks were your thing, you created a language that looked like Forth. And who knows if the belief in objects as the one paradigm will ever again see Orodruin and feel the heat of the flames in which it was forged. Some of these languages were totalitarian -- they insisted everything be shoehorned into their central concept. I recall some very unpleasant struggles with early dialects of SNOBOL, LISP and Prolog. Imperative languages tended to be more elastic. But, while the strictness of adherence varies, almost every language invented over the past 50 years bears the clear stamp of minimalism.

LALR parsing rode on the backs of yacc and the Portable C Compiler into total dominance over parsing mindshare. LALR came to define production-quality parsing. In 1978, there had been two C compilers, both at AT&T's Bell Labs -- Ritchie's original, and Johnson's Portable C Compiler, which was about to replace it. Neither was a commercial product -- AT&T was a regulated monopoly, banned from selling hardware or software. By 2006, C was the most important systems programming language in a world which had become far more dependent on computer systems. There were now many C compilers, several of them of great commercial importance. Arguably, one of these was the most visible production-quality C compiler. This no longer came from AT&T. The leader among C compilers in 2006 was GCC, the GNU foundation's flagship accomplishment. GCC can be said to have clearly taken this lead by 1994, when BSD UNIX had switched from the Portable C Compiler to GCC. This was a revolution of another kind, but as far as the parsing algorithm went, it was "Hail to the new boss... same as the old boss". GCC parsing was firmly in the LALR tradition.
But on 28 February 2006, the nearly 3 decades of LALR dominance were over. That's the date of the change log for GCC 4.1. It's a long document, and the relevant entry is deep inside it. It's one of the shorter change descriptions. In fact, it's exactly this one sentence: The old Bison-based C and Objective-C parser has been replaced by a new, faster hand-written recursive descent parser. For me, that's a little like reading page 62,518 of the Federal Register and spotting a one-line notice: Effective immediately, Manhattan belongs to the Indians.

The history is not being told in order. This is not because I am attempting to ape "Pulp Fiction". It's because the series was originally planned as perhaps two or three blog posts and has grown in scope. I'm stuck now with telling the tale using flashbacks and fast-forwards, but I'll make the process as easy as possible.

Historians can be divided into two types -- the Braudels and the Churchills. Fernand Braudel insisted on accumulation of detail, and avoided broad sweeping statements, especially those of the kind that are hard to back up with facts. Winston Churchill thought the important thing was the broad sense, and that historians should awaken the reader to what really mattered, not deaden him with forgettable detail.

But it is also true that some of the programming skills we've developed over the years could aptly be called symptoms.

Colorless green ideas sleep furiously

If you want to build a ship, don't drum up the men to gather wood, divide the work and give orders. Instead, teach them to yearn for the vast and endless sea.

If languages were free, this is the kind of perfection that we would seek -- languages precisely fitted to their domain, so that adding to them cannot make them better.

Ada's "Notes" were written 20 years after Mary Shelly, while visiting Ada's father in Switzerland, wrote the novel Frankenstein. For Ada's contemporaries, announcing that you planned to create a machine that composed music, or did advanced mathematical reasoning, was not very different from announcing that you planned to assemble a human being in your lab.

2006-10-31: The default prefix used to be "sqlite_". But then McAfee started using SQLite in their anti-virus product and it started putting files with the "sqlite" name in the c:/temp folder. This annoyed many Windows users. Those users would then do a Google search for "sqlite", find the telephone numbers of the developers and call to wake them up at night and complain. For this reason, the default name prefix is changed to be "sqlite" spelled backwards. So the temp files are still identified, but anybody smart enough to figure out the code is also likely smart enough to know that calling the developer will not help get rid of the file.

Newsgroups: comp.os.linux.advocacy,gnu.misc.discuss
John Dyson said:
> not most) users. I attain no high in passive smoking, and only feel
> the anxiety. The very sad thing is to avoid the anxiety, pot users
> often either seek more pot, alcohol or benzo's for temporary relief.
> This is a really silly vicious circle for a little short term
> relief. Those who think that pot is harmless are deluding themselves
> or even simply ignorant.
That's because you're not using the Scheme Configurable Window Manager. I found that having the proper WM while toking is extremely important. You need one that absolutely minimizes use of the rodent, and with scwm's synthetic events, awesome key binding support and scripting, I have a setup which allows me to do everything without pushing about the cursed rolly thing. Thank you Maciej for thinking about all of us stoners when designing SCWM.

It's like herding cats

Bikeshed

The alternatives are much worse. With informal and non-standardized error messaging you would have to know not only the programming language itself but its error messaging Pidgin as well. Every C++ practitioner can confirm that reading STL messages is an art by itself. It takes knowledge, patience, determination and, occasionally, a crystal ball to build a firm understanding of a reported problem.

one must understand that Unix commands are not a logical language. They are a natural language--in the sense that they developed by organic evolution, not "intelligent design".

Less traditional symbols are also useful. For example: ☎->✆(); // take my phone (☎) off hook (✆)

As many of you know, the rule with Unix was "you can touch anything, but if you change it, you own it." This was a great way to turn arguments into progress.

…I realised that I wanted to read about them what I myself knew. More than this—what only I knew. Deprived of this possibility, I decided to write about them. Hence this book.

URL shorteners may be one of the worst ideas, one of the most backward ideas, to come out of the last five years. In very recent times, per-site shorteners, where a website registers a smaller version of its hostname and provides a single small link for a more complicated piece of content within it… those are fine. But these general-purpose URL shorteners, with their shady or fragile setups and utter dependence upon them, well. If we lose TinyURL or bit.ly, millions of weblogs, essays, and non-archived tweets lose their meaning. Instantly. To someone in the future, it’ll be like everyone from a certain era of history, say ten years of the 18th century, started speaking in a one-time pad of cryptographic pass phrases.
But the biggest burden falls on the clicker, the person who follows the links. The extra layer of indirection slows down browsing with additional DNS lookups and server hits. A new and potentially unreliable middleman now sits between the link and its destination. And the long-term archivability of the hyperlink now depends on the health of a third party. The shortener may decide a link is a Terms Of Service violation and delete it. If the shortener accidentally erases a database, forgets to renew its domain, or just disappears, the link will break. If a top-level domain changes its policy on commercial use, the link will break. If the shortener gets hacked, every link becomes a potential phishing attack.

I think you’ll see what I mean if I teach you a few principles magicians employ when they want to alter your perceptions… Make the secret a lot more trouble than the trick seems worth. You will be fooled by a trick if it involves more time, money and practice than you (or any other sane onlooker) would be willing to invest.

it was the right thing to do. wish i had thought of it. i was too busy saving bytes.

you could have invented ...

Richard Stallman, famous for Emacs and founder of the GNU Project and the Free Software Foundation, presents a wonderful beard. Although Eric Raymond wrote The Art of Unix Programming, he has only really been famous for Fetchmail, hence his smaller mustache.

Shannon says this can't happen.

“My students, once they are filled up with new ecological knowledge and have developed an awareness of our situation, always say, ‘We have to tell people what’s happening in the world. If they only knew what they were doing, they would stop.’ But, it’s not true. We are all saturated with data. We do know what we are doing...The data may change our minds, but we need poetry to change our hearts”

Multicores r us

The Net interprets censorship as damage and routes around it

Syntactic sugar causes cancer of the semicolon.

If you know the way broadly, you will see it in everything

You're Holding It Wrong

The increasing population in the US, and the demands of Congress to ask more questions in each census, were making the processing of the data a longer and longer process. It was anticipated that the 1890 census data would not be processed before the 1900 census was due unless something was done to improve the processing methodology. Herman Hollerith won the competition for the delivery of data processing equipment to assist in the processing of the data from the 1890 US Census, and went on to assist in the census processing for many countries around the world. The company he founded, the Tabulating Machine Company, eventually became one of the three that composed the Computing-Tabulating-Recording (C-T-R) company in 1911, which was renamed IBM in 1924.

Unlike music, scientific work does not come with liner notes. Perhaps it should.

Still, I recall a brief exchange with a physicist who was sitting next to me listening to a panel discussion on Theoretical Computer Science and Physics (which took place in the early 1990s): They [the TCS guys] talk as if a bit is as fundamental as an electron - he told me with amazement. That's of course wrong, a bit is far more fundamental - I answered to his even greater amazement. Needless to say, he did not talk to me during the rest of the panel...
An electron is merely a specific model of a specific phenomenon. It is a very important model, but how can you compare its importance to the importance of the notion of a model? But, then, all models are built of binary attributes.
This reminds me of a drive from MIT to Providence that Shafi and I made in the mid-1980's. We stopped in a diner on the way, got some food and coffee, and then Shafi asked the attendant: Do you have the notion of a refill? The answer was: Yes, we do have refills, but what is a notion?
Indeed, people may use notions without having a notion of a notion, and likewise they may think of bits without a clear conceptualizing of the pure notion of a bit. In both cases, these notions exist before we conceptualize them; these notions are preconditions to any conceptualization.

He who understands Archimedes and Apollonius will admire less the achievements of the foremost men of later times.

Alright, so here we are, in front of the, er, elephants. And the cool thing about these guys is that they have really, really, really long trunks. And that’s cool.

Long ago, I observed that Edsger Dijkstra’s dining philosophers problem received much more attention than I thought it deserved. For example, it isn’t nearly as important as the self-stabilization problem, also introduced by Dijkstra, which received almost no notice for a decade. I realized that it was the story that went with it that made the problem so popular. I felt that the Byzantine generals problem (which was discovered by my colleagues at SRI International) was really important, and I wanted it to be noticed. So, I formulated it in terms of Byzantine generals, taking my inspiration from a related problem that I had heard described years earlier by Jim Gray as the Chinese generals problem.

On April 12, 1979, Kevin MacKenzie emails the MsgGroup a suggestion of adding some emotion back into the dry text medium of email, such as -) for indicating a sentence was tongue-in-cheek. Though flamed by many at the time, emoticons became widely used after Scott Fahlman suggested the use of :-) and :-( on a CMU BBS on 19 September 1982.

In Turkish there is a simpler solution to this: you just say, “kolay gelsin, Steve” (“may it come easy, Steve”).

enjoy the music and this life we have:)

I am regularly asked what the average Internet user can do to ensure his security. My first answer is usually 'Nothing; you're screwed'

(On Paul Vixie) Yeah, that unibrow is epic! Maybe that is why he is so good at networking. Even his brow is connected! hahahah.

boiling the power users out

Common Lisp macros are to C++ templates what poetry is to IRS tax forms

Hence the description of Windows 95 as "a 32-bit extension to a 16-bit patch to an 8 bit OS originally for a 4-bit chip written by a 2-bit company that doesn't care 1 bit about its users."

It began badly. We were walking along the South Downs Way in early summer, the sun glittering on the English Channel on our right, the Weald of Sussex stretching away to our left. “How big,” asked Arthur, “should a text editor be?”

Everyone knows how C programs look: tall and skinny. Whitney’s don’t.

Imagine if there were a tunnel which ran into your basement from the outside world, ending in a sturdy door with four or five high-security locks which anybody could approach completely anonymously. A mail slot in the door allows you to receive messages and news delivered through the tunnel, but isn't big enough to allow intruders to enter.
Now imagine that every time you go down into your basement, you find several hundred letters piled up in a snowdrift extending from the mail slot, and that to find the rare messages from your friends and family you had to sort through reams of pornography of the most disgusting kind, solicitations for criminal schemes, “human engineered” attempts to steal your identity and financial information, and the occasional rat, scorpion, or snake slipped through the slot to attack you if you're insufficiently wary. You don't allow your kids into the basement any more for fear of what they may see coming through the slot, and you're worried by the stories of people like yourself who've had their basements filled with sewage or concrete spewed through the mail slot by malicious “pranksters”.
Further, whenever you're in the basement you not only hear the incessant sound of unwanted letters and worse dropping through the mail slot, but every minute or so you hear somebody trying a key or pick in one of your locks. As a savvy basement tunnel owner, you make a point of regularly reading tunnel security news to learn of “exploits” which compromise the locks you're using so you can update your locks before miscreants can break in through the tunnel. You may consider it wise to install motion detectors in your basement so you're notified if an intruder does manage to defeat your locks and gain entry.
As the risks of basement tunnels make the news more and more often, industry and government begin to draw up plans to “do something” about them. A new “trusted door” scheme is proposed, which will replace the existing locks and mail slot with “inherently secure” versions which you're not allowed to open up and examine, whose master keys are guarded by commercial manufacturers and government agencies entirely deserving of your trust. You may choose to be patient, put up with the inconveniences and risks of your basement tunnel until you can install that trusted door. Or, you may simply decide that what comes through the tunnel isn't remotely worth the aggravation it creates and dynamite the whole thing, reclaiming your basement for yourself.

If there’s no such thing as anonymous data, does privacy just mean security?

non-fungible

verb 99 noun 62

O’Reilly meme-engineers a nice euphemism—“meme-engineering”—to describe what has previously been known as “propaganda.”

On the Semantic Web, it's too hard to prove you're not a dog.

kremvax

the hackish sense of humor transcends cultural barriers

which in those days were illegal and therefore had to be smuggled to other countries, an activity also known as "working ahead of the law"... ;-)

If your goal is to get an idea or approach widely used to the largest possible extent, a permissive license like the BSD (or MIT) license has much to offer. Anyone can quickly snap up the code and use it. Much of the TCP/IP code (at least for tools) in Windows was originally from BSD, I believe; there are even some copyright statements still in it. But don't expect the public code to be maintained by those who take and modify the code. I haven't noticed a large number of Microsoft developers being paid to improve any of the *BSDs, even though they share the same code ancestries. If your goal is to have a useful program that stays useful long-term, then a protective license like the LGPL or GPL has much to offer. They force the cooperation that is good for everyone in the long term, if a long-term useful project is the goal. In general, I've noticed that GPL projects are far less likely to fork than BSD-licensed projects; the GPL completely eliminates any financial advantage to forking.
One person who agrees with you that the GNU GPL is not the best choice for all circumstances: Richard Stallman. http://lwn.net/2001/0301/a/rms-ov-license.php3
The BSD development teams have been characterised by who has a 'commit bit', and who has the power to grant it. BSDs have forked not only because of commercial interference but because of petty tussles over privilege. [netbsd coup/cabal, etc] In one sense, there has only ever been one committer to the Linux kernel: Linus Torvalds himself is the only person who applies patches to his own tree and is the sole dispenser of 'holy penguin pee'. ... From the start, source code management in Linux was handled by the completely decentralised 'patch'; an inheritance from the Minix hotrodders' community who had no rights over the original source but could swap and improve one another's patches at will.
Actually the only place where I see a use of the BSD license is in creating things that are intended to become standards. For example, if the gecko (the HTML rendering) part of Mozilla had been BSD licensed, it would have allowed people at other companies to pick up the code into their browsers, and we would have seen the HTML dialect followed by gecko become a standard. This was a mistake made by Mozilla. The BSD license was actually very useful in the early days of Unix: the license made the Unix API into the standard that it is today. Similarly, the Internet Protocol also became a standard because of the BSD license.

I feel I should point out that both of the dominant mobile operating systems are Unix-based. The UI is necessarily new, but astonishingly the 50 year old basic abstractions are the same.
Except Unix is kind of hard to see. It wasn't just the hierarchical file system but the idea of composability. Even now we whip up shell "one-liners" to perform some task we just thought of. All that is lost, and not just on mobile devices. For example, try searching through email messages for something in an email "app". And there is no UI composability. We have to use extremely heavyweight IDEs such as Xcode, weighing in at 15GB (even "du -s /Application/X-code" takes tens of seconds!), to painstakingly construct a UI. We can't just whip up a dashboard to measure & display some realtime changing process/entity. There may be equally heavyweight third party tools, but there has been no Bell Labs-like research crew to distill it down to the essence of composable UI and ship it with every copy. The idea that users too can learn to "program" if given the right tools.

A megacorp is not your dream job. ... Megacorps are, in fact, in the minority. There are tens of thousands of other tech companies that could use your help. Tech workers are in high demand — you have choices! You will probably be much happier at a small to mid-size company. The “dream job” megacorps have sold you on is just good marketing.

There are some major problems on the internet which may seem intractable. How do we prevent centralization of our communication tools under the authority of a few, whose motivations may not align with our interests? How do we build internet-scale infrastructure without a megacorp-scale budget? Can we make our systems reliable and fault-tolerant — in the face of technical and social problems? Federation is an idea which takes a swing at all of these problems.

The web is dead, and its fetid corpse persists only as the layer of goop that Google scrapes between its servers and your screen. Anyone who still believes that Mozilla will save the web is a fool.

Embrace, extend, and extinguish

yak shaving

"contrib" directory

thou shalt not run a water pipe above your data center

In the Crypto Wars, arguments have occasionally been made that there is a constitutional right to cryptography. Most recently, Apple made that argument in trying to fend off the FBI's request to help break into an encrypted phone. It went roughly as follows: writing code is an expressive act, freedom of expression is protected by the First Amendment, and so they can't be forced to express themselves in ways they don't want. I don't think this argument would have fared well in the courts.
...
But there is an amendment in the Bill of Rights that really is a good match: the Second. The Second Amendment is about private possession of weapons of war; crypto has eminently been a weapon of war. It is about preserving the ability to resist tyranny; and if the government were to have the power to snoop on every communication in a modern computerized society, it would enable a level of tyranny so oppressive as to make every tyranny in history seem mild in comparison. It is also about self-defense against criminals; and there too the fit is good -- not just because cryptography is essential for protecting such things as online banking transactions (though it is); it also ties in to personal self-defense, as might be performed with a firearm.

unknown unknowns

ersatz

By studying the masters, not their pupils.

user diligence was built into the system.

Bruce Schneier knows Alice and Bob's shared secret.

Doug McIlroy can change file permissions using a magnet and a pin. Doug McIlroy can read data from /dev/null. Doug McIlroy can handle SIGKILL.

textual markup should describe logical structure instead of physical appearance, and textual markup should be unambiguous and easily understood by either a program or a person

After taking a computer course at Harvard in 1960 Ted Nelson began a mystical journey. He started exploring the possibility of liberating text from paper, of developing a means whereby writers could harness text in a manner closer to human cognitive patterns: i.e., the way words flowed through our minds. In 1965 Nelson coined the term hypertext.

Go's heritage is at least as much Oberon as it is C! In 1960, language experts from America and Europe teamed up to create Algol 60. In 1970, the Algol tree split into the C and the Pascal branch. ~40 years later, the two branches join again in Go.

values of β will give rise to dom!

The result of the NSA query was that Bob and I--the arrangements were made by him--received a visit from a man whom Bob called "a retired gentleman from Virginia."

“When will the world be ready to receive its saints?” I think we know the answer — when they are dead, pasteurized and homogenized and simplified into stereotypes, and the true depth and integrity of their ideas and initiatives are forgotten.

there should be no walls between areas of thought

it took Ash Ketchum like 10 years to become a Pokémon master. You'll get there

the thing that bothered me most about vi and emacs was that they gave you a two-dimensional display of your file but you had only a one-dimensional input device to talk to them. It was like giving directions with a map on the table, but being forced to say "up a little, right, no back down, right there, yes turn there that's the spot" instead of just putting your finger on the map.

children must learn these things for themselves

Skunkworks project

When other people remove branches, they won’t be removed from your local copy of the repository. To take care of this housekeeping, you need to express a fruit preference: git remote prune origin. Again, don’t ask. I don’t know why. That’s just how it is.

If computers are the wave of the future, displays are the surfboards

therefore algebra is run on a machine (the universe) which is twos-complement.

The Smalltalk language is object oriented rather than function oriented, and this often confuses people with previous experience in computer science. For example, to evaluate <someobject>+4 means to present +4 as a message to the object. The fundamental difference is that the object is in control, not +. If <someobject> is the integer 3, then the result will be the integer 7. However, if <someobject> were the string 'Meta', the result might be Meta4. In this way, meaning rides with the objects of the system, and code remains an abstract form, merely directing the flow of communication.
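
A rough Python analogue of that dispatch (my own sketch, not Smalltalk): evaluating receiver + 4 hands control to the receiver, which decides for itself what '+' means.

    class Meta:
        """A toy string-like object that gives '+' its own meaning."""
        def __init__(self, text):
            self.text = text

        def __add__(self, other):      # the receiver, not '+', is in control
            return Meta(self.text + str(other))

        def __repr__(self):
            return self.text

    print(3 + 4)              # the integer 3 receives '+ 4' and answers 7
    print(Meta("Meta") + 4)   # a string-like receiver answers Meta4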

This largely comes down to whether your servers are pets or livestock. Pets get individual names. They're distinct from each other, and we care about those differences. When one gets sick, we usually try to nurse it back to health. Traditionally, servers have been pets. Livestock get numbers. They're mostly identical, and what differences there are, we don't care about and usually try to minimize. When one gets sick, we put it down and get another one. Fully virtualized servers, especially IaaS servers such as AWS, are livestock.

protocols, not platforms

"SHARE. It's not an acronym. It's what we do."

It was a fundamentally different experience. It showed that the desire to share software was alive and well, but DECUS tapes were full of dead offerings. You could take them or leave them, but there was no overall effort to integrate or improve that code or to make a coherent offering out of it. I know people used that code but nobody ever sent me an improvement to it. It was an ornament I could hang on DEC's tree.

samizdat

you can't do much carpentry with your bare hands, and you can't do much thinking with your bare brain

Religion is one of the forms of spiritual oppression which everywhere weighs down heavily upon the masses of the people, overburdened by their perpetual work for others, by want and isolation. Impotence of the exploited classes in their struggle against the exploiters just as inevitably gives rise to the belief in a better life after death as impotence of the savage in his battle with nature gives rise to belief in gods, devils, miracles, and the like. Those who toil and live in want all their lives are taught by religion to be submissive and patient while here on earth, and to take comfort in the hope of a heavenly reward. But those who live by the labour of others are taught by religion to practise charity while on earth, thus offering them a very cheap way of justifying their entire existence as exploiters and selling them at a moderate price tickets to well-being in heaven. Religion is opium for the people. Religion is a sort of spiritual booze, in which the slaves of capital drown their human image, their demand for a life more or less worthy of man.

competence without comprehension

how come vs what for

you have to have enough background knowledge to penetrate jargon

individuals do matter; the past is less determinate than historians like to make, and the future is less open ended than you would like to believe

the assumption that what currently exists must necessarily exist is the acid that corrodes all visionary thinking

Commands need not be on different lines; instead they may be separated by semicolons.
Processes which are never waited for die unnoticed and presumably unmourned

[Algol 60] was more racehorse than camel . . . . a rounded work of art . . . . Rarely has a construction so useful and elegant emerged as the output of a committee of 13 meeting for about 8 days . . . . Algol deserves our affection and appreciation.

On the other hand, in private, we have been justifiably proud of our willingness to explore weird ideas, because pursuing them is the only way to make progress. Unfortunately, the necessity for speculation has combined with the culture of the hacker in computer science to cripple our self-discipline. In a young field, self-discipline is not necessarily a virtue, but we are not getting any younger. In the past few years, our tolerance of sloppy thinking has led us to repeat many mistakes over and over.

talk much but say little

Reading makes a full Man, Meditation a profound Man, discourse a clear Man.

Not invented here (NIH) syndrome

"Computers are mostly used against people instead of for people; used to control people instead of to free them; Time to change all that - we need a... Peoples Computer Company."

price of anarchy

the only constant in life is change

The People's Library (OWS Library)

US intelligence (sic) community

As the current situation in Afghanistan demonstrates, however, a Potemkin village will only stay up as long as everyone involved continues to play along.

Understanding the true significance of events is, at least in some sense, a task best left to historians. Even the fall of the Roman empire can appear as something akin to the normal state of things for the people living through it; the true historical significance of something is generally only clear well after the fact, and every new generation has its own notion of the true meaning of history.

Such a focus on individual responsibility, necessary as it is, functions as ideology the moment it serves to obfuscate the big question of how to change our entire economic and social system.

I drifted into indiscipline and intellectual adventure that eventually became complete confusion.

Gary Snyder defined wild as "whose order has grown from within and is maintained by the force of consensus and custom rather than explicit legislation". "The wild is not brute savagery, but a healthy balance, a self-regulating system." Snyder attributed wild to Buddhism and Daoism, the interests of some Beats. "Snyder's synthesis uses Buddhist thought to encourage American social activism, relying on both the concept of impermanence and the classically American imperative toward freedom."

Reporters are not generally well versed in artistic movements, or the history of literature or art. And most are certain that their readers, or viewers, are of limited intellectual ability and must have things explained simply, in any case. Thus, the reporters in the media tried to relate something that was new to already preexisting frameworks and images that were only vaguely appropriate in their efforts to explain and simplify. With a variety of oversimplified and conventional formulas at their disposal, they fell back on the nearest stereotypical approximation of what the phenomenon resembled, as they saw it. And even worse, they did not see it clearly and completely at that. They got a quotation here and a photograph there—and it was their job to wrap it up in a comprehensible package—and if it seemed to violate the prevailing mandatory conformist doctrine, they would also be obliged to give it a negative spin as well. And in this, they were aided and abetted by the Poetic Establishment of the day. Thus, what came out in the media, from newspapers, magazines, TV, and the movies, was a product of the stereotypes of the 30s and 40s—though garbled—of a cross between a 1920s Greenwich Village bohemian artist and a Bop musician, whose visual image was completed by mixing in Daliesque paintings, a beret, a Vandyck beard, a turtleneck sweater, a pair of sandals, and a set of bongo drums. A few authentic elements were added to the collective image: poets reading their poems, for example, but even this was made unintelligible by making all of the poets speak in some kind of phony Bop idiom. The consequence is that even though we may know now that these images do not accurately reflect the reality of the Beat movement, we still subconsciously look for them when we look back to the 50s. We have not even yet completely escaped the visual imagery that has been so insistently forced upon us.

The Beats destroyed the distinction between life and literature... They made efforts to destroy the wall between art and real life, so that art would become a living experience in cafes or jazz clubs, and not remain the prerogative of galleries and museums

This approach has generated a common jibe that New Age represents "supermarket spirituality". York suggested that this eclecticism stemmed from the New Age's origins within late modern capitalism, with New Agers subscribing to a belief in a free market of spiritual ideas as a parallel to a free market in economics.

When Darwin's theory was published, several years later, church hierarchs agreed that it did not contradict Christian teaching, regardless of whether it was true or not. And Darwin himself did not present his theory as refuting any religious dogma. Immediately after publication, the theory of evolution was generally accepted by society and did not cause a split between religion and science. The war began only 60 years after the publication of Darwin's work, and the reasons for it were political rather than religious or scientific. In the United States, religious fundamentalists lobbied for a ban on teaching evolution in schools. But why? The fact is that at that time the theory was used to justify the idea of social Darwinism: the strong have the right to oppress the weak. This idea was unacceptable to Christians and especially to Christian socialists. However, their opponents argued that in matters of morality, one should rely on science, not religion. And if science tells us that the weak must perish in the process of natural selection, then so be it. Because of such a stupid interpretation of Darwin's ideas, a protracted conflict between the scientific and religious worldviews began, which now continues to gain momentum. Creationism emerged, which, interpreting the Bible literally, denies not only a hundred years of scientific research, but also two thousand years of Christian theology. On the other hand, ultra-Darwinism has emerged, which tries to apply the theory of evolution to explain everything. Both sides are wrong, the professor says. An adequate understanding of both faith and science does not lead to contradictions. "Since when does being a Christian mean believing that dinosaurs lived alongside humans?" Connor wonders. Creationism is not an heir to the Christian tradition, but a dead-end branch of the evolution of human thought.

How was it, they argued, that the great mass of people could be sucked into complicity with their own exploitation? With the emergence of fascism in the 1920s and 30s the question became even more urgent. What led educated people to throw their lot in with the barbarism of fascism? This, for them, was the ultimate in false consciousness.

history is just one fucking thing after another

it is not possible to live a true life in a false system

It arrives at a pessimistic view of what can be done against a false system which, through the "culture industry", constantly creates a false consciousness about the world around us based on myths and distortions deliberately spread in order to benefit the ruling class. This is, of course, not peculiar to capitalism, but in capitalism it finds its full commodified form so that we become the willing consumers and reproducers of our own alienation by becoming consumers rather than producers of culture.

This convergence had done away with the worst excesses of class exploitation and replaced it with a sort of social complicity between the classes... Fascism is successful because it permits and encourages our deepest desires to find the culprit for our own complicity

Marxism is well known for its two components: a “cold stream” which concerns objective scientific analysis; and a “warm stream” that concerns enthusiasm and hope and leads to commitment to Marxism as a cause.

Another damned, thick, square book!

There is a new profession of trail blazers, those who find delight in the task of establishing useful trails through the enormous mass of the common record.

The real heart of the matter of selection, however, goes deeper than a lag in the adoption of mechanisms by libraries, or a lack of development of devices for their use. Our ineptitude in getting at the record is largely caused by the artificiality of systems of indexing. When data of any sort are placed in storage, they are filed alphabetically or numerically, and information is found (when it is) by tracing it down from subclass to subclass. It can be in only one place, unless duplicates are used; one has to have rules as to which path will locate it, and the rules are cumbersome. Having found one item, moreover, one has to emerge from the system and re-enter on a new path. The human mind does not work that way. It operates by association. With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain. It has other characteristics, of course; trails that are not frequently followed are prone to fade, items are not fully permanent, memory is transitory. Yet the speed of action, the intricacy of trails, the detail of mental pictures, is awe-inspiring beyond all else in nature.
... This is the essential feature of the memex. The process of tying two items together is the important thing. When the user is building a trail, he names it, inserts the name in his code book, and taps it out on his keyboard. Before him are the two items to be joined, projected onto adjacent viewing positions. The user taps a single key, and the items are permanently joined. Thereafter, at any time, when one of these items is in view, the other can be instantly recalled merely by tapping a button below the corresponding code space. Moreover, when numerous items have been thus joined together to form a trail, they can be reviewed in turn, rapidly or slowly, by deflecting a lever like that used for turning the pages of a book. It is exactly as though the physical items had been gathered together from widely separated sources and bound together to form a new book. It is more than this, for any item can be joined into numerous trails.
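
A tiny Python sketch of a trail (my own reading of the passage, not Bush's design): named sequences of joined items, so any one item can lie on many trails and each trail can be replayed in order.

    from collections import defaultdict

    trails = defaultdict(list)   # the "code book": trail name -> ordered items

    def join(trail, item_a, item_b):
        """Tie two items together on a named trail."""
        for item in (item_a, item_b):
            if item not in trails[trail]:
                trails[trail].append(item)

    join("analytical engines", "Babbage's drawings", "Ada's 'Notes'")
    join("analytical engines", "Ada's 'Notes'", "Turing's 1936 paper")
    join("machine intelligence", "Turing's 1936 paper", "the Dartmouth proposal")

    # Review a trail "in turn, rapidly or slowly"; one item sits on two trails.
    for item in trails["analytical engines"]:
        print(item)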

Vannevar Bush's "library of a million volumes, compressed into one end of a desk" may sound quaint to us today. Bush naively assumed that immediate access to a million volumes would require the physical presence of those million volumes. His proposal -- a million volumes in every desk. The web, of course, took a different approach. A million volumes, yes, but our desks remain empty. Instead, when we summon a volume, we are granted a transient and ephemeral peek at its sole instance, out there somewhere in the world, typically secured within a large institution. Two thoughts: It's interesting that life itself chose Bush's approach. Every cell of every organism has a full copy of the genome. That works pretty well -- DNA gets damaged, cells die, organisms die, the genome lives on. It's been working pretty well for about 4 billion years. It's also interesting to consider how someone from Bush's time might view our situation. For someone who's thinking about a library in every desk, going on the web today might feel like visiting the Library of Alexandria. Things didn't work out so well with the Library of Alexandria. It's not working so well today either. We, as a species, are currently putting together a universal repository of knowledge and ideas, unprecedented in scope and scale. Which information-handling technology should we model it on? The one that's worked for 4 billion years and is responsible for our existence? Or the one that's led to the greatest intellectual tragedies in history?

the public space

I've come to believe that a lot of what's wrong with the Internet has to do with memory. The Internet somehow contrives to remember too much and too little at the same time, and it maps poorly on our concepts of how memory should work.
In our elementary schools in America, if we did something particularly heinous, they had a special way of threatening you. They would say: "This is going on your permanent record".
It was pretty scary. I had never seen a permanent record, but I knew exactly what it must look like. It was bright red, thick, tied with twine. Full of official stamps.
The permanent record would follow you through life, and whenever you changed schools, or looked for a job or moved to a new house, people would see the shameful things you had done in fifth grade.
How wonderful it felt when I first realized the permanent record didn't exist. They were bluffing! Nothing I did was going to matter! We were free!
And then when I grew up, I helped build it for real.

Although it's been discredited by some, I'm still a believer in "stop and fsck" policing of disk drives.

I do not think that I would be able to build STL without her help. (After all, STL stands for Stepanov and Lee...)

cal 9 1752
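
If memory serves, on most Unix systems that command prints the month Britain and its colonies jumped from the Julian to the Gregorian calendar, with eleven days simply missing (exact layout varies by cal implementation):

       September 1752
    Su Mo Tu We Th Fr Sa
           1  2 14 15 16
    17 18 19 20 21 22 23
    24 25 26 27 28 29 30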

The net is precious to me because it gives ordinary human beings a way to communicate with other ordinary human beings. Corporations have too many ways to cram their ads down my throat. Human beings have the net.

The CCRU does not, has not and will never exist.

Accelerationism is simply the self-awareness of capitalism, which has scarcely begun.

The MIT guy then muttered that sometimes it takes a tough man to make a tender chicken, but the New Jersey guy didn't understand (I'm not sure I do either).

These applications (“appliances” is a better word) come equipped with a fixed vocabulary of actions, speak no common language, and cannot be extended, composed, or combined with other applications except with enormous friction. By analogy, what we have is a railway system where the tracks in each region are of differing widths, forcing trains and their cargo to be totally disassembled and then reassembled to transport anything across the country. As ridiculous as this sounds, this is roughly what we do at application boundaries: write explicit serialization and parsing code and lots of tedious (not to mention inefficient) code to deconstruct and reconstruct application data and functions.
This essay is a call to cast aside the broken machine metaphor and ultimately end the tyranny of applications. Applications can and ultimately should be replaced by programming environments, explicitly recognized as such, in which the user interactively creates, executes, inspects and composes programs. In this model, interaction with the computer is fundamentally an act of creation, the creative act of programming, of assembling language to express ideas, access information, and automate tasks. And software presents an opportunity to help humanity harness and channel “our vast imaginations, humming away, charged with creative energy”.
...
We artificially limit the potential of this incredible technology by reserving a tiny, select group of people (programmers) to use its power to build applications with largely fixed sets of actions (and we now put these machines on the internet too and call them “web applications”), and whose behaviors are not composable with other programs. Software let us escape the tyranny of the machine, yet we keep using it to build more prisons for our data and functionality!

I wouldn’t say ‘perfectly happy’, I’d say that users are resigned to the notion that applications are machines with a fixed set of actions, and any limitations of these machines must simply be worked around via whatever tedium is required.

The step after ubiquity is invisibility.

This is not just about providing children with the intellectual breadth to get into Oxbridge - it's about cultural entitlement. We're always hearing complaints about how the subsidised theatre and opera are too elitist, but no one ever asks the question about how best to educate people to enjoy them. It's not the job of the arts to dumb down; it's the job of the state to create an intellectually appreciative audience. This won't happen if learning in schools is reduced to a narrow range of disconnected facts, and children will be deprived of their cultural right.

Newcomers to the Internet are often startled to discover themselves not so much in some soulless colony of technocrats as in a kind of cultural Brigadoon - a flowering remnant of the '60s, when hippie communalism and libertarian politics formed the roots of the modern cyberrevolution...
Authors Stewart Brand and John Markoff argue that the development and popularization of personal computers and the Internet find one of their primary roots in the anti-authoritarian ethos promoted by hippie culture.

The '60s were a leap in human consciousness. Mahatma Gandhi, Malcolm X, Martin Luther King, Che Guevara, they led a revolution of conscience. The Beatles, The Doors, Jimi Hendrix created revolution and evolution themes. The music was like Dalí, with many colors and revolutionary ways. The youth of today must go there to find themselves.

The peculiarity of decision-making in computing systems is illustrated by some amusing data in More Programming Pearls by Jon Bentley (pp. 157-158). Imagine yourself in 1969, and you need to solve the 1024x1024x1024 Poisson's equation (a particular 3-dimensional system of elliptic partial differential equations). The best numerical method that you can find is called SOR Iteration, and the fastest computer is the CDC 7600 at 5 megaflops. Obviously, your best course is to run SOR on the 7600, right? Well, you will still be less than half done in 1976. I will wait until 1976, use a Cray-1 at 50 megaflops and the Cyclic Reduction method, solving the problem in a matter of hours and winning the race by a long margin.
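
The arithmetic behind the anecdote, as a back-of-envelope Python sketch (my own numbers where the quote is silent, e.g. assuming a ten-hour Cray-1 run): the tenfold faster machine buys one order of magnitude; the rest has to come from the algorithm.

    # "Less than half done in 1976" means SOR needs over 14 years at 5 Mflops.
    sor_seconds = 14 * 365 * 24 * 3600        # ~4.4e8 s, a lower bound

    # Same SOR method on the 50 Mflop Cray-1: only a factor of 10 better.
    print(sor_seconds / 10 / 86400)           # still ~511 days

    # Finishing in an assumed 10 hours instead requires the better algorithm:
    print(sor_seconds / 10 / (10 * 3600))     # cyclic reduction must cut the
                                              # work by a further ~1200x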

Marx's understanding of religion, summed up in a passage from the preface to his 1843 Contribution to the Critique of Hegel's Philosophy of Right:
Religious suffering is, at one and the same time, the expression of real suffering and a protest against real suffering. Religion is the sigh of the oppressed creature, the heart of a heartless world, and the soul of soulless conditions. It is the opium of the people. The abolition of religion as the illusory happiness of the people is the demand for their real happiness. To call on them to give up their illusions about their condition is to call on them to give up a condition that requires illusions.

So much for the manipulation of ideas and their insertion into the record. Thus far we seem to be worse off than before—for we can enormously extend the record; yet even in its present bulk we can hardly consult it. This is a much larger matter than merely the extraction of data for the purposes of scientific research; it involves the entire process by which man profits by his inheritance of acquired knowledge.

In the world of the man in the gray flannel suit, people were starting to look for ways out.

The transformation of labor into pleasure is the central idea in Fourier's giant socialist utopia

Language standards committees tend to be like a pack of dogs contemplating a tree. Each dog isn't satisfied with the tree until he's peed on it.

The metric of merit is wrong.

Be courageous. Try different things; experiment. Try to give a cool demo.

Despite pulling other ideas from Plan 9, the importance of having an opinion on the distributed nature of modern computing seems to have been missed by prominent operating systems today. As a result, their development has been relegated to what they do: be a platform for things that actually provide an abstraction for my resources.

Distributed systems that work well on football-field-sized data centers may not work that well when you only have a few racks in a colo facility. The "I forgot how to count that low" challenge is a real one...

Software folks need to be prepared for the fact that there may actually be nobody who knows how portions of the hardware actually work because it's not all designed at one place anymore. We have to be prepared for the SoC version of node.js.

I conclude that there are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies and the other way is to make it so complicated that there are no obvious deficiencies.
The first method is far more difficult. It demands the same skill, devotion, insight, and even inspiration as the discovery of the simple physical laws which underlie the complex phenomena of nature. It also requires a willingness to accept objectives which are limited by physical, logical, and technological constraints, and to accept a compromise when conflicting objectives cannot be met. No committee will ever do this until it is too late.

At first I hoped that such a technically unsound project would collapse but I soon realized it was doomed to success. Almost anything in software can be implemented, sold, and even used given enough determination. There is nothing a mere scientist can say that will stand against the flood of a hundred million dollars. But there is one quality that cannot be purchased in this way--and that is reliability. The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay.

Among the structuring methods for computer programs, three basic constructs have received widespread recognition and use: A repetitive construct (e.g. the while loop), an alternative construct (e.g. the conditional if..then..else), and normal sequential program composition (often denoted by a semicolon). Less agreement has been reached about the design of other important program structures, and many suggestions have been made: Subroutines (Fortran), procedures (Algol 60), entries (PL/I), coroutines (UNIX), classes (SIMULA67), processes and monitors (Concurrent Pascal), clusters (CLU), forms (ALPHARD), actors (Hewitt).

(Computer scientist to a professor of linguistics): What can you tell me about the engineering efficiency of language? ... The thing I want to know is: how do I write an efficient language for communicating to a machine, with enough redundancy? Why do we have the number of synonyms we do? Why do we have antonyms? Why do we have irregular and regular verbs? Why? What are the values?

There are two languages: you to the machine, and the machine back to you; they need not be the same language. You want a terse one in, and you're willing to put up with a rather more verbose one coming out.

“safety” is the fire flower. But “fearless concurrency” is Fire Mario.

I was like “How long do you think it will be before DTrace shows up in Linux?” and we had guesses, and I think the quickest guess was two months, some Danish master's student is going to get it working, and I guess that Danish master's student is a metaphor for someone who is bright but underemployed, I don’t know, not to offend the Danes.

The difference between Lisp and Java, as Paul Graham has pointed out, is that Lisp is for working with computational ideas and expression, whereas Java is for expressing completed programs. As James says, Java requires you to pin down decisions early on. And once pinned down, the system which is the set of type declarations, the compiler, and the runtime system make it as hard as it can for you to change those assumptions, on the assumption that all such changes are mistakes you're inadvertently making.
There are, of course, many situations when making change more difficult is the best thing to do: Once a program is perfected, for example, or when it is put into light-maintenance mode. But when we are exploring what to create given a trigger or other impetus—when we are in flow—we need to change things frequently, even while we want the system to be robust in the face of such changes.
[...]
The screwed-up way we approach software development is because of what we have done with programming languages. With some exceptions, we have opted for optimizing the description of programs for compilers, computers, .... Interestingly, the results of this optimization are well-described by "Premature optimization is the root of all evil in programming.". [Knuth] was referring to the practice of worrying about performance of an algorithm before worrying about correctness, but the dictum can be taken to refer to any design problem where optimization is an eventual concern. In this case, the design problem was to design a usable programming medium that excels at enabling developers and designers to explore and discover, and to continue to enable discovery and exploration once well into the perfecting stage. Instead of waiting until we understood the ramifications of large system design and implementation using computer programming media, we decided to prematurely optimize for performance and optimization. And we got program description (or programming) languages instead—the root of all evil.

a poem is never finished, only abandoned.

Perhaps it's the coincidence of the same mind apprehending different things, but I used to describe AI as "trying to program what cannot be programmed," and I currently define poetry as "trying to say what cannot be said."

In general, an implementation must be conservative in its sending behavior, and liberal in its receiving behavior.
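
That's Postel's robustness principle, from the early Internet RFCs. A toy sketch of what it looks like in practice, in Python, with names that are entirely my own invention: tolerate sloppy input, emit exactly one canonical form.

    # Hypothetical header-line handling, only to illustrate the principle.
    # Liberal receiver: accept CRLF or bare LF, stray spaces, mixed-case keys.
    def parse_header_line(raw: bytes) -> tuple[str, str]:
        text = raw.decode("ascii", errors="replace").rstrip("\r\n")
        key, _, value = text.partition(":")
        return key.strip().lower(), value.strip()

    # Conservative sender: one canonical form goes out (CRLF, no padding).
    def emit_header_line(key: str, value: str) -> bytes:
        return f"{key.strip().lower()}: {value.strip()}\r\n".encode("ascii")

    assert parse_header_line(b"Content-Length :  42 \r\n") == ("content-length", "42")
    assert emit_header_line("Content-Length", "42") == b"content-length: 42\r\n"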

They know enough who know how to learn

That language is an instrument of human reason, and not merely a medium for the expression of thought, is a truth generally admitted.

An apocryphal example is "the spirit is willing but the flesh is weak." Translated back and forth with Russian, it became "the vodka is good but the meat is rotten."

studying the structure of information and the structure of problem solving processes independently of applications and independently of its realization in animals or humans.

Chess is the Drosophila of AI.

Your proposal is like a contest to see who can name the largest number with me going first.

“In the air-ship —” He broke off, and she fancied that he looked sad. She could not be sure, for the Machine did not transmit nuances of expression. It only gave a general idea of people — an idea that was good enough for all practical purposes.

I had never managed a large group before and I was clearly in over my head. Richard volunteered to help out. "We've got to get these guys organized," he told me. "Let me tell you how we did it at Los Alamos."
Every great man that I have known has had a certain time and place in their life that they use as a reference point; a time when things worked as they were supposed to and great things were accomplished. For Richard, that time was at Los Alamos during the Manhattan Project. Whenever things got "cockeyed," Richard would look back and try to understand how now was different than then.

Newtonian physics is an approximation to Einsteinian physics (general relativity). Classical physics is an approximation to quantum mechanics. Classical information is an approximation to quantum information. In each case, the approximation excludes important details but serves well for many purposes. In each case, removing the approximation requires deeper understanding and harder math, but results in a truer picture of Nature and may enable new technologies. Yes, Nature: we’re beginning to understand that information is a physical concept.

But when you understand how your tools work, then you can reason in terms of what would be the ideal solution if ideal tools existed. Software becomes clay instead of LEGOs.

Do you think less is more, or less is less?

Performing and listening to a gradual musical process resembles: pulling back a swing, releasing it, and observing it gradually come to rest; turning over an hour glass and watching the sand slowly run through the bottom; placing your feet in the sand by the ocean's edge and watching, feeling, and listening to the waves gradually bury them.

in order to sell printers, they threw away the universe

If bbb is huge, one might run out of room, but with today's 'light my cigar with disk blocks' life, not a problem

I definitely experienced some cognitive dissonance watching an 82-year-old flying around vim editing PostScript. It was a trip and it was really inspiring to see someone his age still hacking.

When I asked my Magic-8 ball about which email client to use, it said: "Outlook not so good."

everything is deeply intertwingled.

All Models Are Wrong Some Models Are Useful

what's upvar?

But, here we come to something even more egregious. I just defined equality. It’s utterly horrible that I have to now define inequality because what’s the meaning of inequality? Not equality. Could there be any other meaning?
In 1994 I proposed such a thing to the standard committee. I even proposed a bunch of templates which would automatically do it. They threw them out because there were people who said, “but we want to have the freedom to make the glyph != do something else.” I literally have no words because that is not freedom. It’s like saying I want to have the freedom to run on the street with no pants! The semantics must be fixed; you have no right to define an inequality which will do a semantically different thing.
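
He eventually got his way, more or less: C++20 rewrites a != b as !(a == b) when no operator!= is declared, and Python 3 derives != from == by default. The Python version makes the point in a few lines (the class is just an example of mine):

    # Defining __eq__ is enough in Python 3: != is automatically its
    # negation unless you go out of your way to override it -- exactly
    # the fixed semantics Stepanov argued for.
    class Meter:
        def __init__(self, n: int):
            self.n = n
        def __eq__(self, other):
            return isinstance(other, Meter) and self.n == other.n

    assert Meter(3) == Meter(3)
    assert Meter(3) != Meter(4)   # no __ne__ in sight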

Go directly to Twitter, do not pass Google, do not collect 200 cookies

Since I am not an actual formal standards body, I don't have a nice pronounceable acronym that one could prepend to "-compliant" to form an adjective to describe a conforming implementation. In lieu of this, then, I hereby suggest that a Brainfuck implementation that is compliant with all of the constraints listed here should be described as "nice".

One in a million is next Tuesday

In a nutshell, since any software that processes inputs is, in fact, an interpreter, and any inputs are thus its programs, input validation is not essentially different from program verification. To be trustworthy, input validation must therefore be grounded in models that describe precisely what the properties of valid inputs are. Luckily, such models exist, and can be made accessible to regular developers.
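
That's the language-theoretic security ("langsec") position in a nutshell: write down the grammar of valid inputs, recognize it in full, and only then compute on the data. A toy sketch under a made-up NAME=AGE record format:

    import re

    # The "model" is an explicit, anchored grammar for valid records.
    # A regular expression suffices because this invented language is regular.
    RECORD = re.compile(r"\A[A-Za-z]{1,32}=(0|[1-9][0-9]{0,2})\Z")

    def parse_record(line: str) -> tuple[str, int]:
        if RECORD.match(line) is None:
            raise ValueError(f"rejected before any processing: {line!r}")
        name, age = line.split("=")
        return name, int(age)

    print(parse_record("ada=36"))       # ('ada', 36)
    # parse_record("ada=36; rm -rf /")  # ValueError: rejected before any processing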

"The Internet has created the most precise mirror of people as a whole that we've yet had. It is not a summary prepared by a social scientist or an elite think tank. It is not the hagiography of an era, condensed by a romantic idealist or a sneering cynic. It is the real us, available for direct inspection for the first time. Our collective window shades are now open. We see the mundanity, the avarice, the ugliness, the perversity, the loneliness, the love, the inspiration, the serendipity, and the tenderness that manifest in humanity. Seen in proportion, we can breathe a sigh of relief. We are basically OK."

People degrade themselves in order to make machines seem smart all the time. We ask teachers to teach to standardized tests so a student will look good to an algorithm. We have repeatedly demonstrated our species’ bottomless ability to lower our standards to make information technology look good.

I do not want to embarrass the company by saying its name so I will just use the initials... IBM

SEXI Farber

You can’t swing a cat in contemporary metaphysics these days without hitting a discussion involving possible worlds

Impossible n'est pas français ("Impossible is not French" -- attributed to Napoleon)

This page looks cluttered in Any Browser

normalization of deviance

almost nothing is mutually exclusive

a hacker is just someone who doesn't let cultural norms get in their way

Clever though those anecdotes may be, it was much easier to become root. Sometime around 1981, I was visiting USG and they were in a bit of a panic, checking that their system was intact. Why? Because early that morning, there was a phone call to the machine room:
"Hi, this is Ken. What's the root password?"
The call was successful.
Any sysadmin worth his paycheck would have known that Ken isn't awake in the mornings and could have blocked this interloper. But...

But where is everybody?

What we wanted to preserve was not just a good environment in which to do programming, but a system around which a fellowship could form. We knew from experience that the essence of communal computing [...] is not just to type programs into a terminal instead of a keypunch, but to encourage close communication.

en charette (French studio slang, "on the cart": working flat out right up to the deadline)

... bad notations can stifle progress. Roman numerals hobbled mathematics for a millennium but were propagated by custom and by natural deference to authority. Today we no longer meekly accept individual authority. Instead, we have "standards," impersonal imprimaturs on convention.

What is the major problem in computer science? Is there one? Can there ever be any? In thinking about it, I have come to the conclusion that if there is any major problem, it is the problem of bridging the gap between neuroscience and psychology. If one looks at neuroscientists, one finds that they work very deep inside the nervous system at an extremely primitive level; they work on neurons. It is as though computer scientists spent all their time looking at gates. On the other end of the spectrum is the psychologist, who sees man as an entity, as a totality, trying to understand what makes him tick. The psychologist's weapons are gross and very, very macroscopic. How can we create a bridge between neuroscience and psychology? My conclusion is that computer science is the discipline that will provide this bridge, and the bridge will come from the study of software. There is a major problem worth sinking your teeth into. Software--collections of symbols that execute processes--is our external way of trying to model the way thinking goes on. Of course, we are very primitive. Nevertheless, it seems to us that is the clue: computer science--if any discipline will--will ultimately provide us the bridge that tells us how the neurons, working as gates, pass signals by the billions in such a way that our brain causes us to behave in a way that psychologists can say, "Ah! I understand."

A life blighted by the trisection

99k2vp8opq

Abigail's length horror

When examined, answer with questions

More people have been to Russia than I have

Deities must be invoked directly and not via Phoenix MVS

Today man has developed extensions for practically everything he used to do with his body. The evolution of weapons begins with the teeth and the fist and ends with the atom bomb. Clothes and houses are extensions of man's biological temperature-control mechanisms. Furniture takes the place of squatting and sitting on the ground. Power tools, glasses, TV, telephones, and books which carry the voice across both time and space are examples of material extensions. Money is a way of extending and storing labor. Our transportation networks now do what we used to do with our feet and backs. In fact, all man-made material things can be treated as extensions of what man once did with his body or some specialized part of his body.

we suppress an organ in the living subject, by a section or ablation; and from the disturbance produced in the whole organism or in a special function, we deduce the function of the missing organ.

There are several ways to understand the difference between scientists and writers. One can contrast their methods and stage the difference as an opposition between explanation and interpretation. In this respect, literature is part of a general field devoted to understanding, as opposed to an explanation through causes and effects: the scientists are discovering laws where the writers are creating meaning. One can contrast their objects of study. This is a more romantic version of the divide: the principles of nature against the torments of the human heart; the fatality of the natural world against the unpredictability of human consciousness and action.

Why do all hammers look basically the same? Because there's an 'ideal hammer', and over time hammer design has asymptoted toward that 'ideal hammer' design. One can't just keep improving the design indefinitely - diminishing returns set in. So I suspect there is, to some degree, a Platonic 'ideal syntax' for a 'classic block-structured' programming language, and to me, C came pretty close to it. I except LISP from that assessment, because LISP is built around a fundamentally different model of how computations/algorithms are organized, and C and LISP aren't directly comparable. But that realization points to a deeper bug with the 'Platonic ideal language' concept above, which is that languages are fundamentally, albeit at a very deep conceptual level, tied to the basic concept of the computing hardware they are to run on.

The roffians versus the texans

the Chinese saying that it is difficult to find a black cat in a dark room, especially if there is no cat

A related argument is that engineering approaches are not applicable to cells because these little wonders are fundamentally different from objects studied by engineers. What is so special about cells is not usually specified, but it is implied that real biologists feel the difference. I consider this argument as a sign of what I call the urea syndrome because of the shock that the scientific community had two hundred years ago after learning that urea can be synthesized by a chemist from inorganic materials. It was assumed that organic chemicals could only be produced by a vital force present in living organisms. Perhaps, when we describe signal transduction pathways properly, we would realize that their similarity to the radio is not superficial. In fact, engineers already see deep similarities between the systems they design and live organisms

The Igon Value Effect

Integrated software -- today people think that means word processor, spreadsheet and database. That’s like, hamburger, fries and shake, that’s food... for somebody who only grew up with McDonald’s.

Kefir zhizni (Russian: "the kefir of life")

At one point, KV might have suggested that more stringent requirements, such as those used in the aerospace industry, might have been one way to ameliorate the dangers of software in the four-wheeled killing machines all around us, but then Boeing 737s started falling out of the air and that idea went out the window as well

In the case of Intel (x86), Ring -3 is the Intel Management Engine. It can turn on nodes and reimage disks invisibly. It has a kernel that runs Minix, as well as a web server and an entire networking stack. Because of this, Minix is the world's most widely used operating system.

The closer the real possibility of liberating the individual from the constraints once justified by scarcity and immaturity, the greater the need for maintaining and streamlining these constraints lest the established order of domination dissolve. Civilisation has to protect itself against the spectre of a world which could be free.

Emancipatory politics must always destroy the appearance of a ‘natural order’, must reveal what is presented as necessary and inevitable to be a mere contingency, just as it must make what was previously deemed to be impossible seem attainable

infinite scrolling pastel hellscapes of Zuckerberg and company

Shlemiel the painter’s algorithm

shithub

If the only tool you have is an analogy to a hammer, every problem looks like a story about nails.

They might never tell you it's broken

A year or two after I'd joined the Labs, I was pair programming with Ken Thompson on an on-the-fly compiler for a little interactive graphics language designed by Gerard Holzmann. I was the faster typist, so I was at the keyboard and Ken was standing behind me as we programmed. We were working fast, and things broke, often visibly—it was a graphics language, after all. When something went wrong, I'd reflexively start to dig in to the problem, examining stack traces, sticking in print statements, invoking a debugger, and so on. But Ken would just stand and think, ignoring me and the code we'd just written. After a while I noticed a pattern: Ken would often understand the problem before I would, and would suddenly announce, "I know what's wrong." He was usually correct. I realized that Ken was building a mental model of the code and when something broke it was an error in the model. By thinking about *how* that problem could happen, he'd intuit where the model was wrong or where our code must not be satisfying the model.
Ken taught me that thinking before debugging is extremely important. If you dive into the bug, you tend to fix the local issue in the code, but if you think about the bug first, how the bug came to be, you often find and correct a higher-level problem in the code that will improve the design and prevent further bugs.

You don't need a weatherman to know which way the wind blows

We petitioned, we demonstrated, we sat in. I was willing to get hit over the head, I did; I was willing to go to prison, I did. To me, it was a question of what had to be done to stop the much greater violence that was going on.

Physicists work in the basement of reality

In the midst of a network of particle interactions that we draw as a Feynman diagram, it often happens that one particle will trace a closed loop in time, become an anti-particle when traveling backward in time, and annihilate with itself at an earlier time. These interactions are common enough that we have a name for them: penguins. (Richard Feynman once complained that "penguin" was absurd: these diagrams don't look anything like penguins. The speaker responded that they didn't look anything like Richard Feynman, either.)

a divine spark

It hardly matters whether we run the formulas forward or backward—the whole is a single, self-consistent tapestry.

The problem with monolithic repositories is that they have no clear end. They have infinite growth possibility

There are parallel and series circuits in causality that our language tries to cram into all-or-nothing categories like "is responsible" and "isn't responsible."

The "Be reasonable, do it my way" approach does not work. Neither does the Esperanto approach of "let's all switch to yet a new language".

In English and French, the word "first" is not derived from the word "one" but from an old word for "prince" (which means "foremost"). Similarly, the English word "second" is not derived from the number "two" but from an old word which means "to follow". Obviously there is a close relation between "third" and "three", "fourth" and "four", and so on.

fait accompli

Don't seek to emulate the wise. Seek What They Sought.

Btrfs can survive acts of god. XFS is resistant to acts of god. FAT32 is too dumb to realize what is before it is an act of god, making it similarly resistant for all the wrong reasons.

I'm a Unix guy, so I'm thinking about the environment I grew up on. When we gave programmers a virtual machine image with (mostly) protected memory, the unit of computing was the process. But since it turned out that the kernel's attempts to isolate processes from each other failed, the unit of computing became the virtual machine image, and now the problem is maintaining the isolation between VMs running in a stack -- it's the same problem with just another level of indirection added. Today, we're heading toward disposable networks of disposable nonexistent machines.

The first message sent on the Moscow–Washington hotline on August 30, 1963, was the test phrase "THE QUICK BROWN FOX JUMPED OVER THE LAZY DOG'S BACK 1234567890". Later, during testing, the Russian translators sent a message asking their American counterparts, "What does it mean when your people say 'The quick brown fox jumped over the lazy dog'?"

cosmic schmuck

tradition is just peer pressure from dead people

I came to Regnecentralen in 1963 to work with Peter Naur and Jørn Jensen. The two of them worked so closely together that they hardly needed to say anything to solve a problem. I remember a discussion where Peter was writing something on the blackboard, when Jørn suddenly said “but Peter...” and immediately was interrupted with the reply “yes, of course, Jørn.”

In early computer systems, operators carried out most of these functions, but during the last fifteen years the programs that we call operating systems have gradually taken over these aspects of sharing.

Even with good design, w/o tests you will fear change and so the code will rot. With good tests, there’s no fear, so you’ll clean the code.

The Programmer as a Young Dog

the orange website

This reminds me of a Ken story from the late '90s. I was at a conference that I won't name where Ken gave a talk about his compression work; if I remember correctly his goal was to fit all of the Billboard Top 100 songs of all time onto a single CD. He showed us the big stack of disks that he made to give to us, but then said that, to his surprise, the lawyers refused to give permission. At that point he became very focused on messing with his slides while everyone got up, got in line, and took a disc. After the pile was gone Ken looked up and nonchalantly continued his talk.

danger, however, is agile shaman! many, many shiney rock lost to agile shaman!

Curiosity may have killed the cat, but it's always been my good friend.

The whole future lies in uncertainty – live immediately

that dog would not hunt

Borgmon readability

the problem with you guys is that, your search technology is so stupendously wonderful, that you have invented a human right. the right to do a google search.

you go to an employer and say "here I am, I just graduated from MIT, look at how white my teeth are! please give me a job. I am good at whatever."

This romantic vision of the luddite, potentially emancipating, power of computing was promptly smothered by military drones, surveillance hardware and the advertisement machinery. Suffice to say, that nowadays, to most of my friends, computing evokes either encroaching social networks or the drudgery of data entry.

I often think back on how, in the movie Hackers (1995), each character had their own laptop launch sequence reflecting parts of their aesthetics. — A far cry from today's disposable laptops, each equally adorned with proprietary-service stickers.

The Queen's Duck

the law, in its majestic equality, forbids rich and poor alike to sleep under bridges, to beg in the streets, and to steal their bread

dieselgate

Back in the days of ARPANET mailing lists, there used to be an "educational" mailing list called "please-remove-me", that was for people who asked an entire mailing list to remove them, instead of removing themselves, or sending email to the administrative "-request" address. So when somebody asked an entire mailing list to remove them, somebody else would add them to the "please-remove-me" mailing list, and they would start getting hundreds of "please remove me" requests from other people, so they could discuss the topic of being removed from mailing lists with people with similar interests, without bothering people on mailing lists whose topics weren't about being removed from mailing lists. It worked so well that it was a victim of its own success: Eventually the "please-remove-me" mailing list was so popular that it got too big and had to be shut down...

Jordan K. Hubbard rwall incident

Lena Image

The command line is like language. The GUI is like shopping.

The solution is simple: don’t estimate! Don’t spend the time trying to work out how long something will take! If you really have to, I quite like no bullshit estimation - basically there’s only three types of story: 1, TFB (Too Fucking Big), NFC (No Fucking Clue).

Brussels bureaucracy

It's definitely one of the more pernicious attempts to re-enchant the world after the death of god

But in the new millennium, the Internet is poised to trump each and every one of these prior “liberations” of media into content, because the Internet is making content of them all

Debt, as Doctor Faustus shows us, is to market societies what hell is to Christianity: unpleasant yet indispensable

In civilizations without boats, dreams dry up, espionage takes the place of adventure, and police take the place of pirates.

Arguing with an engineer is like mudwrestling a pig: after an hour or so you're covered with mud... then, suddenly, you notice the pig's really enjoying itself

"You can't grep dead trees" -- hacker saying

Streams or STREAMS (as Ritchie famously said, it "means something different when shouted")

> If we have a whole section for rogue(6), anything can get its own section ;)
I don't regard this section's existence as waste or excess, but as a litmus test for organizations that deploy *nix systems. If the people in the C suite insist that this stuff be chopped out "because IT is serious business", you know that you're dealing with an organization that is more dedicated than most to ensuring that individual contributors experience their labor as a drudge.

Ichimoku cloud

We intuitively know that the tea itself is probably a nothing in and by itself, and that it probably does nothing in and by itself, but that this is a nothing we can ritualise and return to as a refuge from the pressures of the day to day. In a world overstuffed with disorder and frantic activity, calm is found not in a location but in a ritual. It is found by enjoying an end-in-itself pleasure that promises nothing but itself. And that’s all it needs to be.

You might assume that, if a credit card is stolen/hacked and used by a bad actor to buy something, the cardholder would be liable. They will suffer the first loss, certainly, but society has decided by regulation (specifically, Regulation E) that that loss should flow to their financial institution, less a $50 I-can’t-believe-it’s-not-deductible. As a marketing decision, the U.S. financial industry virtually universally waives that $50

friends are the only motherland

search engine incantation

Don't confuse familiarity with simplicity

Licklider grew frustrated with the long hours that he, as a scientist, spent 'getting into position to think'. Like Engelbart, Licklider imagined that computers might evolve into machines to help scientists -- if only there were a better way to link scientists with computers than punch cards and printouts.

Really, we just need a progress meter that says "don't bother getting up", "get lunch now", or "come back tomorrow", but I don't know that there is one.

Copy/paste are the biggest neglected operations in computing, considering that they are the only universal mechanism for transferring data between applications that is accessible to end users.

And now for something completely different

inside baseball

Make reversible decisions as soon as possible. Make irreversible decisions as late as possible.

With that name, I would have bet money that it came from the "suckless" people, who seem to be on a mission to occupy every cell in the two-letter Unix command name space that Ken Thompson left vacant, but apparently not.

different byte sex

I'm sorry if this is incoherent, I don't have time to try to make it clearer.
- Leslie

In my eulogy of Barry Boehm last week in this blog, I mentioned from memory an exercise in his classic Software Engineering Economics textbook (Prentice Hall, 1981). Now I've got my hands on the book itself and found that it is even better than I remembered. So here is the exercise text in its entirety. It is exercise 1.4, page 9. I hope that citing one exercise from a 1981 textbook, even in full, falls under fair use. They do not write textbooks like this anymore!
Mary Jones in the Accounting Department tells you that she has a file of personnel records [2022 update: an Excel sheet] and asks you to develop a program to compute the median age of the personnel in the file. Here are four different ways you might respond:
(a) Invoke a sort routine which sorts the personnel file by increasing age. Count the number, N, of personnel records in the file. Find record N/2 in the sorted file, extract its value for "age".
(b) Note that in The Art of Computer Programming [Knuth, 1973], Vol. III, pp. 209-220, the problem of obtaining the median is a special case of the problem of finding the ith largest of N numbers, and that R.W. Floyd has formulated a recursive method of obtaining the median in an average of (3/2)N + O(N^(2/3) log N) comparisons. Spend a couple of weeks attempting (unsuccessfully) to improve on Floyd's algorithm, a day programming Floyd's algorithm, and then return to Mary Jones with some questions on the size and format of her file.
(c) Ask Mary how soon she needs the results, how much she is willing to pay for them, how many records, N, her personnel file has, and how often she will be making such runs. If N is large, and she wants results often, quickly, and cheaply, ask her if she would be satisfied with the mean value, which is much easier and cheaper to calculate. If not, work with Mary to tailor an approach to obtaining the median which is the best compromise between her various objectives.
(d) Compute the mean age, and print it out as the median. It's much easier to program, and Mary Jones will probably never notice the difference.
Rank the responses in the order of their relative concern with programming considerations, economic considerations, or other important considerations. If you were the programmer, which approach would you prefer? If you were Mary Jones, which approach would you prefer?
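
For what it's worth, response (a) is a few lines of Python these days; the file name and column header below are invented, and I take Boehm's N/2 literally.

    import csv

    # Response (a), more or less literally: sort by age, take record N/2.
    with open("personnel.csv", newline="") as f:
        ages = sorted(int(row["age"]) for row in csv.DictReader(f))

    print(ages[len(ages) // 2])  # a statistician would average the two
                                 # middle values when N is even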

fall on an outstretched hand (FOOSH)

fecalith

the easiest way to censor someone online is through harassment, or DDoS attacks - i.e. have a bunch of people shout at you until you shut the fuck up

convicted vapist

if you do this stuff you gotta go in weird places

The starting point of these reflections was usually a feeling of impatience at the sight of the 'naturalness' with which newspapers, art and common sense constantly dress up a reality which, even though it is the one we live in, is undoubtedly determined by history. In short, in the account given of our contemporary circumstances, I resented seeing Nature and History confused at every turn, and I wanted to track down, in the decorative display of what-goes-without-saying, the ideological abuse which, in my view, is hidden there.

There is some Unix quote that God wrote the first device driver and people have copied that and tweaked it for their device ever since

Just another Perl hacker,

I believe programming, in this regard, can learn something from writing: when writing the first core of a new system, when the original creator is still alone, isolated, able to do anything, she should pretend that this first core is her only bullet. During the genesis of the system she should rewrite this primitive kernel again and again, in order to find the best possible design. My hypothesis is that this initial design will greatly inform what will happen later: growing organically something that has a good initial structure will result in a better system, even after years of distance from the original creation, and even if the original core was just a tiny fraction of the future mass the system would eventually assume.

As somebody said, the best code is written when you are supposed to do something else

And if you have written code that is not just a “filler” for a bigger system, but a creation of your own, you know that writer's block also happens in programming. The only difference is that for most people you are an engineer, hence, if you don’t work, you are lazy. The same laziness, in the case of an artist, will assume the shape of a fascinating part of the creative process.

I feel as a chessman must feel when the opponent says of it: That piece cannot be moved.

I'm annoyed as well when 'man some_command' tells me to RTFTID (where TID is 'TexInfoDocumentation' and you know what RTF stands for :-) ).

The most ludicrous of all ludicrous things, it seems to me, is to be busy in the world, to be a man who is brisk at his meals and brisk at his work. Therefore, when I see a fly settle on the nose of one of those men of business in a decisive moment, or if he is splashed by a carriage that passes him in even greater haste, or the drawbridge tilts up, or a roof tile falls and kills him, I laugh from the bottom of my heart. And who could keep from laughing? What, after all, do these busy bustlers achieve? Are they not just like that woman who, in a flurry because the house was on fire, rescued the fire tongs? What more, after all, do they salvage from life's huge conflagration?

Ask me what you wish; just do not ask me for reasons. A young girl is excused for not being able to state reasons; she lives in feelings, it is said. It is different with me. Ordinarily I have so many and most often such mutually contradictory reasons that for this reason it is impossible for me to state reasons. It also seems to me that with cause and effect the relation does not hold together properly. Sometimes enormous and gewaltige [powerful] causes produce a very klein [small] and insignificant little effect, sometimes none at all; sometimes a nimble little cause produces a colossal effect.

When I was very young, I forgot in the Trophonean cave how to laugh; when I became an adult, when I opened my eyes and saw actuality, then I started to laugh and have never stopped laughing since that time. I saw that the meaning of life was to make a living, its goal to become a councilor, that the rich delight of love was to acquire a well-to-do girl, that the blessedness of friendship was to help each other in financial difficulties, that wisdom was whatever the majority assumed it to be, that enthusiasm was to give a speech, that courage was to risk being fined ten dollars, that cordiality was to say "May it do you good" after a meal, that piety was to go to communion once a year. This I saw, and I laughed.

It takes a lot of naivete to believe that it helps to shout and scream in the world, as if one's fate would thereby be altered. Take what comes and avoid all complications. In my early years, when I went to a restaurant, I would say to the waiter: A good cut, a very good cut, from the loin, and not too fat. Perhaps the waiter would scarcely hear what I said. Perhaps it was even less likely that he would heed it, and still less that my voice would penetrate into the kitchen, influence the chef, and even if all this happened, there perhaps was not a good cut in the whole roast. Now I never shout anymore.


My thoughts, aphorisms, and epigrams

History is repeating itself within the computing microcosm. A computer is a general-purpose device, unlike typical mechanical devices, which are built statically and cannot be re-programmed (e.g. a clock). But we are growing weary of general-purpose. While we were busy trying to forge tools abstract enough to do everything, we lost sight of the virtues of specialization. Approaching are data-oriented design, hardware and software designed with each other in mind (e.g. the Oxide computer company), domain-specific hardware, domain-specific languages, and so on. One-size-fits-all is a compromise. In general, humans are craving analog devices (i.e. not a computer simulating the behavior of a device) -- something important is lost in simulation. Return to monke.

com + putare, to reckon together