[ in a state of controlled semantic decay ]
Things fall apart; the centre cannot hold. If you meet Mr. Man on the Tao of Recursion, a student once came to Buddha and told him a story: “A buddha once came to the moon, killing my whole life with his words, softly (two times, two times).” In fact, if he started to tell us about his wife we should get impatient; all we expect of him is that he should know the way and be able clearly to explain it.
It remains to show that Fame and Glory are total. To obtain your own copy of a platonic universe, make the mouse appear. Animate the universe with the five platonic mice and their three combinations. They are being ended by the Masters redirecting energies; well, not exactly, I sit in my shorts, aging, drinking beer, the windows open, and they look at me, 6 p.m., coming in from their little jobs… There's no end to termination.
Below: materials for ideological instruction, followed by miscellaneous, half-done, ramblings on the merry old muddle vs. systematically sound discomfort, object orientation, and related issues. Plus an extra serving of quotes emanating from the computational tædium (the unfortunate superabundance in computing practice of many 0s drowning the rare 1s).
Programming languages have to strike a balance between power vs. intelligibility, ease of expression vs. ease of understanding. Do they let you say much in few words, or understand much in short time? Trade-offs can go wrong in many ways, but both bad and ugly fall under the law: Caked mud is no building material apt for transparent constructions.
Clean and the Clean system from Nijmegen feature an advanced type system, fast type inference and compilation, plus a terse, adequately sugared syntax. Its uniqueness types (integrating destructive update into an immutable world) can be a serious pain at times, but that's the fate of pure functional languages: purity engenders either semantic hypocrisy or contortion of fingers and mind, or both.
Clean's past as an intermediate compiler language shows through here and there, but even though some neglected spots smell like C, its general nicety prevails.
Sample code snippet —the Horner scheme for evaluating polynomials p(x) = a₀ + a₁x + … + aₙxⁿ given by their coefficients a₀, …, aₙ:
// e.g. evapo 10 [1,4,6,7] ==> 7641
evapo :: a !.[a] -> a | *, +, zero a
evapo b coeffs = foldr hoernchen zero coeffs
where
    hoernchen x y = x + b*y
or –replacing hoernchen and its local def with a λ-expression–
= foldr (\x y = x + b*y) zero coeffs
Gödel's most pronounced difference from Prolog is data abstraction in meta-programming. (Prolog explicitly represents clauses and rules by Prolog structures aka terms.) While other descendants of Prolog yield to the functional temptation, Gödel stays true to first order logic and the prolidigm paragram: programs –first order theories, computing –deduction.
Sample code snippet implementing the Horner scheme:
PREDICATE Evapo : Integer * List(Integer) * Integer.
%   intent        in        in              in/out
% e.g. query      <- Evapo(10, [1,4,6,7], x).
%      answer(s)  x = 7641

% naive right fold by foot
Evapo(b, [], 0).
Evapo(b, [c|cs], p) <- p = c + b*p0 & Evapo(b, cs, p0).
Scheme introduced lexical scoping (block structure) and feature minimalism from Algol into the Lisp world. Today Scheme offers first-class continuations and a high-level macro facility allowing us to add transformation rules to the grammar. Its simple syntax, a standard representation for Scheme forms (expressions and definitions), and the built-in parser let us easily write Scheme programs handling Scheme programs —an empowering feature most programming languages lack (excepting notably Lisp and Prolog). Some dialects/systems sport advanced ideas about (soft) typing, modules, or compilation techniques.
Briefly, Scheme's half stuck in the lispy muck it emerged from and –as far as the standards go– not very useful. However, it spawned great systems and texts with a unique flair, leaving the clattering two-stroke contemporaries far behind.
Most things schemish are gathered at Schemers.org. Go forth and see how to apply Scheme to your life so that it becomes rich and happy. Favourite intricacies: dirty hygiene, continuation blues, recursion without naming functions (Why Y works · sample chapter from The Little Schemer).
Sample code snippet implementing the Horner scheme:
;; e.g. (evapo 10 '(1 4 6 7)) ==> 7641
(define (evapo b coeffs)
  (fold-right (lambda (x y) (+ x (* b y)))
              0
              coeffs))
Scsh protects nerves and gastric membranes from the hard suction of Unix, and Unix does suck hard. “Without Scsh, my life would be an endless night of bleakness and despair.” (Alan Bawden)
CL was shaped by engineering concerns more than by the quest for conceptual clarity. But despite its practical applicability, CL offers the fun sadly absent from mainstream hacking. The ANSI spec is available online.
Sample code snippet implementing the Horner scheme:
;; e.g. (evapo 10 '(1 4 6 7)) ==> 7641
(defun evapo (b coeffs)
  (reduce #'(lambda (x y) (+ x (* b y)))
          coeffs
          :initial-value 0
          :from-end t))
“the first book on programming as the core subject of a liberal arts education”
“First, we want to establish the idea that a computer language is not just a way of getting a computer to perform operations but rather that it is a novel formal medium for expressing ideas about methodology. Thus, programs must be written for people to read, and only incidentally for machines to execute.”
“We find that a good programming style requires using programming concepts that are usually associated with different computation models. Languages that implement just one computation model make this difficult: Object-oriented languages encourage the overuse of state and inheritance. Objects are stateful by default. Functional languages encourage the overuse of higher-order programming. Typical examples are monads and currying. Logic languages in the Prolog tradition encourage the overuse of Horn clause syntax and search. These languages define all programs as collections of Horn clauses. Many algorithms are obfuscated when written in this style. These examples are hints that none of these models is a panacea when used alone.” (abridged)
“Programming languages are some of the most thoroughly designed artifacts in computer science. Therefore, the study of programming languages offers a microcosm to study design itself.”
“Syntax is the Viet Nam of programming languages.”
“If you don't understand interpreters, you can still write programs; you can even be a competent programmer. But you can't be a master.”
“After all, civil engineering did not advance because some people intuitively designed successful bridges.”
“Perhaps the most interesting conclusion of the experiment is that it does seem a viable approach to express all laws as equations between functions, and to use a simple equational logic for proving results.”
“How can the professor ensure that the president gets invited to her own party? … Who the devil's that inside the winding-sheet?”
“The premise of this book is that you can only write something useful and interesting when you both understand what makes good writing and have something interesting to say. This holds for writing programs as well as for writing prose.”
“On the second ring they both landed, each grabbing a piece of the receiver. They rolled over and over on the rug, breathing heavily, all legs and arms and bodies in a desperate juxtaposition, and reflected that way in the full-length mirror overhead.”
“Fun is mathematically simple . . .”
“Storm the Reality Studio and retake the universe.”
“Be clear about the difference between your role as a programmer and as a tester. The tester in you must be suspicious, uncompromising, hostile, and compulsively obsessed with destroying, utterly destroying, the programmer's software.”
“The programmer builds from pure thought-stuff: concepts and very flexible representations thereof. Because the medium is tractable, we expect few difficulties in implementation; hence our pervasive optimism. Because our ideas are faulty, we have bugs; hence our optimism is unjustified.”
“Po fat people get paid more.”
“There is a zen to writing, and, like ordinary zen, its simply stated truths are meaningless unless you already understand them —and often it takes years to do that.
Sure, read Elements of Style and every book on writing you can get your hands on, but there really is only one I've seen that tries to teach what it means to omit needless words, to write clearly and simply. That book is Style: Toward Clarity and Grace by Joseph M. Williams (1990). Williams seems to know what makes writing clear and graceful and he can explain it. Sometimes he explains where bromides like avoid passives come from and tells us how to figure out when to ignore them —for example, when it's a good idea to use passives. He does this by providing a theory of clear writing that we as scientists can use. If you decide to read only one book on writing, this is the one.”
profusely illustrated account of the world based on cellular automata / simple computations exhibiting complex behaviour
Futurist software tends to degrade when not treated with its daily dose of love and care and amphetamine, a fluo-green mucus the frightful end.
The dissolved web directory listed, among others, Mark Van de Walle on the artificial intelligence and Wagner James Au on the experimental theology involved.
“The only particular thing I remember is I figured that if Alan Kay works there, I bet I don't have to wear a tie. I was right.” (Doug Fairbairn)
Sample arithmetic black box functions and guess an adequate implementation.
Mine the store of an automaton for precious symbols.
Survive and win a λ-poker tournament.
Make money in the automobile business mixing sauce and building picky cars to run on it.
Move satellites around ultimately to collect the twelve apostles from the skies.
Optimise martian rover control in the face of the locals, their gods, and potholes.
Application unexpectedly quit eating genetically modified cows.
Submit many valuable compressed sandstone publications.
Program subversive robbers and cooperative coppers to score in a game with black monolithic variations.
Wire a competitive ant brain.
Drive a car around a set of racing tracks fast.
Implement a player delivering packages to their destination in a multi-player robot game.
Optimise markup documents for simplicity and size.
Implement a ray-tracer rendering scenes described in a simple functional modeling language.
Optimise computer-controlled interactive fiction characters for battery life.
Write a program that plays “pousse,” an odd variant of tic-tac-toe.
with built-in coin slots, lotion and tissue dispenser, faux leopard fur-lined inside, …
Let the control flow – on the donut, the kleinian bottle, whatever.
We can forgive a man for making a useful thing as long as he does not admire it. The only excuse for making a useless thing is that one admires it intensely.
Oscar Wilde: The Picture of Dorian Gray
[ Fragmentary sketch of similarities and characteristic differences of Common Lisp and Scheme. I am going to argue –someday– that software should sport two conflicting qualities: dynamism and intelligibility, since it must cope with change and exceptions, but should nonetheless be easily analysable at write-time. Common Lisp offers strong dynamic features but not much leverage for static analysis. Scheme, the language and the culture that goes with it, promises a mathematically more disciplined dynamism —but fails to deliver, to some extent. Has the vanilla pudding of confusion gotten the better of us? Read further and you won't find out. ]
Scheme emerged and has departed quite notably from the Lisp world. Both Scheme and Lisp feature an approach to data based on structured external representations (by bracketed expressions), a built-in parser, and a well-supported concrete data-type for the parse-tree. (Avoiding the syntactic pain of XML or SGML and the rectal pain of object-request brokers.) Since the data may well be code, Scheme and Lisp make it quite trivial to build interpreters for specialised domain languages or to embed these languages into Lisp/Scheme.
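To make the point concrete for readers outside the parenthesised world, here is a sketch in Python of what Lisp and Scheme get built in: read bracketed expressions into an ordinary tree, then treat that tree as a program for a tiny embedded arithmetic language. All the names (tokenize, parse, evaluate, OPS) are illustrative inventions, not part of any of the systems discussed above.

```python
# Minimal sketch (not any particular Lisp): an s-expression reader plus a
# tiny evaluator, showing how "data may well be code" once a standard
# external representation and a parser exist. Names are illustrative only.

def tokenize(text):
    # Split "(+ 1 (* b 4))" into a flat list of tokens.
    return text.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    # Consume tokens, building nested lists for bracketed expressions.
    token = tokens.pop(0)
    if token == "(":
        tree = []
        while tokens[0] != ")":
            tree.append(parse(tokens))
        tokens.pop(0)  # drop the closing ")"
        return tree
    try:
        return int(token)
    except ValueError:
        return token  # a symbol

OPS = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

def evaluate(form, env):
    # Interpret the parse tree as a program of the little language.
    if isinstance(form, int):
        return form
    if isinstance(form, str):
        return env[form]
    op, *args = form
    return OPS[op](*(evaluate(a, env) for a in args))

# The parse tree is ordinary list data -- easy to inspect or rewrite --
# and at the same time a program:
tree = parse(tokenize("(+ 1 (* b 4))"))
print(tree)                       # ['+', 1, ['*', 'b', 4]]
print(evaluate(tree, {"b": 10}))  # 41
```

In Scheme or Lisp the whole first half of this sketch disappears: the reader and the list data type come with the language, which is exactly the convenience the paragraph above describes.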
Actually, (Common) Lisp even makes forms (definitions and expressions) object-level values and assigns the dependent role to their textual representation. On the whole, CL is geared toward the runtime system, one and all, placing the problem at hand and meta-concerns such as parsing, evaluating, and inquiring code on the same footing. Scheme, by its spirit —to be grasped in the lambda experience or mystical onion— more than by its rudimentary definition, tends to separate these concerns, supporting arguably more intelligible layered architectures while retaining much of the power of Lisp.
Let's finally rave on the merits of sound engineering principles. First case study: doc strings. In Common Lisp, the programmer may stick a string documenting a function into its definition. The system prints that string when asked for a description of the (named) function. This comes in handy for “library” functions (as opposed to internal machinery). At the same time, it requires the language not only to assign meaning to programs but to specify (other) properties of the runtime system, too. (In other words, it complicates the meaning of programs.) Scheme, in contrast, leaves the job of organising documentation to the development environment, where it belongs. As a result, Scheme programmers usually fall back on the classical means, source comments and text search. Second case study: candied pumpkin. Makes about three cups.
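Python, as it happens, adopted the Common Lisp side of this trade-off, so it makes a convenient sketching ground: the documentation string sits inside the definition and the runtime object hands it back on request. The evapo function below just restates the Horner example from above; nothing here is from any of the discussed systems.

```python
# Sketch of the doc-string design decision (Python shares it with Common
# Lisp): the documentation travels with the runtime object itself,
# rather than living in comments for the editor to find.

def evapo(b, coeffs):
    """Evaluate the polynomial with the given coefficients at b (Horner)."""
    result = 0
    for c in reversed(coeffs):
        result = c + b * result
    return result

# The runtime, not the development environment, serves the description:
print(evapo.__doc__)            # the string from the definition
print(evapo(10, [1, 4, 6, 7]))  # 7641
```

The cost is exactly the one named above: the language specification must now say something about the runtime system (that this string is retained and retrievable), not merely about what programs compute.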
The object movement brought to imperative programming a multi-agent view of problem domains as well as programs: compositions of stateful thingies that interact, offering a limiting but easily communicable basic metaphor. This isn't worthless, but it's nothing to get too excited about either. Objects localise meaning: agents respond to messages by their own lights. But even thoroughly object-oriented systems impose global constraints or conventions (such as: classes cannot/must not override the identity method), effectively shifting semantic authority from the objects to the language. OO proponents paid much attention to the structure of large-scale development efforts, but generated a plethora of absurd provisions, tools, and subordinate buzz. Object technology was a marketing success, but there's no silver bullet.
Objects favour analogical thinking and may induce a deplorable lack of precision. But it's not the fault of object thinking that UML, attempting a pictorial notation for behaviour, ended up with flow-charts. And books such as Martin and Odell's Object-Oriented Methods: A Foundation aren't dead, they just smell like jazz, in their wordy ways. By contrast, Abadi and Cardelli's Theory of Objects or Castagna's Object-Oriented Programming: A Unified Foundation, with their dear side salad of greco-roman formulae, seem to serve up real content. (Haven't read these yet, just got a good impression on skimming through the pages.)