In the last post I explained how temporal logic came to the rescue and enabled equations like
next(I) = I+1
to be interpreted as real equations. In this temporal logic, variables like I change with time – the meaning (denotation) of I is the whole sequence <0,1,2,…> of values that I takes on through its lifetime. Then the mysterious operator next loses its mystery. The operator next maps sequences to sequences in such a way that the t-th value of next(I) is the (t+1)-th value of I. Even “+” denotes an operation on sequences, one that adds the components pairwise. For that matter, even the numeral “1” denotes the constant sequence <1,1,1,…>.
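This reading can be sketched in a few lines of Python, modeling the infinite sequences as generators (my choice of notation, not anything from Lucid's actual implementation; the names const, nxt, and add are mine):

```python
from itertools import count, islice

def const(c):
    """The numeral "1" denotes the constant sequence <c, c, c, ...>."""
    while True:
        yield c

def nxt(stream):
    """next(I): the t-th value of nxt(I) is the (t+1)-th value of I."""
    next(stream)          # discard the first value
    yield from stream

def add(xs, ys):
    """"+" on sequences: add the components pairwise."""
    for x, y in zip(xs, ys):
        yield x + y

# I = <0, 1, 2, ...>; the equation next(I) = I + 1 holds pointwise:
print(list(islice(nxt(count(0)), 5)))            # [1, 2, 3, 4, 5]
print(list(islice(add(count(0), const(1)), 5)))  # [1, 2, 3, 4, 5]
```

Both sides of next(I) = I+1 produce the same stream of values, which is exactly what interpreting it as a "real equation" requires.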
This revelation opened up a whole universe of possibilities. Why limit the (formerly) mysterious operators to first and next? We immediately found two useful additions (described last entry), namely assoonas to extract values from loops, and fby to combine recurrence equations into a single equation.
The next step was to allow user-defined temporal operators, most interestingly with definitions combining recursion and other temporal operators. For example,
assoonas(X,P) = if first(P) then first(X) else assoonas(next(X),next(P))
However assoonas can be defined in terms of a more general operator whenever, that selects those values of its first argument corresponding to true values of its second:
whenever(X,P) = if first(P) then first(X) fby whenever(next(X),next(P)) else whenever(next(X),next(P))
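Continuing the generator sketch (again, an illustration of the semantics in Python, not how Lucid was implemented): whenever passes along the values of its first stream at the positions where its second stream is true, and assoonas is simply the first value whenever produces.

```python
from itertools import count, islice

def whenever(xs, ps):
    """Values of xs at the positions where ps is true."""
    for x, p in zip(xs, ps):
        if p:
            yield x

def assoonas(xs, ps):
    """The first value of xs at a position where ps is true."""
    return next(whenever(xs, ps))

# whenever(<0,1,2,...>, <T,F,T,F,...>) selects the even positions:
evens = whenever(count(0), (n % 2 == 0 for n in count(0)))
print(list(islice(evens, 4)))                    # [0, 2, 4, 6]

# assoonas(<0,1,2,...>, <F,F,F,F,T,...>) extracts the first hit:
print(assoonas(count(0), (n > 3 for n in count(0))))  # 4
```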
The operator whenever takes us far from the original idea of simple iterative loops defined by recurrence equations and implemented (presumably) by translating into while loops and assignment statements.
Fortunately another computational model was available: data flow. A dataflow computation takes place in a network of processing stations connected by tubes or channels. Streams of data flowing in the channels are transformed as they pass through the processing stations.
Ed Ashcroft and I did not invent dataflow. We learned about it from Gilles Kahn and Dave MacQueen, and from Doug McIlroy’s pipeline features of UNIX. But it seemed to fit our Lucid language perfectly, with streams corresponding to sequences and filters to temporal operators.
We could write some amazing programs consisting of a few equations, like this one
P = sieve(N)
N = 2 fby N+1
sieve(X) = first(X) fby sieve( whenever(X, (X mod first(X) ne 0)))
that generates the stream of all primes (P).
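The same program can be sketched in Python, modeling each stream as a generator (an illustration of what the equations mean, not the original implementation): sieve yields the head of its input, then recursively sieves the survivors, just as the fby/whenever definition says.

```python
from itertools import count, islice

def sieve(xs):
    """sieve(X) = first(X) fby sieve(whenever(X, X mod first(X) ne 0))."""
    head = next(xs)       # first(X)
    yield head            # ... fby ...
    # whenever(X, X mod first(X) ne 0), sieved recursively:
    yield from sieve(x for x in xs if x % head != 0)

P = sieve(count(2))       # N = 2 fby N+1; P = sieve(N)
print(list(islice(P, 10)))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

Each recursive call to sieve corresponds to a new processing station being spliced into the dataflow network as the computation proceeds.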
We were euphoric about the possibilities and our enthusiasm infected some other people as well. One of them was David May, then a grad student at the University of Warwick where I was a lecturer. (He went on to invent Occam and the transputer.) David decided to implement Lucid by simulating dataflow networks, for example, maintaining the queues that might accumulate along the channels connecting processing stations. Even recursive definitions like the one above were not expected to cause problems; they required only the ability to incrementally grow the net as the computation proceeds.
However, one day we met in the Warwick Art Centre cafeteria for lunch. As I was selecting my cheese sandwich, I realized there was something on David’s mind. “We have a problem,” he said, ominously. (to be continued)