Not a clock — a computer

A Mandelbrot set • Wolfgang Beyer (CC BY-SA 3.0)

A story by Barry Cipra in [the Mar. 25, 1988] issue of Science draws attention to a looming “crisis in mathematics.” The crisis has to do with the teaching of calculus, the branch of mathematics that has long been the cornerstone of education in science and engineering.

According to Cipra, students are coming to college calculus with poor backgrounds in algebra, geometry, and trig. Moreover, students do not understand why they must learn calculus when most of the relevant problems can be solved more expeditiously with sophisticated new hand-held calculators or computers.

I remember with delight my own undergraduate experience with calculus, back in the days when the most sophisticated computational device available to a student was a slide rule. I was dazzled by the beauty of the subject, impressed by its power to solve problems, overwhelmed by its philosophical ingenuity.

So enthralled was I by calculus that when I married I immediately set out to remedy what I perceived to be the most glaring gap in my spouse’s education. No one, I thought, could be truly happy without knowing calculus. What followed is best left unwritten. The marriage somehow survived my foolishness, and my spouse has managed to live a full and happy life without the calculus.

Mozart and the Brooklyn Bridge

I still think the calculus is one of the grandest achievements of the human mind, right up there with things like Mozart’s Requiem, Jefferson’s Declaration, and Roebling’s Brooklyn Bridge. But looking back, I realize that I seldom used calculus in my career. In my teaching, yes, but not in research or in everyday life. And I would guess that my experience is typical of the majority of students who have successfully traversed the casualty-strewn minefield of Calculus 101–102.

So why does calculus loom so large in the education of scientists and engineers? The answer, of course, is that since the 17th century the laws of nature as we understand them have been expressed in the language of calculus.

The calculus was invented by Newton and Leibniz precisely to treat problems of motion and change that arose with the invention of modern science. There was a close correspondence between the new mathematics and a new metaphor that 17th century scientists used to describe the world. In the view of Newton’s contemporaries, the world is a machine, a sort of elaborate mechanical clock, set going by the Great Clockmaker. The machine runs smoothly in continuous space and time, and calculus is the language that best describes continuous change.

But wait! Is it possible that a new metaphor for describing the universe is emerging in our own time? Is there something in our lives — now, in the late 20th century — more impressive to us than were mechanical clocks to the contemporaries of Newton? And who is this man, featured on the cover of [the Apr. 1988] issue of The Atlantic, who says that the universe is not a clock but a computer, contrived and programmed by the Great Programmer?

Pure digital data

The man is Edward Fredkin, an iconoclastic computer scientist who for a time was associated with the MIT Laboratory for Computer Science, and he is serious. Fredkin imagines a universe that is not made of matter and energy distributed in continuous space and time, but of elaborate patterns of discrete bits, pure digital information — ones and zeros, if you like — ticking away, and changing according to a programmed rule, like bits stored in the memory of a computer. There is a new mathematics behind these patterns of changing bits. It is called the theory of cellular automata, and many physicists, at MIT and elsewhere, are exploring its potential.
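To make the idea concrete (this is a generic illustration, not Fredkin's own model): a one-dimensional cellular automaton updates a row of bits in lockstep, each cell consulting only itself and its two neighbors. The sketch below uses the well-known "Rule 110" update table.

```python
# A one-dimensional cellular automaton: a row of bits evolves by a fixed
# local rule. Here the rule number's own binary digits serve as the
# lookup table (Wolfram's "Rule 110" numbering scheme).
RULE = 110

def step(cells):
    """Apply one synchronous update to a row of 0/1 cells (wrapping edges)."""
    n = len(cells)
    return [
        # left*4 + center*2 + right picks one of the rule's 8 bits
        (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Start from a single live bit and watch a pattern grow, tick by tick.
row = [0] * 31
row[15] = 1
for _ in range(8):
    print("".join(".#"[c] for c in row))
    row = step(row)
```

Every cell obeys the same tiny programmed rule, yet the rows that scroll past can be intricate and surprisingly lifelike.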

The cellular-automata theorists have had a few successes, but they have yet to find digital “laws of nature” that describe the world with anything remotely resembling the success of conventional physics, and until they do, Edward Fredkin’s extravagant views must be considered wild, even theological, speculation. For the time being, calculus remains the premier language of science.

But I wouldn’t sell short the idea of “the world as a computer.” Computers have dramatically changed the way science is done, and cellular automata — those little universes of flickering bits in the memories of computers — have delightfully mimicked certain aspects of the real world. So have fractal geometries, another rapidly emerging field of computer-based mathematics. And there is certainly more to come.
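A taste of that fractal mimicry, as a minimal sketch: the Mandelbrot set pictured above is simply the set of complex numbers c for which repeatedly computing z squared plus c never runs off to infinity. (The escape radius of 2 and the cutoff of 50 iterations are the conventional choices for a quick picture.)

```python
# Crude text rendering of the Mandelbrot set: a point c is kept if the
# iteration z -> z*z + c stays bounded (here: |z| <= 2 after 50 steps).
def in_mandelbrot(c, max_iter=50):
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False
    return True

# Scan a small grid of the complex plane and print the familiar silhouette.
for y in range(-10, 11):
    print("".join(
        "*" if in_mandelbrot(complex(x / 20, y / 10)) else " "
        for x in range(-40, 11)
    ))
```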

The more significant computers become in our lives, the more likely they are to dominate our imaginations. The computer is a provocative metaphor, and a good metaphor is a powerful stimulus to creativity.

Today’s undergraduate scientists and engineers need calculus if they are to understand the way the world works. But the day may come when the calculus of Newton and Leibniz will go the way of slide rules and mechanical clocks. When everything from timekeepers to music has been “digitized,” can physics be far behind? Maybe somewhere, right now, a foolish young physicist, hooked on the theory of cellular automata, is trying to inspire a reluctant spouse to learn the new mathematics of permuting bits.
