"The Calculation of Easter Day, and the Origin and Use of the Word 'Computer'"

Like so many English words, "computer" derives from Latin and therefore traces its origins back many centuries. A link exists between it and Christianity's greatest feast, Easter. In fact, the word computer, or at least its Latin equivalent, has long been connected to astronomy, time, and the calendar. This article gives a short history of the word, beginning from its use in early Roman times up to the introduction of the digital computer.

In the introductory chapter of his book "The Ordering of Time" 1, historian Arno Borst makes the point that few people are aware of the true origin of the word "computer." We withdraw money from our bank accounts and use our PCs without noticing that the words we use to refer to a bank account (all similar in many European languages) and the word computer are derived from "computus," a centuries-old word that has had various connotations, and has troubled the mind of many a scholar. Borst goes on to criticize those computer specialists who, in his opinion, "know nothing about the past history of the word that constitutes their motto for the future." 2 Of course, whether this harsh criticism is warranted is questionable, but in all fairness, there has been, at least until quite recently, scarcely any literature that properly explains the etymology of computer; and the historian supports his claim by noting that several scholarly works on computing overlook the word's etymology.

According to Borst, the word "computare," which meant "to reckon up," or "to count on one's fingers," was already in use in early Roman times. This word frequently accompanied the word "numerare," which had a similar meaning. Later, the word "calculare" was added to indicate the counting of numbers with beads (or pebbles). Like the cultures before them, the Romans often calculated by placing and moving pebbles ("calculi") around a flat surface ("abacus") marked out in squares. 3,4 The word "calculus" as used in the modern mathematical sense is a direct derivative. Other related words included "computatio" and "calculatio," both apparently having had varied semantics.

In many dictionaries, the etymology of computer is not typically given because the entry for this word follows that of the verb compute, for which the derivation normally given is computare, a word that can be broken down into "com," the Latin for "together" and "putare" (or "pute" or "puto"), the Latin for "to reckon" or "to settle." 5 Although this is generally correct, the Latin word "computus" (sometimes "compotus" or "compotos") may well have been the one giving rise to the word computer as used in the modern sense, because this word was in widespread use in Europe throughout the Middle Ages. Indeed, the word computus may have been used for the first time in the third century AD, initially taking the same varied meaning as computatio, for example referring to arithmetic or economic "estimation." However, not until a century later did it begin to denote something different, and then gained wide currency. The first specific meaning was coined by a Sicily-based writer who used it to denote "the astrological interpretation of computed and observed planetary orbits," 6 a practice prevalent among pagans at that time.

The probable reason why computus acquired widespread use has to do with ecclesiastical history, that relating to Easter. When the Nicene council, convened by Constantine in AD 325, laid down the rules (actually just adopted an already established method) for determining the date of Easter, it certainly did not anticipate the confusion that would ensue for centuries to follow. Had this council decided on a fixed date for this feast or set the feast on, for example, the first or last Sunday of a particular month, it would have simplified matters considerably, and the history of the Roman Catholic Church in connection with the ecclesiastical calendar would probably have taken a different turn. However, the general consensus among Christians was that Easter should be celebrated on a Sunday and, importantly, on the Sunday after the feast of the Jewish Passover. 7 Passover is based on the lunar cycle; consequently, the date of Easter was inextricably linked with the moon. To calculate this date therefore required almost the impossible: an accurate determination in advance of the movements of the sun, earth, and moon.

By opting for a movable date, little did the Nicene council realize the task it had set. Anyone with a basic knowledge of calendrics will tell you that determining the date of Easter is no trivial matter. 8 In an era when the length of the tropical year (the year defined by the seasons) was not known with certainty, the mathematics for manipulating fractions was still in its infancy, and calculating aids (other than the abacus) and accurate astronomical observing instruments (such as the astrolabe) were practically nonexistent, tackling the problem was indeed a grand challenge. Thus, although not entirely beyond the grasp of the Alexandrian astronomers, the problem was difficult because the tools to solve it were not forthcoming, and the methods used not sufficiently refined. The Easter date could only be approximately determined and hence not always corroborated. At times, the different churches would come up with different dates for Easter Sunday, with Easter being celebrated on different Sundays of the same year. 9 For centuries, this problem continued to trouble the minds of scholars as they tried, but failed, to determine the correct Easter date. 10
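
The arithmetical flavor of the problem is easy to convey in modern terms. The short Python sketch below implements the anonymous Gregorian-calendar Easter rule reprinted in many astronomy texts (see also reference 8); it is offered purely as a present-day illustration, since nothing like it - least of all the Gregorian calendar itself - was available to the early computists. Every step is whole-number arithmetic on the 19-year lunar cycle, the leap-year corrections, and the day of the week.

# A modern illustration, not a historical method: the anonymous
# Gregorian Easter algorithm reprinted in many astronomy texts.
def easter_date(year):
    """Return (month, day) of Easter Sunday in the Gregorian calendar."""
    a = year % 19                          # position in the 19-year lunar cycle
    b, c = divmod(year, 100)               # century and year within the century
    d, e = divmod(b, 4)                    # century leap-year corrections
    f = (b + 8) // 25
    g = (b - f + 1) // 3                   # lunar (Metonic) correction
    h = (19 * a + b - d - g + 15) % 30     # approximate age of the moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7   # days to the following Sunday
    m = (a + 11 * h + 22 * l) // 451       # special-case adjustment
    month, day = divmod(h + l - 7 * m + 114, 31)
    return month, day + 1

print(easter_date(2004))                   # (4, 11): 11 April 2004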

One of the first persons to have possessed a thorough knowledge of time and the calendar was the Scythian monk Dionysius Exiguus who, in 525, was instructed by Pope John I to determine the date of Easter for the following year. This abbot not only calculated the Easter date for one year, but went on to draw up a table of future Easter dates covering a period of some 95 years. Traditionally, the computation of Easter was the realm of the Alexandrian fathers who had treated this practice with some secrecy, as though it belonged only to a gifted few. The Roman Catholic Church, probably not wanting to depend entirely on the Eastern Orthodox Church regarding this matter, and aware of Dionysius' competence, therefore sought to go its way by seeking this bright fellow's services.

Apart from the mathematical sciences, Dionysius was well versed in languages, and the translations of the Greek writings into Latin that he undertook contain terms such as "sancta pasche compotum" [Holy Easter Reckoning]. Dionysius gave us our Anno Domini (AD) system of dating. Cassiodorus, Dionysius's friend, was the first to officially use it in his "Computus Paschalis," a textbook describing the basic skills of time reckoning. From then on, and for centuries to follow, computus essentially meant "the calculation of Easter." 11

"'Compotiste' and 'abaciste' of late medieval times"

Both computus and computare continued to be used frequently in writings, but variations of them began to appear as well. For example, in the 7th and 8th centuries, several nonscientific attempts were made to calculate - computare - the age of the world. 12 Those involved in such time reckoning were given the title of "computator," a word changed by the Venerable Bede of Jarrow to "calculator." The first chapter of Bede's textbook of 725, "De temporum ratione," was headed "De computo vel loquela digitorum," so that we have here the word "digitorum" used to denote one of the 10 digits we use when counting on our fingers. 13

The meaning of computare as "to recount" was suggested by Bishop Gregory of Tours as well as by Bede, from the custom that "uneducated" people, when asked to give information in terms of numbers would, instead, recount stories. 14 The use of the word in this sense has been traced to a Regensburg scholar who, toward the end of the 9th century, wrote a story and used the phrase "instanti anno, quo ista computamus" ("in the current year in which we are counting this"). 15

With the advent of the astrolabe in the 10th century, the abacus may have become even more popular because the use of the astrolabe required calculations for which the abacus, if not the ideal reckoning tool, was certainly handy. The astrolabe was one of the first elaborate and accurate instruments for sighting the position of celestial objects, and for this reason may be considered among the earliest analog devices. Used by the Greeks and later developed and refined by Arab astronomers, it also served as a timepiece and a navigating aid. The late science historian Colin Ronan describes it as "a graphical computer"; 16 Chaucer wrote an entire treatise on it, one of the earliest technical works on science written in the English language in an age when Latin was the common medium of instruction. As for the abacus, this has given us the term "abaciste," which, together with "calculatores," referred to those who habitually used the abacus for computations. In later times, the word "abaciste" was reserved for those who used a particular method of working out multiplication. 17 Interestingly, abacus as a word was unknown in Chaucer's time in England, when the word "augrim" (or "augrym") was used instead. 18

The word augrim descends from "algorism," itself a corruption of the Latinized version of the name of the 9th-century Muslim mathematician al-Khwarizmi. 19 In fact, "algorism" can be traced to the mathematician Alexander of Villedieu who, at the turn of the 13th century, wrote a book entitled "Algorism," explaining how the new Arabic methods dealing with decimal numbers worked. This was a period when the Hindu-Arabic methods of manipulating fractions were being studied in Europe and when the realization was growing that the bulky Roman numeral system had severe limitations. The word appeared again some 50 years later in an encyclopedia by the great French scholar and encyclopedist Vincent of Beauvais, in a section titled "De computo et algorismo." The modern word "algorithm" is, of course, a direct derivative.

In the second half of the 13th century, Roger Bacon wrote his famous work titled "Compotus," a treatise on the science of time. Bacon noted that Easter was already out of step with the moon by three or four days and later made an appeal to Pope Clement IV for a reform of the calendar. (The Julian calendar was actually already out of step with the seasons by about nine days.) He also noted that to achieve better results regarding the calendar, one could no longer work with whole numbers as the earlier "compotiste" did. He insisted that Christians should not look ignorant before Muslims, who had one of the most accurate lunar calendars. 20 By this time, the word "compotista" was no longer reserved for time reckoners alone; indeed, some music theorists began to call themselves "compotiste."

In this same period, the word "conto" in Italian still meant astronomical time-reckoning as did, more or less, the word "conteour." 21 However, Dante Alighieri wrote a collection of love poems in which conto was used in a different context. It suggested the relationship between two lovers - not physically, but in terms of monetary accounting, how lovers reckon and balance income and expenditure. It subsequently found its way into neighboring countries, as "compte" in French and "Konto" in German. The papal chancellery helped complete the change to Latin when it created the office of the "taxator" or "computator" so that papal bulls could be charged and registered. 22 In English, the word "computist" was also used in the 16th and 17th centuries to refer to a keeper of accounts (that is, an accountant). 23

During this period, double-entry bookkeeping was already being practiced. 24 The Italian Franciscan Luca Pacioli made the double-entry technique widely known through his "Summa de arithmetica, geometria, proportioni et proportionalita," published in 1494, in which he devoted a section to bookkeeping titled "De computis et scripturis." This particular chapter was subsequently published separately in five languages and plagiarized widely. 25

"First use: 'Compute, Computation, and Computers'"

With a practical solution for bringing the length of the calendar year close to the true length of the tropical year, manifesting itself in the reform of the calendar in 1582, the era of the computus virtually came to an end. Although the word compotista was still being used to refer to time reckoners, a new twist to this word was taking shape.

Chaucer is renowned for his influence on English literature and has often been called "the father of English poetry." Many new English words were coined in his time, and the modern English word compute is probably no exception. It goes back to Chaucer's day, when the French word "compte" was used in an English text to denote the measurement of short time intervals. The word compute, which is a Latinized version of compte, may have been used for the first time by a follower of Chaucer in the early 15th century. 26 Within two centuries, it had gained wide popularity and appeared in famous works including Milton's "Paradise Lost." 27 Like Chaucer, Milton was well versed in astronomy and astrology, and his use of the word is therefore hardly surprising. The Benedictine monk John Lydgate, a contemporary poet of Chaucer's who Latinized the word compte, also introduced "computacion" circa 1420.

"Computacion," which was also spelled the modern way ("computation") from the beginning, appeared frequently in 17th-century texts that involved dates, because it often had to be made clear if a specified date referred to the old or new calendar system (for example, .".. according to the new 'Computation'"). The earliest reference to the word computer in the English literature is probably by the physician Thomas Browne in 1646. In his "Pseudodoxia Epidemica," he used its plural form to refer to the persons involved in time calculations, so that the word computers was in fact used instead of the then more popular Latin word compotiste. 28

The word computer made another appearance some half a century later when it was used by the satirist Jonathan Swift. In his "A Tale of a Tub," published in 1704, Swift devotes a section to attacking those scholars who spend hours on end reading countless abstracts only to produce further duplicated works when, according to information provided to him by a "very skillful 'Computer,'" all the ideas could easily be made to fit into one volume. 29 This "skillful Computer" was an informed person who had arrived at this conclusion by applying the "rules of 'Arithmetick.'" In part III of "Gulliver's Travels," Swift refers to another "computer," with the aid of which anyone would be able to master all the arts and sciences. This must be one of the earliest instances in which the word was used - by the same author and within a short space of time - to refer both to a machine and to a person.

"Era of logarithms and the early 'computists'"

In the 17th century, primitive calculating machines began to be produced. The slide rule's earliest ancestor is generally attributed to the Englishman Edmund Gunter who, circa 1620, pioneered an analog device that came to be known as the "logarithmic line." Soon afterward, William Oughtred simplified things further by taking two Gunter "lines" and sliding them relative to each other (instead of measuring distances on one scale with a pair of dividers). Before long, other people refined Oughtred's design and produced a variety of instruments. They were well received by the scientific community of the time, particularly by engineers and navigators, who may later have preferred them to the bulkier digital mechanical calculating machines.

These instruments were the outgrowth of logarithms, which had just been invented by the Scotsman John Napier and the Swiss watchmaker Joost Bürgi. Napier had been working on logarithms since the 1590s, but his "Mirifici logarithmorum canonis descriptio" was not published until 1614. Napier was also one of the first persons to attempt "mechanizing" mathematical calculations when he built his instrument now referred to as Napier's bones, or Napier's rods. The German scholar Wilhelm Schickhardt designed and built his ingenious calculating device at roughly the same period (circa 1624). Although meant to aid in the astronomical calculations of his friend Johannes Kepler, this "arithmeticum organum" was apparently poorly suited to the purpose because it could not add up numbers of more than six figures.

The mathematician and historian Lancelot Hogben, in his classic book "Mathematics for the Million," mentions two needs that may have contributed to the development of logarithms (and ultimately to the design and manufacture of calculating machines). The first has to do with the preparation of trigonometric tables for use in navigation. The second is related to accounting, namely the lengthy calculations required when reckoning compound interest upon investments. 30
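
The second of these needs is easy to illustrate: logarithms turn products and powers into sums, so a compound-interest reckoning collapses into one multiplication and two table lookups. Below is a minimal Python sketch of the idea, with math.log10 and its inverse standing in for the printed log and antilog tables.

import math

# Table-maker's evaluation of compound interest A = P * (1 + r)**n:
# log A = log P + n * log(1 + r), so the 20th power costs only
# one multiplication plus two table lookups.
P, r, n = 100.0, 0.05, 20                  # principal, annual rate, years
log_A = math.log10(P) + n * math.log10(1 + r)
A = 10 ** log_A                            # the "antilogarithm" lookup
print(round(A, 2))                         # 265.33, matching P * (1 + r)**n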

A major navigational problem and obstacle at the turn of the 17th century was the determination of exact geographical longitude at sea. No practical and accurate method had yet been found. Positions were often grossly miscalculated, with the result that ships sometimes got lost or were wrecked. Following a number of notable fatal accidents, various governments eventually decided to offer prize money to anyone who came up with an accurate, practical, and reliable method of solving the longitude problem. 31

In the absence of an accurate timepiece - which, in the form of the marine chronometer, took John Harrison about 30 years to perfect - navigating at sea consisted primarily of making astronomical observations and consulting the trigonometric tables. The compilation of these (and various other) tables required much effort on the part of the people working to produce them. They involved lengthy mathematical calculations, so that the "computing" persons who carried out these manual calculations - including those in business concerns - were given various titles, including computists, calculators, and computers. 32

For a long time, the calculators' main calculating aids remained the logarithms (and, for less exact work, the slide rules), for although a number of machines had been conceived and built, none was practical or reliable. The general public showed little interest in these "Rechnungs-Maschinen" or "Rechenmaschinen" [Calculating Machines], and it was not until the 1820s that Leibniz-type machines, which were among the best then available, began to be manufactured in large quantities.

"Human computer era"

When Blaise Pascal built his "machine d'arithmetique" in 1642, it was meant to relieve the "calculateur" of the task of using the "jetons," or counting beads. Some 800 years before, Bede had already used the Latin word calculator in connection with time reckoners, but Pascal's reference to calculateur applied particularly to those involved in tax calculations. As a result of the French usage, the English equivalent may have been used in this sense from that time onward.

It is not possible to say which of the two words, calculator or computer, was the more frequently used from about the middle of the 17th to the end of the 19th century, because references to both can be found in a number of English texts. It is certain, however, that the word computer to refer to a human had become popular by the beginning of the 20th century and retained this meaning for a few decades thereafter. By this time, it was a popular title given to those who were properly trained in performing complex "programmed" computations. With few exceptions, it was not used to refer to a machine. 33

Although in Pascal's time some persons were already given the title of computist or calculator, the era of the human computers may be considered to have truly begun when groups of people began to collaborate on lengthy computational problems. By the late 1700s, it was becoming apparent that the numerical solutions to some of these mathematical problems could not possibly be completed by one person in a realistic time frame, although the task could be achieved if the problem was appropriately prepared, broken down, and given to several people to work on.

One of the earliest small groups to work in this manner included three astronomers who, in the summer of 1758, worked for nearly five months on the orbit of Halley's comet to predict its perihelion passage. 34 Although their prediction was out by about one month, they had demonstrated that, given enough human resources, large-scale problems were not insurmountable.
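
In modern terms, the working pattern was straightforward data parallelism: a foreman split the argument range into shares, each computer filled in a sheet, and the finished sheets were merged in order. The Python sketch below is a hypothetical illustration of such a table-making office; the tabulated function, the share sizes, and the number of workers are all invented for the example, not drawn from the historical record.

import math
from concurrent.futures import ProcessPoolExecutor

# One computer's share of the work: fill in a sheet of the table.
def tabulate(chunk):
    return [(x, round(math.log10(x), 7)) for x in chunk]

# The foreman's job: divide the full argument range into consecutive shares.
def split(seq, workers):
    n = len(seq)
    return [seq[i * n // workers:(i + 1) * n // workers]
            for i in range(workers)]

if __name__ == "__main__":
    args = [1 + i / 1000 for i in range(9000)]        # arguments 1.000 .. 9.999
    with ProcessPoolExecutor(max_workers=4) as pool:  # four "computers"
        sheets = list(pool.map(tabulate, split(args, 4)))
    table = [row for sheet in sheets for row in sheet]  # merge sheets in order
    print(len(table), table[0], table[-1])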

With the invention of the calculus, new analytical methods, and many other mathematical and scientific discoveries, the mathematical problems naturally became more complicated and their numerical solutions correspondingly tedious. The advances in the field of celestial mechanics, for example, led to new models for the computation of orbits that required ever-greater human computing power. Observatories began to employ personnel whose sole task was purely numerical calculation. Analytical work could not be complete without the numerical solutions.

The same was happening in the business and accounting profession where the amount of arithmetic had increased vastly, although the need for huge computations still remained in the scientific and engineering fields.

To get some idea of the scale of these computations, it is worth citing a few examples. Around 1835, the Hungarian mathematician and physicist Josef Petzval, who had come up with a set of complicated formulas for the design of a fast and precise photographic lens, often complained that he lacked the financial means to engage the many human calculators needed to carry out the complicated task of producing the mathematical tables required for constructing the actual lens. Eventually, he was lent 10 computing bombardiers from the military, who apparently worked for several years on this job. The lens was produced according to the specifications in 1840 and reduced the exposure time needed for taking a photograph from minutes to a few seconds. 35

Earlier still, in the 1790s, when the French government authorized the preparation of new tables of logarithms and trigonometric functions, it took nearly 100 human computers two years to complete the job.

An example from more recent times, which perhaps more than anything else captures the nature of the job, is that of a Harvard astronomer who, in 1879, wrote a parody titled "Observatory Pinafore" describing the work of these human computers:

"We work from morn till night,
For computing is our duty;
We're faithful and polite,
And our record book's a beauty;
With Crelle and Gauss, Chauvenet and Peirce,
We labor hard all day;
We add, subtract, multiply, and divide,
And we never have time to play." 36

These examples not only illustrate the scale of the computational effort involved, they also show that employing human computers to work in unison on specific problems had become a well-established, if perhaps not an entirely refined, practice. The divide-and-conquer strategy and the division-of-labor principle had shown their worth. Moreover, as a result, the words calculator and computer became firmly established. For a good two centuries their meanings remained synonymous, referring only to human beings.

"New definitions"

In the first few decades of the past century, the demand for human computers continued to rise. During World War I, for example, many computers were employed on both sides of the war to perform tasks related to ballistics, surveying, navigation, and cartography. Also, because most of the men went to war, this period marked an increase in women computers. The hiring of women to do computing jobs was not entirely new, dating back to at least Babbage's time, when women helped astronomers with their calculations, 37 but it had been done on a relatively large scale only since the 1880s, a few decades earlier. Before this time, a handful of women also worked for the Nautical Almanac and Survey offices as computers, and a few more did such work gratis for friends. 38 These women calculators continued to be engaged in this type of work up to World War II, by which time their role, as well as that of men, was beginning to change.

Three significant things happened in the years leading up to the digital computer that began to differentiate the words computer and calculator and subsequently changed their meaning completely. First, a number of adding machines were becoming popular. The Comptometer and the Millionaire, to name two examples, were used extensively, as may be gathered from the literature (including newspaper advertisements). 39 Second, the art and science of human computation was being professionalized. Not only had the computers become more disciplined and better organized in their methods, they also began promoting the field by forming groups, holding meetings, and publishing notes and journals. 40 In part, this may have been due to the introduction of more powerful calculating aids that eased the employees' workload and let them concentrate on more important matters, such as the techniques they used. Third, the progress in electronics, combined with that in the theoretical field of computer science - which led to the introduction of portable "scientific" calculators and digital computers - ultimately changed the role of those employed in the field and created new titles, for both machine and person, that were to stick.

From an etymological point of view, probably the most noteworthy period was that circa 1930 - 1960, when the words computer and calculator were beginning to take on a new sense but had not yet lost their old meaning. Most calculating devices were still generally referred to as calculating machines, but when the first portable electromechanical and electronic machines, such as the once-ubiquitous Friden, began to be produced in the 1950s and 1960s, they were called calculators. Although it was not the first time that calculator was used to refer to a machine, it appears that this meaning was now becoming prevalent as a result of the widespread use of these new computing machines. Thus, calculator now had two specific meanings; indeed, dictionaries of this period describe it as such.

Roughly the same thing was happening to its twin word computer. As early as the 1920s, the term "computing machine" had been used for any machine that did the work of a human computer, that is, that calculated in accordance with effective methods. With the onset of the first digital computers, this term gradually gave way to computer.

At least one document has been traced that recommended a new definition for both calculator and computer. In February 1945, in a report issued for the National Defense Research Committee on "Relay Computers," George Stibitz of Bell Telephone Laboratories suggested that calculator or calculating machine should henceforth mean a device capable of performing the four basic arithmetical operations. Computer, however, would refer to a machine capable of carrying out automatically a succession of operations of this kind and of storing the necessary intermediate results. Additionally, Stibitz suggested that human agents be called operators to distinguish them from computers, the machines. 41
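
Stibitz's distinction translates readily into code. In the toy Python sketch below - the representation is ours, not Stibitz's - the calculator performs a single basic operation per request at the operator's direction, whereas the computer carries out a stored succession of such operations, holding intermediate results in named registers:

from operator import add, sub, mul, truediv

OPS = {"+": add, "-": sub, "*": mul, "/": truediv}

# Stibitz's "calculator": one basic operation per request, at the
# operator's direction; nothing is remembered between requests.
def calculate(op, x, y):
    return OPS[op](x, y)

# Stibitz's "computer": a stored succession of such operations, with
# intermediate results held in named registers for later steps.
def compute(program, registers):
    for op, dest, a, b in program:
        registers[dest] = OPS[op](registers[a], registers[b])
    return registers

regs = compute([("+", "t", "x", "y"),      # t = x + y  (intermediate result)
                ("*", "z", "t", "w")],     # z = t * w
               {"x": 2, "y": 3, "w": 4})
print(calculate("*", 6, 7), regs["z"])     # 42 20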

Stibitz's new definitions stuck, but they neither immediately replaced the old ones, which continued to appear in a number of works, nor won universal support. The definitions caused a stir in some circles; groups and individuals, IBM and John von Neumann included, actively resisted them. The US National Bureau of Standards dropped its human computer division and stopped using the word to describe people in 1964. Dictionaries of the 1970s and 1980s still gave the old meanings in addition to the new ones. The last generation of human computers retired in about 1970, which probably explains the change in the dictionary entries. Interestingly, some notable dictionaries published in the 1960s even omit an entry for computer, although calculator is included with an explanation of both meanings. 42

As for those nations - other than Britain and the US - that played a key role in the invention of the calculating machine and the digital computer, it would take even longer for them to embrace the English word computer as part of their vocabulary. In Germany, the computer continued to be called "Rechner"; 43 in France, it was referred to initially as "calculateur" and later as "ordinateur"; and in Italy it was called "calcolatore," a word once reserved for the human computer.

Today a number of European languages have adopted computer, or a variant, and in so doing made it universal. Yet, outside academia, few are aware that this generic term has been used for animate as well as inanimate objects. Considering what modern computers are now capable of doing, the word that describes them has, paradoxically, almost become a misnomer.

- "The Calculation of Easter Day, and the Origin and Use of the Word ‘Computer' ," Mario Aloisio, University of Malta, as published in the July-September 2004 issue of the IEEE Annals of the History of Computing, IEEE Computer Society.

Accessed September 10, 2004

Mario Aloisio began his computing career with ICI, at the company's research laboratories in Runcorn, Cheshire, in 1979. He has held various posts, designing and developing software for both the local and overseas markets. Major projects he worked on include the computerization of the Malta government lotto system and a real-time SCADA system for the Waha Oil Company (of Libya). In 1997, he joined the University of Malta as a full-time assistant lecturer in computing. A physics and computing graduate, Aloisio is a founding member of the local astronomical society and the author of one astronomy-related book. Readers may contact him at mario.aloisio@um.edu.mt.

References and notes

1. A. Borst, "The Ordering of Time" (translated from the German by A. Winnard), Polity Press, 1993.

2. Ibid., pp. 34.

3. The abacus is still widely used in many parts of the world. Its absence from Western Europe between about AD 500 and 1000 is evidence of civilization's nadir there. In the late 11th and 12th centuries, many treatises on elementary calculation were written on the use of the counting board, and a new verb was introduced, "to abacus," meaning to compute. A. W. Crosby, "The Measure of Reality," Cambridge Univ. Press, 1997, p. 44.

4. The book by G. Flegg, "Numbers through the Ages," (Macmillan, 1989), contains an interesting section on the use of the abacus in Roman times. It also describes the abaci used by the Chinese, Japanese, and the Russians.

5. See, for example, L. Brown, ed., "The New Shorter Oxford English Dictionary" (2 vols.), Clarendon Press, 1993.

6. A. Borst, "Ordering of Time," p. 20.

7. The Easter practices of the early church varied. In some communities, especially those of Jewish Christians, Easter was celebrated with a Paschal meal on the evening between the 14th and 15th day of the Jewish month Nisan. Elsewhere the feast became detached from the Hebrew luni-solar calendar and celebrated on different dates. Because the gospels recount that the resurrection took place on the Sunday following the first day of unleavened bread, the practice spread from Rome of celebrating Easter on the Sunday after 14 Nisan.

8. Some books on astronomy and others on calendars give methods of calculating the Easter date. A famous method is from Gauss who, circa 1800, derived an elegant formula using just a few variables and conditions. To know how to compute the Easter date, refer to the article on the church lunar calendar in "The Catholic Encyclopedia," http://www.newadvent.org/cathen. See also H. V. Smith, J. Br. Astron. Assoc., vol. 87, no. 4, 1977, p. 417; E. G. Richards, "Mapping Time," Oxford Univ. Press, 1998, pp. 354-378; and M. Aloisio, "The Nature of Calendars," Publishers Enterprises Group, 2003, pp. 143-156; 208-213.

9. In the 5th century, Pope Leo was well aware of this problem and discussed it frequently with Bishop Paschasimus of Lilybaeum. The year AD 445 was one such year when the Easter computations, worked out according to the Eastern and Western practices, differed by one week. See S. C. McCluskey, "Astronomies and Cultures in Early Medieval Europe," Cambridge Univ. Press, 1998, pp. 85-86.

10. Some 700 years later Hermann the Lame, a Benedictine monk at Reichenau, wrote: "whence comes the error that the real age of the moon so often does not correspond to our reckoning, 'compotus,' or the rules of the ancients, and why, as Bede himself admits and our own eyes confirm, does a full moon appear in the sky in most cases one day, and in others two days, before the computed date?"

11. "Computus" as a noun was also used in medieval times to refer to the set of tables and rules necessary for the same purpose (that is, for calculating astronomical events and movable dates in the calendar). See, for example, S. C. McCluskey, "Astronomies and Cultures," for an account of Computus; and M. R. Williams, "Building a World-Class Book Collection: The Tomash Library," "IEEE Annals of the History of Computing," vol. 23, no. 4, Oct. - Dec. 2001, p. 42, for an example of an extract from an actual 1488 "Computus."

12. In those times, nobody ever dared to put the time of the Creation at more than a few thousand years. In the 13th century, the Franciscan monk and scholar Roger Bacon calculated that a person walking 20 miles a day would take 14 years, 7 months, 29 days, and a fraction to reach the moon. For some of the West's best-informed scholars, the extent of the universe could still be described in terms of walking. See A. W. Crosby, "Measure of Reality," p. 23.

13. A. Borst, "Ordering of Time," p. 36.

14. Ibid., p. 46.

15. Ibid., pp. 46-47.

16. C. A. Ronan, "The Cambridge Illustrated History of the World's Science," Cambridge Univ. Press, 1983, p. 207.

17. Until the 14th century, various arithmetical calculation methods were in use, including the Arabic method of dust numbers; finger-counting, as described by Bede; and the counting-board or abacus method. Practitioners of these methods were called algorists in honor of the 9th-century mathematician al-Khwarizmi. Practitioners of a slightly different method, in which all the workings of a calculation were preserved, were called abacists. The choice of the latter word is unfortunate because no abacus was actually used. I. Grattan-Guinness, "The Fontana History of Mathematical Sciences," Fontana Press, 1988.

18. What is now called a line abacus appears in Chaucer's "The Miller's Tale" as the "augrym stones." D. Brewer, "Chaucer and His World," Eyre Methuen, 1978, pp. 61-62. See also the entry under "augrim" in H. Kurath, ed., "Middle English Dictionary," Univ. of Michigan Press, 1956. In this edition, an augrim stone is described as "a stone or counter inscribed with an Arabic numeral and used in computing [often upon an abacus]."

19. The full name is usually given as Abu Jafar Muhammed Ibn Musa al-Khwarizmi. The word algebra also comes from one of al-Khwarizmi's books, "Kitab al-jabr wa al-muqabalah" [Calculating by Restoration and Reduction].

20. In 1267, he wrote: "The calendar is intolerable to all wisdom, the horror of all astronomy, and a laughing-stock from a mathematician's point of view."

21. A. Borst, "Ordering of Time," p. 85.

22. Ibid., p. 86.

23. See the "computist" entry in J. A. Simpson and E. S. C. Weiner, eds., "The Oxford English Dictionary" (20 vols.), Clarendon Press, 1989.

24. For an entire chapter on the history of bookkeeping, see A. W. Crosby, "Measure of Reality."

25. Ibid., pp. 215-216.

26. A. Borst, "Ordering of Time," p. 99.

27. For example, in Book VIII, verse 16. See, for example, J. Carey and A. Fowler, eds., "The Poems of John Milton," Longmans, 1968, p. 814.

28. See G. Keynes, ed., "The Works of Sir Thomas Browne," Univ. of Chicago Press, and Faber, vol. II, book VI, 1964, p. 419.

29. See, for example, J. Hayward, ed., "Swift: Gulliver's Travels and Selected Writings in Prose and Verse," Nonesuch Press, 1942, pp. 324-325.

30. L. Hogben, "Mathematics for the Million," Pan Books, 1967, p. 402. Hogben dedicates a long section to explaining how the theory of logarithms developed.

31. Among these were the British Parliament, offering a reward of £20,000, and the States-General of Holland who offered 25,000 florins. The Board of Longitude in England was created in 1714 specifically to judge claims to the British prize. It was disbanded in 1828 after the prize money had been settled and when its service was no longer deemed necessary. See D. Sobel, "Longitude," Fourth Estate, 1995.

32. Often, these human computers worked from home. For example, to prepare the "Connaissance des Temps" - the first volume of which came out in 1679 - the French astronomer Lalande employed a small group of people to do the computing work from their own homes. Later, the same method was adopted to produce the "Nautical Almanac." See M. Croarken, "Tabulating the Heavens: Computing the Nautical Almanac in 18th-Century England," "IEEE Annals of the History of Computing," vol. 25, no. 3, July - Sept. 2003, pp. 48-61.

33. One exception was Hollerith's punch machine, built specifically for the US 1890 census, which was called a statistical computer.

34. See, for example, D. A. Grier, "The Human Computer and the Birth of the Information Age," Joseph Henry Lecture of the Philosophical Soc. of Washington, 2001; F. L. Whipple, "The Mystery of Comets," Cambridge Univ. Press, 1986, pp. 42-43; and D. K. Yeomans, "Comets," John Wiley & Sons, 1991, pp. 126-130.

35. H. Zemanek, "Central European Prehistory of Computing," "A History of Computing in the Twentieth Century," N. Metropolis, J. Howlett, and G. Rota, eds., Academic Press, 1980.

36. D. A. Grier, "Nineteenth-Century Observatories and the Chorus of Computers," "IEEE Annals of the History of Computing," vol. 21, no. 1, Jan. - Mar. 1999, p. 45.

37. By the 1890s, it was common for observatories to employ women computers to classify stellar spectra. See, for example, P. E. Ceruzzi, "When Computers Were Human," "Annals of the History of Computing," vol. 13, no. 3, July - Sept. 1991, pp. 237-244.

38. See, for example, M. Croarken, "Mary Edwards: Computing for a Living in 18th-Century England," "IEEE Annals of the History of Computing," vol. 25, no. 4, Oct. - Dec. 2003, pp. 9-13.

39. P. E. Ceruzzi, "When Computers Were Human," pp. 242-243.

40. See, for example, D. A. Grier, "The Human Computer," and D. A. Grier, "The Rise and Fall of the Committee on Mathematical Tables and Other Aids to Computation," "IEEE Annals of the History of Computing," vol. 23, no. 2, Apr. - June 2001, p. 28.

41. P. E. Ceruzzi, "When Computers Were Human," p. 240.

42. Examples include the fifth edition of H. W. Fowler and F. G. Fowler, "The Concise Oxford Dictionary of Current English," 1964; and the 1965 edition of J. B. Foreman, ed., "The New National Dictionary," Collins, 1965.

43. This is the most popular word used in Germany for the modern computer (apart, of course, from the English word itself, which has now become common usage). Other (older) terms include "Elektronenrechner" and "Rechenautomat." "Rechenmaschine" and "Taschenrechner" are now reserved for the calculator, but "elektronische Rechenmaschine" also refers to a computer.