
Math on the Computer

Joyce Bryant

Contents of Curriculum Unit 81.06.01:



This unit is about Math on the Computer, and it is intended for use with middle school students. However, the material may also be appropriate for students in elementary or high school.

Within the past decade, virtually all disciplines have been affected by computers. High-speed computational devices have been a primary factor in the rapidly changing techniques used in many areas, mathematics among them. In the teaching of mathematics there is a tendency to stick to the book, which presents a problem: the number of school days and the number of pages in the average textbook simply do not balance out. There is very little room, if any, for the student to be creative and explore new methods and techniques for accomplishing desired results in mathematics.

One of the ways that students can be creative and explore new ways of doing mathematics is to make use of the computer. Educators at all levels, from the elementary school to the private business school, to the community college, to the university, must prepare students to meet the challenge that the computer offers in this ever-changing and complex society of ours.

We as educators can begin by introducing the students to this unusually fast and versatile data-processing machine called the computer.

First, the unit introduces the students to the history of the computer and its advancements. The introduction to computers provides a definition of the computer, as well as an introduction to the basic components of a typical data-processing system. Discussion of the newer and more sophisticated input-output devices, terminal devices, and the concept of storage or memory units has been added. The unit also exposes the student to the two most commonly used programming languages. This exposure familiarizes the student with the fundamentals of the languages and their usage. The student is then prepared to understand computer programs, use the languages to write simple programs in mathematics, and do computation on the computer. The terms and their definitions will provide the student with a working vocabulary.


Ever since man started using mathematics he has been inventing devices and discovering ways to aid in handling numbers. The computer is one of those devices invented for use with arithmetic. The computer got its beginnings as far back as 1600. The history of the computer prior to that time adds little to our total knowledge of the electronic computer.

One of the earliest devices that aided computation was the abacus. This primitive device is a predecessor of the modern computer. It consisted of wires supporting a number of beads which in turn were free to slide along the wires. By manipulating the beads an operator could add, subtract, multiply and divide. Familiar devices such as adding machines, calculators, and cash registers were invented more recently.

Blaise Pascal completed the first successful calculator in 1642, and authentic copies of the machine still exist today. The mechanization of the operation of carry was introduced on this machine. The machine consisted of a series of wheels, or dials, numbered from zero to nine and arranged to be read from left to right. The machine directly added and subtracted; multiplication and division were accomplished by repeated additions and subtractions.

Leibniz constructed a calculating machine similar to Pascal’s unit but with additional gears which enabled the device to multiply directly. Leibniz’s machine was not completely automatic, but was preferable to the tedious and error-prone procedures required to do arithmetic by hand. The machine was based on all the principles used in the design of later calculators.

With the progression of science and business techniques, arithmetic became more complex and there was a need for the mechanization of mathematical operations. Improvements in the design and construction of the calculator were made, and the number and variety of operations performed by these devices were increased. The machines also became more readily available. Electric motors and numerous operational aids were introduced in order to increase the speed and reliability of these devices. (see ref. 1)

Babbage changed the entire process by designing a machine that would perform all the necessary operations in a predetermined sequence. The machine that Babbage designed utilized cardboard cards with holes punched in them to introduce instructions and the necessary data. The machine was to perform all the dictated instructions automatically and was to have continued the sequence until all instructions were completed. Unfortunately, Babbage was severely limited by the technology of his time and his machine was never completed. He did succeed in establishing the basic principles upon which modern computers are constructed. Some computer engineers have expressed amazement at how closely Babbage’s machine resembled the computers that have been built in recent years.

George Scheutz constructed a machine based on Babbage’s principles in 1854. The machine was copied and used by the British government. Several years ago IBM constructed an Analytical Engine according to Babbage’s original drawings. This device was capable of performing basic arithmetic operations. It also lent itself to some of the more recent programming techniques.

Herman Hollerith is credited with applying punchcard techniques in 1890. To process these coded cards he devised a tabulating machine. Hollerith also organized the Tabulating Machine Company which merged with other companies to form the International Business Machines Corporation (IBM) in 1924.

In 1937 Howard Aiken, with the aid of IBM, designed and constructed a computing machine. The machine was capable of performing arithmetic operations on data input using Hollerith punch cards. Aiken called his first machine MARK I and he went on to construct three more models culminating in his MARK IV. (see reference 4)

John Von Neumann’s contributions were considered extraordinary. They ranged from setting forth in detail the logical design of the computer to introducing the concept of instruction modification and working out the details of the computer’s electronic circuitry. Herman Goldstine, Arthur Burks, and John Von Neumann developed the revolutionary concepts of the stored program and the application of the binary number system. These concepts are employed in today’s most modern computers. Von Neumann, Burks, and Goldstine wrote a report entitled “Preliminary Discussion of the Logical Design of an Electronic Computing Instrument”. (see reference 4 & 5)

The first true electronic computer appeared in 1945. This device was the ENIAC (Electronic Numerical Integrator and Calculator). Built by Eckert and Mauchly, the ENIAC was used in producing tables, and its programming was done by means of switches and plug-in connections. In 1949 Cambridge University completed the EDSAC (Electronic Delay Storage Automatic Calculator). This was a stored-program computer, controlled by internally stored instructions.

There were no commercial computers until the Sperry Rand Corporation built UNIVAC I (Universal Automatic Computer). It was the first computer designed to handle business applications. In 1954 a UNIVAC was installed at General Electric’s Appliance Park in Louisville, Kentucky. After 1954 over three thousand computers were put into operation, and today there are over fifty thousand computers in use.

Probably the most exciting future use of the computer will be in the home, where all members of the family will have access to a computer terminal. The modern computer terminal consists of a keyboard and a television-like screen on which program results can be displayed. Such terminals are called CRTs; CRT stands for Cathode Ray Tube, the display element used in television sets. The home terminals will be connected to small self-contained microcomputers or, by telephone lines, to large central computers.

The computer is an extremely powerful device which is apt to have an important effect on the future of society. It can have beneficial effects, but complete dependence on it could lead to insurmountable problems.

Society must consider the future and the advantages and disadvantages of the computer. Computers are here to stay, and they are also an important part of our educational system. It is up to the student of today to meet the challenges that the computer offers and to make use of computers to provide a more interesting and rewarding life.

Introduction To Computers

The term “computer” means different things to different people. Computers are unusually fast and versatile machines for processing data. They accept both the data to be processed and the instructions for processing it. The machines perform simple arithmetic, such as addition, subtraction, multiplication, and division. They also perform simple logical operations, distinguishing zero from non-zero and plus from minus, and discharging the results. In short, a computer is a data processor that can perform substantial computation, including numerous arithmetic or logic operations, without intervention by a human operator during the run.

The types and number of computers in existence today seem almost endless. They vary in size, cost and type of application. All computers are related from a functional point of view. They may be divided into five areas of operation: input, output, memory, arithmetic-logic unit, and control.

The input devices read the necessary data into the machine. The most common devices are terminal keyboards, punched-paper-tape readers, and magnetic tape readers.

The output devices are used to record results obtained by the computer. Some of the most common output devices are high-speed printers, magnetic-tape machines, special electromechanical typewriters, terminals, and punched paper tapes.

By means of a keyboard, a terminal operator can type or key information directly into the computer and receive replies which are pictured on a CRT. As of 1977 the transmission speeds between the terminal and the computer were in excess of two thousand characters per second.

Regardless of the system used to process the data, certain fundamental operations must be performed. They are:

1. Recording

2. Classifying

3. Sorting

4. Calculating

5. Summarizing

6. Reporting

The memory section of the computer consists of devices used to store the information which will be used during the computations. There are many thousands of storage units within the memory of the computer. The section is used to store both intermediate and final results as the computer proceeds through the program. The storage area has the capacity to store large quantities of data, any item of which must be capable of being recalled from its location in storage and moved to a location elsewhere in the computer, such as to the arithmetic unit, in millionths of a second. The storage area is one of the most important units of the computer.

The stored program provides adaptable and automatic processing of data and unlimited application as well.

The arithmetic logic unit performs three basic functions: data transfer, arithmetic calculation, and decision making.

Data transfer involves the relocation of data within the computer. Arithmetic calculation consists of addition, subtraction, multiplication and division, as well as, logical operations. Decision making is the computer’s ability to compare two quantities or numbers to determine if one of the numbers is smaller than, equal to, or greater than the second of the numbers and respond by taking an appropriate action based on the result of the comparison.
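Using BASIC (the programming language introduced later in this unit), the decision-making function can be sketched as follows; the quantities chosen here are illustrative:

10 REM COMPARE TWO QUANTITIES AND ACT ON THE RESULT
20 LET A = 15
30 LET B = 9
40 IF A > B THEN 70
50 PRINT "A IS NOT GREATER THAN B"
60 GOTO 80
70 PRINT "A IS GREATER THAN B"
80 END

Line 40 compares the two quantities, and the result of the comparison determines which PRINT statement is executed.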

The control unit in the computing system is considered to be the “brain of the brain”. Control includes all of the circuitry necessary to perform all functions throughout the computing system. The control device performs the following functions:

1. Determines the instruction to be executed.

2. Determines the operation to be performed by the instructions.

3. Determines what data, if any, are needed and where they are stored.

4. Determines where any results are to be stored.

5. Determines where the next instruction is located.

6. Causes the instruction to be carried out or executed.

(see reference 4)

Computer Languages

A computer language is a series of numbers, letters of the alphabet, or special characters that are used to represent patterns which can be recognized by the computer and cause specific operations to take place.

FORTRAN and BASIC are the most commonly used languages. FORTRAN has become so common and heavily used that some variation of it is available for virtually every medium- to large-sized computer manufactured. A FORTRAN system includes a compiler and operates in two phases. The first phase produces a machine-language program to solve a particular problem. The second phase executes the machine-language program created in the first phase, processes the data, and produces the final results.

Learning the machine language for a computer is a fairly slow process, but by using higher-level languages such as BASIC and FORTRAN one can write simple, usable programs after a few hours of study.

FORTRAN and BASIC are powerful languages with ingenious translation programs. BASIC is a foundation for all programming because the basic concepts involved remain unchanged. It is an important tool for use with numerical problems.

Since numerical problems are encountered in all of science, including mathematics and engineering, as well as business, economics, statistics, psychology, and other fields, a problem stated in FORTRAN or BASIC can be modified for almost any modern computer.

BASIC stands for “Beginner’s All-Purpose Symbolic Instruction Code”. It is an easily learned and easily used computer language. The user can communicate a program via a console or terminal similar in appearance to a typewriter, with results being returned almost instantaneously.

Each BASIC statement must have a line number between 1 and 9999. Statements need not be typed in order, because the BASIC compiler automatically sequences them by line number. Gaps should be left when assigning line numbers so that additions can be made without having to reassign the line numbers. A common practice is to assign statements line numbers which are multiples of 10 (i.e., 10, 20, 30, 40, etc.). In this way the numbers are in sequence and still allow up to nine additional statements to be inserted between existing statements without violating the sequence.
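As a sketch of this numbering scheme (the statements and values here are illustrative), a forgotten statement can be typed last and given line number 15; the compiler will still place it between lines 10 and 20:

10 LET A = 5
20 LET C = A + B
30 PRINT "C = "C
40 END
15 LET B = 3

When the program is run, line 15 executes between lines 10 and 20, so the program prints C = 8.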

The computer is capable of dealing quickly and easily with arithmetic computations. As a result virtually every computer language has symbols which represent the fundamental operations of addition, subtraction, multiplication, division, and exponentiation. These five operations are available in BASIC and are represented by the following:

1. + addition

2. - subtraction

3. * multiplication

4. / division

5. ** exponentiation

An arithmetic expression is one or more variables or constants combined by operators to achieve a desired result or value. The operation of exponentiation refers to the raising of a number to a power, e.g., 3 ** 4.


1. A + B

2. A - B

3. A * B

4. A / B

5. 3 ** 5 (= 3 * 3 * 3 * 3 * 3 = 729)

All of the operations can also be combined in one expression. Ex. (A + B * C / D) ** 2
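As a sketch, with illustrative values, here is a short program built around one combined expression. Since multiplication and division are performed before addition, A + B * C / D is 2 + 9 = 11, and the expression squares that result:

10 LET A = 2
20 LET B = 3
30 LET C = 12
40 LET D = 4
50 LET E = (A + B * C / D) ** 2
60 PRINT "E = "E
70 END

When the program is run, it prints E = 121.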

A simple program in BASIC consists of an ordered set of instructions which generally involve:

1. Putting data into the computer.

2. Calculating something based on the data.

3. Getting the answer out of the computer.

4. Telling the computer to stop running the program.

(see reference 7 & 10)
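The four steps above can be sketched as one short BASIC program; the variable names and the doubling calculation are illustrative:

10 REM STEP 1: PUT DATA INTO THE COMPUTER
20 INPUT N
30 REM STEP 2: CALCULATE SOMETHING BASED ON THE DATA
40 LET D = 2 * N
50 REM STEP 3: GET THE ANSWER OUT OF THE COMPUTER
60 PRINT "TWICE THE NUMBER IS "D
70 REM STEP 4: TELL THE COMPUTER TO STOP
80 END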

In some computer centers the computer is already on. In others the user will have to log on and enter a password or number. That information will be given at the appointed time and place. When using the computer, the following may be done:

Logon (the user presses the RETURN key).

The computer asks for the user number.

The user types in his/her identification number and presses the RETURN key.

The user types in the type of program, e.g., BASIC.

The user also types in whether the program is NEW or OLD.

If the program is old, the user can type the word LIST and the entire program will be listed.

If the program is new, it can be typed in and saved by typing the word SAVE.

For more information about the computer invite someone to speak to the class and visit a computer center. (see resource list)

Computer Programming

Programming is the solving of problems using a computer. The programmer needs to know how to solve the problem independently of the computer, how to express the steps in the solution of the problem in terms acceptable to the computing process, and how to cause the computer to execute the instructions. There are three major areas to consider when writing a program: analyzing, coding, and operating.

The following should be considered when writing a program. A specific and exact statement of the problem should be written, because no slang terms will be acceptable. Examine the input data very closely and determine the lowest and highest values and the number of characters in each item. Also note the arrangement of the data upon input and, if deemed necessary, make a reference diagram. Design the arrangement of the output so that it will be easy to read. Understand what is to be accomplished and how many records will be produced.

Decide on a method for solving the problem, write a plan of action, and establish a connection between the input and the desired output. The plan could be a sentence or a paragraph, but a flow chart may be more convenient. A flow chart shows step by step what is to be done and the order in which it is to be done.

Before a programmer can begin to write a program, many areas must be considered and many questions have to be answered about the nature of the problem, its input, and its output. Some of these questions are: What are the desired results? What are the output formats and media?

Errors and unnecessary changes can be avoided if the programmer considers the following:

1. Problem analysis

2. Flowchart application

3. Coding and executing the application program

4. Documentation

After working with the computer for a while students will be able to write simple programs. Some activities that students can write and do are:


20 LET A = 89764

30 LET B = 43512

40 LET N = A + B

50 PRINT "A = "A, "B = "B, "A + B = "N

60 END

A = 89764  B = 43512  A + B = 133276

20 LET W = 4

30 LET Y = 424

40 LET Z = W * Y

50 PRINT "W = "W, "Y = "Y, "W * Y = "Z

60 END

W = 4  Y = 424  W * Y = 1696

20 LET A = 89764

30 LET B = 43512

40 LET Y = A - B

50 PRINT "A = "A, "B = "B, "A - B = "Y

60 END

A = 89764  B = 43512  A - B = 46252

20 LET Y = 422

30 LET X = 2

40 LET Z = Y/X

50 PRINT "Y = "Y, "X = "X, "Y/X = "Z

60 END

Y = 422  X = 2  Y/X = 211
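A fifth activity, sketched here with illustrative values, exercises the remaining operator, exponentiation:

20 LET X = 3
30 LET N = 5
40 LET Z = X ** N
50 PRINT "X = "X, "N = "N, "X ** N = "Z
60 END

X = 3  N = 5  X ** N = 243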

Terms and Definitions

Analog Computer—Determines quantitative results by incorporating physical processes (mostly voltage drops in simple circuits) which involve the desired arithmetic operations.
Arithmetic Operation—Any of the basic operations of arithmetic, for example, the operations of addition, subtraction, multiplication, and division.
Arithmetic Unit—The part of the computer system that contains the circuits that perform the arithmetic operations.
Artificial Intelligence—The capacity of the computer to perform functions normally associated with human intelligence, such as reasoning, learning, and self-improvement.
Artificial Language—A language based on a set of prescribed rules that are established prior to usage.
Base—It is a reference value. It is also a number that is multiplied by itself as many times as the exponent indicates.
Binary Element—Data that may take either of two values or states.
Bit—The smallest basic unit of information. The word bit was derived from the terms binary digit.
Code—A system of symbols for representing data. It provides a substitute name in the form of arbitrary characters, for the actual name or numbers.
Digital Computer—A machine that operates with discrete steps and symbols for information. The computation is done with symbols representing digits and with manipulation of these symbols by the rules of arithmetic.
Byte—A sequence of eight adjacent bits operated on as a unit.
Calculator—A data processor suitable for performing operations which require frequent intervention by a human operator.
Computer—A data processor which performs computations including arithmetic or logic operations, without human intervention during the run. It is capable of solving problems by accepting data and performing described operations on the data and supplying the results of these operations.
Data Processing System—A machine consisting of a network of components capable of accepting information, processing it in a prescribed way and producing the results.
Disk—A thin metal-like phonograph record coated with ferrous oxide to provide a recording surface.
Program—An ordered series of instructions or statements in a form that the computer can understand. It is prepared to achieve a certain result.
Programmer—A person who writes and tests programs for the computer.
FORTRAN—A language used to express computer programs by arithmetic formulas. (The original acronym was based on Formula Translation).
Automatic Computer—A computer that can perform a sequence of operations without human intervention.
Special Purpose Computer—A computer that is designed to take care of a restricted class of problems.
Storage Unit—A device into which data can be entered and held and retrieved for later use. This device is essential to the operations of the computer and it is sometimes called the memory of the computer. Stored in this unit are the necessary instructions to direct the computer in the solution of a given problem.
Stored-Program Computer—A computer controlled by internally stored instructions that can synthesize, store, and in some cases alter instructions as though they were data, and that can subsequently execute these instructions.
Some of the terms may be used for a project.


Teacher Bibliography and Reference

1. Bartee, Thomas C., Digital Computer Fundamentals, New York: McGraw-Hill Inc., 1972. This book presents the principles of modern digital computers. The uses of digital computers in business, industry, and science are described. The book also treats computer number systems.

2. Davis, Phillip., Computer Science and Applied Mathematics, New York: Academic Press, 1975.

3. Forsythe, George Elmer, Computer Methods for Mathematical Computations, Englewood Cliffs, New Jersey: Prentice Hall Inc., 1977. This book covers skills, concepts, methods, and techniques.

4. Fuori, William M., Introduction to the Computer, Englewood Cliffs, New Jersey: Prentice Hall Inc., 1977. This book provides a basic understanding of what the computer is, what it can do, and how the computer can serve professional needs.

5. Goldstine, Herman H., The Computer From Pascal to Von Neumann, Princeton, New Jersey: Princeton University Press, 1972.

6. Kuck, David J., The Structure of Computers and Computations, New York: John Wiley & Sons, 1978. This book covers the workings of computer systems and their organization. It includes the history of machines and how they arrived at their present position, and it projects what may be ahead, based on current knowledge. Mathematical computations are also included.

7. Bennett, William Ralph J., Scientific and Engineering Problem-Solving With the Computer, Englewood Cliffs, New Jersey: Prentice Hall Inc., 1976.


Student Reading List

1. Cook, Joseph, The Electronic Brain: How it Works, New York: G.P. Putnam & Sons, 1969.

2. De Rossi, Claude J., Computer Tools for Today, University of Chicago Press, 1972.

3. Dorn, William & Greenberg, J., Mathematics and Computing with Fortran Programming, New York: John Wiley & Sons, 1967.

4. Kroft, Christopher C., Computer the Mind Stretcher, Chicago: Dial Press, 1969.

5. O’Brien, Linda, Computers, New York: Watts, 1978.

6. Scott, Theodore C., Computer Programming Techniques, New York: Doubleday, 1964.



© 2016 by the Yale-New Haven Teachers Institute