A little history of
programming is worth learning before you start coding.
The first programming languages
designed to communicate instructions to a computer were written in the 1950s.
An early high-level programming language to be designed for a
computer was Plankalkül, developed for the German Z3 by Konrad Zuse between
1943 and 1945.
The first high-level
language to have an associated compiler was created by Corrado Böhm in 1951
for his PhD thesis. The first commercially available language was FORTRAN,
developed by John Backus at IBM (the first manual appeared in 1956, but
development began in 1954).
When FORTRAN
was first introduced it was treated with suspicion because of the belief that
programs compiled from high-level language would be less efficient than those
written directly in machine code. FORTRAN became popular because it provided a
means of porting existing code to new computers, in a hardware market that was
rapidly evolving. FORTRAN eventually became known for its efficiency. Over the
years, FORTRAN has been updated, with standards released for FORTRAN 66,
FORTRAN 77 and Fortran 90.
1. Fortran :
In 1957,
John Backus at IBM finished Fortran (formula translator) together with an
efficient compiler. It could turn lines of Fortran into machine code that was
almost as good as if a "real" computer programmer had written it. This
made Fortran the first real computer language. It was free for what were
obvious business reasons at the time. IBM viewed itself primarily as selling
computers, not programs or languages. Those were just things it did to support
its computer business or as part of the service contract -- imagine Purina
giving away free puppies.
Not only did
Fortran let more people write programs faster, it also made it much easier to
adapt an old program and recompile, or run the same program on multiple types
of computers (you just needed someone to write a Fortran compiler, hopefully a
good one, for that machine.)
2. Cobol :
In the '50s,
physical computers were divided into scientific and business models. For
example, scientific computers (e.g. the IBM 701, 1953) crunched numbers,
computed logs and cosines and printed only numeric answers. Business computers
(e.g. the IBM 650, 1954) had to read names and addresses, and manipulate records
(Bob Smith, 101 OakDale, owes $50, due Apr 3rd), but needed little math beyond
two decimal places. Fortran was known as the "scientific" computer language. By
1960, a consortium of the US government and businesses developed a business
computer language, COBOL. It was based on FLOW-MATIC.
Many computer scientists regard COBOL as the worst computer language ever.
Here's a pseudo-snippet: a loop (lines 110-120) that
reads and adds all employee salaries.
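(The paragraph, file and field names below are invented for illustration, and real COBOL would also need its four-division boilerplate around this.)

    000110     PERFORM READ-AND-ADD UNTIL DONE = "Y".
    000111     DISPLAY "TOTAL SALARY: " TOTAL-SALARY.
    000112     STOP RUN.
    000120 READ-AND-ADD.
    000121     READ EMPLOYEE-FILE
    000122         AT END MOVE "Y" TO DONE
    000123         NOT AT END ADD EMP-SALARY TO TOTAL-SALARY
    000124     END-READ.

Line 110 repeats the paragraph at line 120 until the file runs out; each record read adds one salary to the running total.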
3. Imperative vs Functional :
Nowadays,
COBOL and Fortran look remarkably similar. They are both imperative languages:
variables, a=b+c*7, if-else, while(x<10), dothing(a,b). Do one line at a
time until you get to the end. Almost all of the languages here are imperative.
C++ and Java are sometimes called "object oriented" languages, but
they are imperative languages that also have object-oriented features.
The other major paradigm for computer languages is functional: as
in, using functions for everything. LISP, Scheme and ML fall into this
category. They've been around for a while, and work pretty well, but are
generally not used for production programming, so most people haven't heard of
them. At ISU, we use Scheme in cs342 (a required course).
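To make the contrast concrete, here is the same little job, totaling a list of salaries, written both ways. The sketch is in Python (chosen just for readability; the numbers are made up):

    from functools import reduce

    salaries = [50, 75, 60]

    # Imperative style: change a variable, one line at a time.
    total = 0
    for s in salaries:
        total = total + s

    # Functional style: nothing is ever overwritten; the answer is
    # built entirely out of function calls.
    total_fn = reduce(lambda so_far, s: so_far + s, salaries, 0)

    assert total == total_fn == 185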
4. Basic :
As colleges
got the "new" mini-computers, students started writing programs
(mostly grad students, at first, for their research.) Fortran was easier to
learn than assembly, but still not exactly easy. BASIC was written at Dartmouth
College, in 1964, as something students could learn quickly. For example, x ** 4 computes
x to the fourth power (compare to pow(x,4), which calls the power function
with inputs x and 4.)
In 1975,
when the Altair personal computer kit needed a very simple programming
language, maker Ed Roberts decided on a stripped-down version of Basic (the
first people who answered his ad were Paul Allen and Bill Gates, then at
Harvard.) The Apple-II, in 1977, came with Basic, written by Apple creator
Steve Wozniak. You could turn it on and start typing in a Basic program, then
save it on your 5-1/4" floppy disk with a command. Books with games,
written in Basic, became popular. Later PCs, the PET and TRS-80, also used
Basic. People with PCs in the '80s, the author included, learned Basic as a
first language.
5. Interpreted languages :
As a practical matter, a program could be written, saved on magnetic tape
(or paper tape,) compiled and saved on a different magnetic tape. On a personal
computer, that wasn't practical. A compiler was a fairly large program; home
computers didn't have lots of fast off-line storage; and, well, it would be nice
to just type a program in and say RUN. The solution was an interpreter.
An interpreter is a program that reads lines in a computer language and
runs them. You write your interpreter in assembly (or in some other language
and compile it, but the first were in assembly) and put it in the chip (that
way it is there when you turn on the computer -- it "knows" Basic.)
They are slower than compiled programs for the obvious reason: the computer
reads the next line of the interpreter, which says to read the next line of
your program. If it is x=3, the
interpreter jumps to the part of itself that knows about equals, which looks up x in a table
(instead of having precomputed that x is location 3AF6.)
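As a toy illustration, here is a sketch of such an interpreter in Python; its two-command "language" (assignments and PRINT) is invented for this example:

    def run(program):
        table = {}                  # variable name -> value, searched on every use
        for line in program.splitlines():
            line = line.strip()
            if not line:
                continue
            if line.startswith("PRINT "):
                # Look the name up in the table -- no precomputed address.
                print(table[line[6:]])
            else:
                # An assignment like x=3: store the value under the name.
                name, value = line.split("=")
                table[name.strip()] = int(value)

    run("""
    x=3
    PRINT x
    """)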
For a PC, interpreters were fine. They were small, the programs took up
less space than if you compiled them, and home users didn't care that much
about a few hundredths of a second if they were playing hangman or entering
recipes. Of course, games on early PCs were all written in assembly, for the
speed.
Shell scripts (sh, csh, bash) are interpreted. This way the same program
that reads your keyboard commands can also run scripts. It is formally known as
the command interpreter. The slowness isn't much of a drawback, since most of
the things they are doing (cat, tr, cut, ...) are compiled programs.
6. Other languages :
There were
hundreds(?) of imperative languages written, with various features. Many are
still in limited use. ALGOL (~'60) was the first major rewrite of FORTRAN. It
never caught on itself, but its ideas were copied by many other languages. PL/1 (~'64,
programming language 1) was written by IBM and used for part of the operating
system for their 360 (OS/360). It also never caught on, since it was difficult
to write a compiler for, and it was known as a "kitchen sink" language,
for having way too many features. But, since it was written by IBM, some PL/1
is still in use.
7. C/C++ :
UNIX was
written at AT&T Bell Labs in PDP assembly code. When it came time to
rewrite it, in 1972, a programming language C was written (yes, there is a B,
but it never amounted to anything.) C was designed to allow you to easily
manipulate individual bits and bytes, look at specific memory locations and in
general be very close to the computer (a low-level high-level
language.) These aren't things a normal programmer cares about. In 1991, when
Linus Torvalds wanted to write his own OS (now named Linux), C was the obvious
choice. Most Unix/Linux utilities (cat, tr, bash) are written in C.
As campuses bought mini-computers, running UNIX, computer science students
started learning and preferring C, the language of UNIX.
In the '80s, Bjarne Stroustrup at AT&T Bell Labs worked on an improved
"object oriented" version of C, named C++. It became
"official" in 1998, but various version were used long before that.
ISU taught programming in C++ from (at least) 1997 to 2004. MS-Windows is
written in a mix of assembly, C and C++.
8. Java :
SUN was
using C++ to develop applications for chips in cell phones, cable TV boxes and
such. C++ can be a tricky language to test, and a bad or unlucky programmer can
make mistakes that are very difficult to find. Also, C++ was written for speed
and efficiency; while it can produce fine graphics, doing so can be tricky.
In 1992 James Gosling, at SUN, wrote Oak (later renamed Java, after
someone in a focus group said it made you feel energetic, like coffee.) It was
intended to run on "things like VCRs, telephones, games, automobiles,
dishwashers, thermostats...". The idea was it would compile down to fake
assembly (bytecode), and each kind of chip would have a small program that knew
how to run bytecode, making a program appear to run the same anywhere. It didn't
catch on at first in the consumer electronics market, but the sudden growth of
the internet around 1994 was a lucky break.
SUN contacted the leading browser maker, Netscape, about adding a Java
bytecode interpreter (a virtual machine, as opposed to a real
one) to their browser. This allowed web page makers to program in Java, and
it would run in a browser on a Mac, UNIX, MS-Windows, ... . One important
feature of Java is that it can run in a "sand-box": an amount of memory is
allocated which it can't go outside of, leaving no chance for viruses, etc.
As the web became popular, Java became the preferred way to write the
"front-end" of applications. For example, it might be used to bring
up a pretty web-like page you fill in, with help from the Java program. Once it
has all the data, it sends it to a COBOL program to update the database.
Java is still owned and controlled by SUN. For a computer to be labelled
"Java compliant" SUN needs to approve the compiler/interpreter. In
theory, this means a Java program will run without changes on any computer.
9. Object Oriented languages :
High-level
languages are a little slower than assembly code, but allow much faster
programming with fewer errors. For most programs assembly is completely
impractical -- it would take too long and have too many errors. Linux does
use assembly for small snippets of frequently used code. As computers' speed
and memory increased, we had the same problem with imperative high-level
languages -- too long to write programs and too many errors. A solution to this
was object-oriented languages.
Two problems with really big programs involve the fact that we tend to
have large data structures (say, a list of employees, each with name, address,
etc... ) and lots of functions that do something to them (compute total weekly
salary, highest salary, add an employee, ... .) When we need to split overtime
into time-and-a-half and double-overtime, nearly everything needs to be
rewritten to account for it. Even worse, we may have a list of free-lance
employees that we made by copying all of the employee stuff and making a few
changes. All of that needs to be updated as well. With enough of these
(suddenly employee addresses need to include country) it can be a huge mess.
An object-oriented language lets you limit which functions can look directly at
the data structure, making everyone else go through those. For example, a function
salary(x) would compute the salary of employee x. The totalSalary and
highSalary functions would use it. This makes them slightly slower, takes a
little longer to program, but will make it much easier when you need to change
the way salary is computed. This is called information hiding.
For free-lance employees, object-oriented languages allow you to say they
are an employee, with these few changes. You don't need to copy any of the
employee code. This is called inheritance (free-lance
employees inherit the data structure and functions of employees.) Now, if there
is a change in how taxes are deducted, you can change it for employees and it
will be automatically changed for free-lance ones.
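A minimal sketch of both ideas in Python (the class names, fields and the free-lance pay rule are all invented for illustration):

    class Employee:
        def __init__(self, name, hours, rate):
            self.name = name
            self._hours = hours   # leading underscores mark data nobody
            self._rate = rate     # else should look at directly

        def salary(self):
            # Information hiding: all pay is computed through this one
            # function, so a new overtime rule only changes this code.
            return self._hours * self._rate

    class FreeLancer(Employee):
        # Inheritance: a FreeLancer *is* an Employee, with one change.
        def salary(self):
            return super().salary() * 1.1   # invented rule: 10% premium

    staff = [Employee("Bob Smith", 40, 25), FreeLancer("Ann Doe", 40, 25)]
    print(sum(e.salary() for e in staff))   # one total, both kinds of employee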
10. Niche Web languages :
1. MS visual basic :
Forms and JavaScript are now probably the most popular way to make pretty
Graphical User Interfaces (over the web.) Before they became standardized,
Microsoft, in 1991, introduced its "GUI-language" Visual Basic (little
relationship to the original Basic -- it could have been called visual fortran, or pascal,
etc... .) It has been revised 5 times (to VB6, in 1998) as it had to handle
more common graphics features, interact with different versions of MS-Windows,
etc... .
Like JavaScript, most of VB is about how to make various types of boxes,
looking various ways, and how to get the data out of them.
VBA (visual basic for applications) is similar to VB, and is for writing
snippets of code that can be run from Excel, MS-Word, etc... as a sort of
"macro."
2. Javascript :
During the
browser fight between Netscape and Internet Explorer/Microsoft, in 1995, Netscape
introduced a small computer language, built into its browser, that could run
little programs to "cool up" a web page (or make it incredibly
annoying.) This was named LiveScript, but renamed JavaScript, mostly as
marketing (it has no real relationship to Java). Even though MS put Netscape
out of business, JavaScript stayed popular.