COMPUTER HISTORY
Everything has a history, and the computer is no exception.
Early computers were nothing like today's machines: they were limited to
a single job, calculation. Ancient calculating devices included the
abacus and the arithmometer. As time passed, the first modern computers,
built from vacuum tubes, appeared around 1940. The modern computer has
kept evolving ever since to better serve its users.
[Images: abacus | arithmometer]
VACUUM TUBE
1940 - 1956
The first generation of computers was physically large but small in storage capacity: a machine was so big that its processor alone could fill a room. First-generation computers used vacuum tubes for circuitry and magnetic drums for memory.
The most popular computers of the era were the ENIAC and the UNIVAC.
[Image: vacuum tube computer]
TRANSISTOR
1956 - 1963
Transistor computer design started in 1952. It was clear that the
project could provide valuable experience in the use of the recently
introduced transistors. It was built even though the germanium point
transistors were less reliable than valves, because semiconductors
held out the promise of lower power consumption, higher operating
speeds, smaller size and greater reliability in the future.
The experimental Transistor Computer was first operational in November 1953 and it is believed to be the first transistor computer to come into operation anywhere in the world.
INTEGRATED CIRCUIT
1964 - 1971
In designing a complex electronic machine like a computer it was always
necessary to increase the number of components involved in order to make
technical advances. The monolithic (formed from a single crystal)
integrated circuit placed the previously separated transistors,
resistors, capacitors and all the connecting wiring onto a single
crystal (or 'chip') made of semiconductor material. Kilby used germanium
and Noyce used silicon for the semiconductor material.
In 1961 the first commercially available integrated circuits came from
the Fairchild Semiconductor Corporation. All computers then started to
be made using chips instead of the individual transistors and their
accompanying parts. The original IC had only one transistor, three
resistors and one capacitor and was the size of an adult's pinkie
finger. Today an IC smaller than a penny can hold 125 million
transistors.
MICROPROCESSOR
1971 - PRESENT
As with many advances in technology, the microprocessor was an idea
whose time had come. Three projects arguably delivered a complete
microprocessor at about the same time: Intel's 4004, Texas Instruments'
TMS 1000, and Garrett AiResearch's Central Air Data Computer.
Put simply, a microprocessor packs the equivalent of thousands of integrated circuits onto one small silicon chip.
[Images: early microprocessor | latest microprocessor]
ARTIFICIAL INTELLIGENCE
PRESENT AND BEYOND
Artificial intelligence (AI) is the intelligence of machines and the branch of computer science that aims to create it. AI textbooks define the field as "the study and design of intelligent agents"
where an intelligent agent is a system that perceives its environment
and takes actions that maximize its chances of success. John McCarthy,
who coined the term in 1955, defines it as "the science and engineering of making intelligent machines."
AI research is highly technical and specialized, deeply divided
into subfields that often fail to communicate with each other. Some of
the division is due to social and cultural factors: subfields have grown
up around particular institutions and the work of individual
researchers. AI research is also divided by several technical
issues. There are subfields which are focused on the solution of
specific problems, on one of several possible approaches, on the use of
widely differing tools and towards the accomplishment of particular
applications.
The central problems of AI include such traits as reasoning, knowledge, planning, learning, communication, perception and the ability to move and manipulate objects. General intelligence (or "strong AI") is still among the field's long-term goals. Currently popular approaches include statistical methods, computational intelligence and traditional symbolic AI. There are an enormous number of tools used in AI, including versions of search and mathematical optimization, logic, methods based on probability and economics, and many others.
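The textbook definition above, an agent that perceives its environment and takes the action that maximizes its chances of success, can be sketched as a minimal loop. This is a toy illustration with made-up names, not code from any real AI system:

```python
# Toy sketch of the "intelligent agent" definition: perceive the
# environment, then choose the action with the highest expected success.
# All names (perceive, expected_success, choose_action) are hypothetical.

def perceive(environment):
    """Return the agent's observation of the environment."""
    return environment["state"]

def expected_success(observation, action):
    """Score an action given an observation; higher is better.
    Toy scoring: the best action moves the state toward a goal of 10."""
    return -abs((observation + action) - 10)

def choose_action(observation, actions):
    """Pick the action that maximizes expected success."""
    return max(actions, key=lambda a: expected_success(observation, a))

environment = {"state": 7}
actions = [-1, 0, 1, 2]
observation = perceive(environment)
best = choose_action(observation, actions)
print(best)  # with state 7 and goal 10, the best step is +2
```

Real agents differ only in scale: the observation, the action set, and the success measure become far richer, but the perceive-then-act structure is the same.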