EVOLUTION OF COMPUTERS

Submitted in partial fulfillment of the requirements for the award of the degree of Bachelor of Business Administration [Computer Aided Management] of MAHARSHI DAYANAND UNIVERSITY, ROHTAK
Session 2014-2015
Under the guidance of Mrs. Jyoti Malhotra (Lecturer, BBA Department)

Submitted by: Lao Prakash Mishra
BBA (CAM), 6th Sem.
Roll No.: 22039
University Roll No.:

DAV CENTENARY COLLEGE NH-3, N.I.T., Faridabad (Haryana)

ACKNOWLEDGEMENT

I am very thankful to Mrs. Jyoti Malhotra (Project Guide) for giving me this opportunity and for her guidance, which helped me throughout the preparation of this report. She also provided me with valuable suggestions and excellent guidance on this project, which proved very helpful in enabling me to apply my theoretical knowledge in the practical field.

I am thankful to M.D. University, Rohtak, for giving me this valuable exposure to the field of Research Methodology.

I would also like to thank my family for motivating and supporting me at every stage of my life.

Finally, I am thankful to my friends, who gave me their constructive advice, educative suggestions, encouragement, co-operation and motivation while I prepared this report.

(LAO PRAKASH MISHRA)

PREFACE

The title of my project is "Evolution of Computers".

This project report describes how computers have evolved from the early twentieth century till today. It covers the generations of computers, and the programming languages and programming codes used in a computer to make it more user-friendly. The report also explains why computers have become an essential need in today's world and how they influence our lives. It also describes the business uses of computers, the scope and uses of computers, and the professions related to computers in which a person can make a career.

(LAO PRAKASH MISHRA)

CONTENTS

S.No.  Topic
1.     Introduction to the Topic
2.     Review of Literature
3.     Research Methodology
       a) Objectives of the Study
       b) Scope of the Study
       c) Data Collection
       d) Limitations of the Study
4.     Data Analysis & Interpretation
5.     Conclusion & Suggestions
6.     Bibliography

CHAPTER-1 INTRODUCTION TO THE TOPIC


Charles Babbage, FRS (/ˈbæbɪdʒ/; 26 December 1791 – 18 October 1871) was an English polymath. A mathematician, philosopher, inventor and mechanical engineer, Babbage is best remembered for originating the concept of a programmable computer. Considered a "father of the computer", Babbage is credited with inventing the first mechanical computer that eventually led to more complex designs. His varied work in other fields has led him to be described as "pre-eminent" among the many polymaths of his century. Parts of Babbage's uncompleted mechanisms are on display in the London Science Museum. In 1991, a perfectly functioning difference engine was constructed from Babbage's original plans. Built to tolerances achievable in the 19th century, the success of the finished engine indicated that Babbage's machine would have worked.

The word "computer" was first used The word "computer" was first recorded as being used in 1613 and originally was used to describe a human who performed calculations or computations. The definition of a computer remained the same until the end of the 19th century when people began to realize machines never get tired and can perform calculations much faster and more accurately than any team of human computers ever could. A computer is a general purpose device that can be programmed to carry out a set of arithmetic or logical operations automatically. Since a sequence of operations can be readily changed, the computer can solve more than one kind of problem.

Conventionally, a computer consists of at least one processing element, typically a central processing unit (CPU), and some form of memory. The processing element carries out arithmetic and logic operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices allow information to be retrieved from an external source, and the results of operations to be saved and retrieved.

Mechanical analog computers started appearing in the first century and were later used in the medieval era for astronomical calculations. In World War II, mechanical analog computers were used for specialized military applications. During this time the first electronic digital computers were developed. Originally they were the size of a large room, consuming as much power as several hundred modern personal computers (PCs). Modern computers based on integrated circuits are millions to billions of times more capable than the early machines, and occupy a fraction of the space. Simple computers are small enough to fit into mobile devices, and mobile computers can be powered by small batteries. Personal computers in their various forms are icons of the Information Age and are what most people think of as "computers". However, the embedded computers found in many devices, from MP3 players to fighter aircraft and from toys to industrial robots, are the most numerous.

Over the past four decades the computer industry has experienced four generations of development, moving from vacuum tubes (1940s-1950s) to discrete diodes and transistors (1950s-1960s), to small- and medium-scale integrated circuits (1960s-1970s), and to very-large-scale integrated devices (1970s and beyond). Increases in device speed and reliability, and reductions in hardware cost and physical size, have greatly enhanced computer performance.

Attempts by humans to develop a tool to manipulate data go back as far as 2600 BC, when the Chinese came up with the abacus. The slide rule was invented in 1621 and remained widely used until the emergence of electronic calculators in the 1970s. Both of these early devices were mechanical and on a human scale. In 1830 the English mathematician Charles Babbage conceived an analytical engine, which could be programmed with punched cards to carry out calculations. It was different from its predecessors because it was able to make decisions based on its own computations, such as sequential control, branching and looping. Almost all computers in use today follow this basic idea laid out by Babbage, which is why he is often referred to as 'the father of computers.' The analytical engine was so complex that Babbage was never able to build a working model of his design. It was never built in his lifetime; more than 150 years later, the London Science Museum constructed a working difference engine from his original plans.
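The principle behind the Difference Engine is simple enough to sketch in a few lines of code. Below is a minimal illustration, in Python (not part of the original report), of the method of finite differences the engine mechanized: once the starting value and differences of a polynomial are set, every further value is produced by additions alone, which is what made a purely mechanical implementation feasible.

```python
# Tabulating a polynomial by finite differences, as Babbage's
# Difference Engine did mechanically: each column of "wheels" is
# repeatedly added into the column to its left, so no multiplication
# is ever needed.

def difference_engine(initial_differences, steps):
    """Tabulate f(0), f(1), ... given [f(0), delta f(0), delta^2 f(0), ...]."""
    diffs = list(initial_differences)
    table = []
    for _ in range(steps):
        table.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]  # carry each difference leftwards
    return table

# Example: f(x) = x^2 + x + 41, a polynomial Babbage used in his own
# demonstrations. f(0) = 41, first difference = 2, second difference = 2.
print(difference_engine([41, 2, 2], 6))  # [41, 43, 47, 53, 61, 71]
```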

In Rural India: How Computers and the Internet Can Change the Game

If the rural population of India learns to Google, the face and demographics of our motherland will, technically and ironically, undergo a plastic surgery. With the setting up of computer centers in villages across rural India, there will be a dramatic and strong wave of advancement that will fuel development. If the Government and non-governmental organizations succeed in taking computers to all villages, with internet facilities duly available, then the rural areas, which make up 80% of the country yet hold only a 20% share of the national income, can do away with the economic and social obstacles to growth and development.

With the idea of setting up computer centers, one thing should be made very clear: all the inhabitants of the villages should get a fair chance to learn, discover and use this valuable invention in a constructive way, for the betterment of their country and of their own socio-economic and educational status. Every girl, and every married or unmarried woman, widow or spinster, should be given an opportunity, with some kind of compulsion, to learn how to use computers and to know what wonders they can do for their lives. Because the rural areas in India are still male-dominated, women should stand an equal chance alongside men to learn how to operate computers. The 60 percent of Indians whose primary occupation is agriculture should be individually taught how to use the internet and computers as a medium to improve their agricultural practices, and to use technological advancement in their best interests. Via video conferencing, agricultural experts can communicate with a large number of farmers to explain new yielding technologies and to suggest measures in case the food grains fail to grow on time. Graphical presentations or 3-D diagrams of various agricultural lands and crops, with animation, should be shown to rural citizens so that they can understand visually, in an easy manner, how they can improve upon rudimentary methods of farming.

Besides farming, the setting up of computer centers can be astoundingly beneficial for students and children of school-going age. They can be shown and taught with various encyclopedias and videos on the Internet to make learning an easy and visually appealing process. Diagrams and photos of anything and everything will surely help them understand and learn their subjects better. Besides other subjects, the setting up of computers can be highly beneficial for introducing and teaching computer applications and computer science, including languages and programming in Java and C++. Rural students can hence get a better chance to study the latest subjects in the right manner.

Another great outcome of computer centers can be the management of mail and data for and by the citizens, for all surveys and postal services. Any important information can travel within a fraction of a second to the rural areas, adding to their benefits. They can also watch news channels and TV via live TV or 3G technology on the internet, which will be a life-changing experience for those places in India that have lost all hope of getting a cable connection. Moreover, this can connect the rest of the world with local Panchayats once they have their very own websites.

Computer centers can change rural infrastructure, education and standard of living dramatically, but only if they are implemented and managed in a timely way by people in authority, duly catering to any kind of problem and hindrance in communication and usage. The rural population should be persuaded, and where necessary pushed, to use this technology to their benefit, which requires immense honesty on the policy formulators' part. The mere set-up shall not do wonders. It will be the timely management, and the service to offer help and guidance to the rural areas, that matter, especially making it possible for women to participate under strict security, free from rural pressures and the orthodox ideology of their family members. It is the women who are capable of carrying forward a great future for development; besides, they can benefit from it by taking healthcare tips and holding mutual discussions on the internet under the able guidance of faraway doctors and experts. With organizations now taking the plunge into rural technology and building businesses around it, the digital media revolution has already begun.

Uses of Computers: Computers have become an essential part of modern human life. Since the invention of the computer, they have evolved in terms of increased computing power and decreased size. Owing to the widespread use of computers in every sphere, life in today's world would be unimaginable without them. They have made human lives better and happier. There are many computer uses in different fields of work. Engineers, architects, jewelers and filmmakers all use computers to design things. Teachers, writers and most office workers use computers for research, word processing and emailing. Small businesses can use computers as a point of sale and for general record keeping.

1) Computers Aid in Education: Computers have their dominant use in the education field, where they can significantly enhance learning performance. Even distance learning is made productive and effective through internet- and video-based classes. Researchers make massive use of computers in their work, from the start to the end of their scholarly work.

2) Computers in our Health and Medicine: Most medical information can now be digitized, from prescriptions to reports. Computation in the field of medicine allows us to offer varied miraculous therapies to patients. ECGs and radiotherapy would not be possible without computers.

3) Aid of Computers at Financial Institutions: We know well that computers are used by financial institutions like banks for different purposes. The foremost important thing is to store information about different account holders in a database so that it is available at any time. Keeping records of the cash flow and giving information regarding your account are also handled by computers.

4) Computers for our Pastime: Computers are now the major entertainers and the primary pastime machines. We can use computers for playing games, watching movies, listening to music and drawing pictures.

5) Computers are a part of our Transport System: With the internet on computers we can know the details of the buses, trains or flights available to our desired destination. The timings, and even updates on delays, can also be known through these computers. We can book our tickets online. Staff of the transport system keep track of passengers and of train or flight details, departure and arrival timings, by using computers.

6) Inevitable use of Computers in Business and Corporate Settings: Every piece of information shared can be recorded by using a computer. Official deals and issues are handled even online. We use the email system to exchange information. Computers have wide uses in marketing, stock exchanges and banking. Even departmental stores can't run effectively without computers.

7) Wonders of Computers in E-Commerce: Electronic mail is a revolutionary service offered by computers. Video conferencing is another major advantage. Electronic shopping through online stores has benefited both purchasers and merchants. Electronic banking is now at your hand: every bank has online support for monetary transactions, and you can easily transfer your money anywhere, even from your home.

8) Computers in our Defence: Computers are the main tools which help in developing missiles and other equipment in the defence system. Designing and maintenance are possible only through computers.

Computers build the links between soldiers and commanders through satellites. Construction of weapons and control of their functions is not possible without the aid of computers. Lists of criminals and the records of police forces are maintained regularly in the system.

9) The Computer is today's Designer: As the title suggests, computers aid in designing buildings, magazines, prints, newspapers, books and many other things. Construction layouts are designed beautifully on the system using different tools and software.

CHAPTER – 2 REVIEW OF LITERATURE

History of Computers A computer is an electronic machine that accepts information, stores it, processes it according to the instructions provided by a user and then returns the result. Today, we take computers for granted, and they have become part of our everyday activities. While computers as we know them today are relatively recent, the concepts and ideas behind computers have quite a bit of history - time for a whirlwind tour of how we got to the age of email, YouTube and Facebook.

Pre-twentieth century

The Ishango bone

Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was probably a form of tally stick. Later record keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, probably livestock or grains, sealed in hollow unbaked clay containers. The use of counting rods is one example.

Suanpan (the number represented on this abacus is 6,302,715,408)

The abacus was initially used for arithmetic tasks. What we now call the Roman abacus was used in Babylonia as early as 2400 BC. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money. The ancient Greek-designed Antikythera mechanism, dating from between 150 and 100 BC, is the world's oldest analog computer. The Antikythera mechanism is believed to be the earliest mechanical analog "computer", according to Derek J. de Solla Price. It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythira and Crete, and has been dated to circa 100 BC. Devices of a level of complexity comparable to that of the Antikythera mechanism would not reappear until a thousand years later.

Many mechanical aids to calculation and measurement were constructed for astronomical and navigation use. The planisphere was a star chart invented by Abu Rayhan al-Biruni in the early 11th century. The astrolabe was invented in the Hellenistic world in either the 1st or 2nd centuries BC and is often attributed to Hipparchus. A combination of the planisphere and dioptra, the astrolabe was effectively an analog computer capable of working out several different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical calendar computer and gear-wheels was invented by Abi Bakr of Isfahan, Persia, in 1235. Abu Rayhan al-Biruni invented the first mechanical geared lunisolar calendar astrolabe, an early fixed-wired knowledge processing machine with a gear train and gear-wheels, circa 1000 AD. The sector, a calculating instrument used for solving problems in proportion, trigonometry, multiplication and division, and for various functions such as squares and cube roots, was developed in the late 16th century and found application in gunnery, surveying and navigation. The planimeter was a manual instrument to calculate the area of a closed figure by tracing over it with a mechanical linkage.

A slide rule

The slide rule was invented around 1620–1630, shortly after the publication of the concept of the logarithm. It is a hand-operated analog computer for doing multiplication and division. As slide rule development progressed, added scales provided reciprocals, squares and square roots, cubes and cube roots, as well as transcendental functions such as logarithms and exponentials, circular and hyperbolic trigonometry and other functions. Aviation is one of the few fields where slide rules are still in widespread use, particularly for solving time–distance problems in light aircraft. To save space and for ease of reading, these are typically circular devices rather than the classic linear slide rule shape. A popular example is the E6B.

In the 1770s Pierre Jaquet-Droz, a Swiss watchmaker, built a mechanical doll (automaton) that could write holding a quill pen. By switching the number and order of its internal wheels, different letters, and hence different messages, could be produced. In effect, it could be mechanically "programmed" to read instructions. Along with two other complex machines, the doll is at the Musée d'Art et d'Histoire of Neuchâtel, Switzerland, and still operates.

The tide-predicting machine invented by Sir William Thomson in 1872 was of great utility to navigation in shallow waters. It used a system of pulleys and wires to automatically calculate predicted tide levels for a set period at a particular location. The differential analyzer, a mechanical analog computer designed to solve differential equations by integration, used wheel-and-disc mechanisms to perform the integration. In 1876 Lord Kelvin had already discussed the possible construction of such calculators, but he had been stymied by the limited output torque of the ball-and-disk integrators. In a differential analyzer, the output of one integrator drove the input of the next integrator, or a graphing output. The torque amplifier was the advance that allowed these machines to work. Starting in the 1920s, Vannevar Bush and others developed mechanical differential analyzers.
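Since the slide rule's working principle is only stated in passing earlier in this passage, a small hedged example may help: multiplication is reduced to the addition of logarithmic lengths, because log(a) + log(b) = log(ab). The Python snippet below mimics that mechanical addition numerically.

```python
import math

# A slide rule multiplies by physically adding two lengths that are
# proportional to logarithms: log(a) + log(b) = log(a * b).
def slide_rule_multiply(a, b):
    return 10 ** (math.log10(a) + math.log10(b))

print(slide_rule_multiply(2.0, 8.0))   # ~16.0, up to floating-point error
print(slide_rule_multiply(3.5, 12.0))  # ~42.0
```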

First general-purpose computing device

A portion of Babbage's Difference Engine

Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the "father of the computer", he conceptualized and invented the first mechanical computer in the early 19th century. After working on his revolutionary difference engine, designed to aid in navigational calculations, in 1833 he realized that a much more general design, an Analytical Engine, was possible. The input of programs and data was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. The Engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.

The machine was about a century ahead of its time. All the parts for his machine had to be made by hand, which was a major problem for a device with thousands of parts. Eventually, the project was dissolved with the decision of the British Government to cease funding. Babbage's failure to complete the Analytical Engine can be chiefly attributed not only to difficulties of politics and financing, but also to his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow. Nevertheless, his son, Henry Babbage, completed a simplified version of the Analytical Engine's computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906.

Later Analog computers

Sir William Thomson's third tide-predicting machine design, 1879–81

During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable, and generally lacked the versatility and accuracy of modern digital computers. The first modern analog computer was a tide-predicting machine, invented by Sir William Thomson in 1872. The differential analyzer, a mechanical analog computer designed to solve differential equations by integration using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson, the brother of the more famous Lord Kelvin.

The art of mechanical analog computing reached its zenith with the differential analyzer, built by H. L. Hazen and Vannevar Bush at MIT starting in 1927. This built on the mechanical integrators of James Thomson and the torque amplifiers invented by H. W. Nieman. A dozen of these devices were built before their obsolescence became obvious. By the 1950s the success of digital electronic computers had spelled the end for most analog computing machines, but analog computers remained in use in some specialized applications such as education (control systems) and aircraft (slide rules).

The digital computer age begins

The principle of the modern computer was first described by computer scientist Alan Turing, who set out the idea in his seminal 1936 paper, On Computable Numbers. Turing reformulated Kurt Gödel's 1931 results on the limits of proof and computation, replacing Gödel's universal arithmetic-based formal language with the formal and simple hypothetical devices that became known as Turing machines. He proved that some such machine would be capable of performing any conceivable mathematical computation if it were representable as an algorithm. He went on to prove that there was no solution to the Entscheidungsproblem by first showing that the halting problem for Turing machines is undecidable: in general, it is not possible to decide algorithmically whether a given Turing machine will ever halt.

He also introduced the notion of a 'Universal Machine' (now known as a Universal Turing machine), with the idea that such a machine could perform the tasks of any other machine; in other words, it is provably capable of computing anything that is computable by executing a program stored on tape, allowing the machine to be programmable. Von Neumann acknowledged that the central concept of the modern computer was due to this paper. Turing machines are to this day a central object of study in the theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine.
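To make the idea of a Turing machine concrete, here is a minimal simulator sketch in Python. The transition table and the binary-inverting example program are hypothetical illustrations, not anything from Turing's paper: the point is only that the whole machine is a lookup table of (state, symbol) rules applied to a tape.

```python
# Minimal Turing machine simulator. A program maps
# (state, symbol) -> (symbol to write, head move, next state).

def run_turing_machine(program, tape, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = program[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example program: walk right, flipping 0 <-> 1, halt at the first blank.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine(invert, "10110"))  # 01001
```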

The first electromechanical computers By 1938 the United States Navy had developed an electromechanical analog computer small enough to use aboard a submarine. This was the Torpedo Data Computer, which used trigonometry to solve the problem of firing a torpedo from a boat to a moving target. During World War II similar devices were developed in other countries as well.

Replica of Zuse's Z3, the first fully automatic, digital (electromechanical) computer

Early digital computers were electromechanical; electric switches drove mechanical relays to perform the calculation. These devices had a low operating speed and were eventually superseded by much faster all-electric computers, originally using vacuum tubes. The Z2, created by German engineer Konrad Zuse in 1939, was one of the earliest examples of an electromechanical relay computer. In 1941, Zuse followed his earlier machine up with the Z3, the world's first working electromechanical programmable, fully automatic digital computer. The Z3 was built with 2,000 relays, implementing a 22-bit word length that operated at a clock frequency of about 5–10 Hz. Program code and data were stored on punched film. It was quite similar to modern machines in some respects, pioneering numerous advances such as floating-point numbers. Replacement of the hard-to-implement decimal system (used in Charles Babbage's earlier design) by the simpler binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time. The Z3 was probably a complete Turing machine.
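The passage above mentions the Z3's binary floating-point arithmetic without showing what that means; the sketch below illustrates the sign/exponent/mantissa idea. The 1+7+14 bit split matches the Z3's 22-bit word as usually described, but the encoding details here are simplified assumptions for illustration (zero and rounding corner cases are not handled).

```python
# Rough sketch of binary floating-point representation: a number is
# stored as a sign bit, an exponent, and a fractional mantissa.
# Binary suits relays because each element only needs two states.

def to_binary_float(value, frac_bits=14):
    sign = 0 if value >= 0 else 1
    magnitude = abs(value)
    exponent = 0
    while magnitude >= 2:   # normalize the mantissa into [1, 2)
        magnitude /= 2
        exponent += 1
    while magnitude < 1:
        magnitude *= 2
        exponent -= 1
    fraction = round((magnitude - 1) * (1 << frac_bits))
    return sign, exponent, fraction

print(to_binary_float(6.5))  # (0, 2, 10240): 6.5 = +1.625 * 2^2
```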

The introduction of digital electronic programmable computers with vacuum tubes

Purely electronic circuit elements soon replaced their mechanical and electromechanical equivalents, at the same time that digital calculation replaced analog. The engineer Tommy Flowers, working at the Post Office Research Station in London in the 1930s, began to explore the possible use of electronics for the telephone exchange. Experimental equipment that he built in 1934 went into operation five years later, converting a portion of the telephone exchange network into an electronic data processing system, using thousands of vacuum tubes. In the US, John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed and tested the Atanasoff–Berry Computer (ABC) in 1942, the first "automatic electronic digital computer". This design was also all-electronic and used about 300 vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory.

Colossus was the first electronic digital programmable computing device, and was used to break German ciphers during World War II.

During World War II, the British at Bletchley Park achieved a number of successes at breaking encrypted German military communications. The German encryption machine, Enigma, was first attacked with the help of the electro-mechanical bombes. To crack the more sophisticated German Lorenz SZ 40/42 machine, used for high-level Army communications, Max Newman and his colleagues commissioned Flowers to build the Colossus. He spent eleven months from early February 1943 designing and building the first Colossus. After a functional test in December 1943, Colossus was shipped to Bletchley Park, where it was delivered on 18 January 1944 and attacked its first message on 5 February.

Colossus was the world's first electronic digital programmable computer. It used a large number of valves (vacuum tubes). It had paper-tape input and was capable of being configured to perform a variety of Boolean logical operations on its data, but it was not Turing-complete. Nine Mk II Colossi were built (the Mk I was converted to a Mk II, making ten machines in total). Colossus Mark I contained 1,500 thermionic valves (tubes), but the Mark II, with 2,400 valves, was both five times faster and simpler to operate than the Mark I, greatly speeding the decoding process.

ENIAC was the first Turing-complete device, and performed ballistics trajectory calculations for the United States Army.

The US-built ENIAC (Electronic Numerical Integrator and Computer) was the first electronic programmable computer built in the US. Although the ENIAC was similar to the Colossus, it was much faster and more flexible. It was unambiguously a Turing-complete device and could compute any problem that would fit into its memory. Like the Colossus, a "program" on the ENIAC was defined by the states of its patch cables and switches, a far cry from the stored-program electronic machines that came later. Once a program was written, it had to be mechanically set into the machine with manual resetting of plugs and switches. It combined the high speed of electronics with the ability to be programmed for many complex problems. It could add or subtract 5,000 times a second, a thousand times faster than any other machine. It also had modules to multiply, divide, and take square roots. High-speed memory was limited to 20 words (about 80 bytes). Built under the direction of John Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC's development and construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons, using 200 kilowatts of electric power, and contained over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors.

Stored program computers eliminate the need for re-wiring

A section of the Manchester Small-Scale Experimental Machine, the first stored-program computer

Early computing machines had fixed programs. Changing the function of such a machine required re-wiring and re-structuring it. With the proposal of the stored-program computer this changed. A stored-program computer includes by design an instruction set and can store in memory a set of instructions (a program) that details the computation. The theoretical basis for the stored-program computer was laid by Alan Turing in his 1936 paper. In 1945 Turing joined the National Physical Laboratory and began work on developing an electronic stored-program digital computer. His 1945 report 'Proposed Electronic Calculator' was the first specification for such a device. John von Neumann at the University of Pennsylvania also circulated his First Draft of a Report on the EDVAC in 1945.
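Since the stored-program idea is the hinge of this whole section, a toy illustration may help. In the entirely hypothetical three-instruction machine below, the program is just numbers sitting in the same memory as the data, so changing the computation means changing memory contents, not re-wiring.

```python
# Toy stored-program machine: instructions and data share one memory.
# Hypothetical opcodes: 0 = LOAD addr, 1 = ADD addr, 2 = HALT.

def run(memory):
    acc, pc = 0, 0  # accumulator and program counter
    while True:
        op, arg = memory[pc], memory[pc + 1]  # fetch and decode
        pc += 2
        if op == 0:
            acc = memory[arg]       # LOAD: copy a memory word into acc
        elif op == 1:
            acc += memory[arg]      # ADD: add a memory word to acc
        elif op == 2:
            return acc              # HALT: result is in the accumulator

# Program LOAD 6, ADD 7, HALT, followed by the data words 20 and 22.
print(run([0, 6, 1, 7, 2, 0, 20, 22]))  # 42
```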

Ferranti Mark 1, c. 1951.

The Manchester Small-Scale Experimental Machine, nicknamed Baby, was the world's first stored-program computer. It was built at the Victoria University of Manchester by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948. It was designed as a testbed for the Williams tube, the first random-access digital storage device. Although the computer was considered "small and primitive" by the standards of its time, it was the first working machine to contain all of the elements essential to a modern electronic computer. As soon as the SSEM had demonstrated the feasibility of its design, a project was initiated at the university to develop it into a more usable computer, the Manchester Mark 1. The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world's first commercially available general-purpose computer. Built by Ferranti, it was delivered to the University of Manchester in February 1951. At least seven of these later machines were delivered between 1953 and 1957, one of them to Shell labs in Amsterdam. In October 1947, the directors of British catering company J. Lyons & Company decided to take an active role in promoting the commercial development of computers. The LEO I computer became operational in April 1951 and ran the world's first regular routine office computer job.

Transistors replace vacuum tubes in computers

A bipolar junction transistor

The bipolar transistor was invented in 1947. From 1955 onwards transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Compared to vacuum tubes, transistors have many advantages: they are smaller and require less power than vacuum tubes, so they give off less heat. Silicon junction transistors were much more reliable than vacuum tubes and had a longer, effectively indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space.
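As a side note on what "binary logic circuits" amount to: every gate can be composed from a single universal gate such as NAND, which is one reason a single cheap switching element was enough to build whole machines. A small sketch:

```python
# All Boolean logic from NAND alone, the operation a transistor pair
# implements naturally.

def nand(a, b): return not (a and b)

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

# Truth table for XOR built purely from NAND gates:
for a in (False, True):
    for b in (False, True):
        print(int(a), int(b), int(xor_(a, b)))
```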

At the University of Manchester, a team under the leadership of Tom Kilburn designed and built a machine using the newly developed transistors instead of valves. Their first transistorized computer, and the first in the world, was operational by 1953, and a second version was completed there in April 1955. However, the machine did make use of valves to generate its 125 kHz clock waveforms and in the circuitry to read and write on its magnetic drum memory, so it was not the first completely transistorized computer. That distinction goes to the Harwell CADET of 1955, built by the electronics division of the Atomic Energy Research Establishment at Harwell.

Integrated circuits replace transistors

The next great advance in computing power came with the advent of the integrated circuit. The idea of the integrated circuit was first conceived by a radar scientist working for the Royal Radar Establishment of the Ministry of Defence, Geoffrey W. A. Dummer. Dummer presented the first public description of an integrated circuit at the Symposium on Progress in Quality Electronic Components in Washington, D.C. on 7 May 1952.

The first practical ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor. Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958. In his patent application of 6 February 1959, Kilby described his new device as "a body of semiconductor material ... wherein all the components of the electronic circuit are completely integrated." Noyce came up with his own idea of an integrated circuit half a year later than Kilby. His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium. This new development heralded an explosion in the commercial and personal use of computers and led to the invention of the microprocessor. While the subject of exactly which device was the first microprocessor is contentious, partly due to lack of agreement on the exact definition of the term "microprocessor", it is largely undisputed that the first single-chip microprocessor was the Intel 4004, designed and realized by Ted Hoff, Federico Faggin, and Stanley Mazor at Intel.

The Evolution of the Computer

1. First Generation (1939-1954) - vacuum tube
2. Second Generation Computers (1954-1959) - transistor
3. Third Generation Computers (1959-1971) - IC
4. Fourth Generation (1971-Present) - microprocessor
5. Fifth Generation (Present and Beyond)

1. First Generation (1939-1954) - vacuum tube

1937 - John V. Atanasoff designed the first digital electronic computer.
1939 - Atanasoff and Clifford Berry demonstrated the ABC prototype in November.
1941 - Konrad Zuse in Germany developed the Z3 in secret.
1943 - In Britain, the Colossus was designed in secret at Bletchley Park to decode German messages.
1944 - Howard Aiken developed the Harvard Mark I mechanical computer for the Navy.
1945 - John W. Mauchly and J. Presper Eckert built ENIAC at the University of Pennsylvania for the U.S. Army.

1946 - Mauchly and Eckert started Electronic Control Co. and received a grant from the National Bureau of Standards to build an ENIAC-type computer with magnetic tape input/output, renamed UNIVAC in 1947; they ran out of money and in Dec. 1947 formed the new company Eckert-Mauchly Computer Corporation (EMCC).

1948 - Howard Aiken developed the Harvard Mark III electronic computer with 5,000 tubes.
1948 - The University of Manchester in Britain developed the SSEM "Baby" electronic computer with CRT memory.

1949 - Mauchly and Eckert in March successfully tested the BINAC stored-program computer for Northrop Aircraft, with mercury delay line memory and a primitive magnetic tape drive; Remington Rand bought EMCC in Feb. 1950 and provided funds to finish UNIVAC.

1950 - Commander William C. Norris led Engineering Research Associates to develop the Atlas, based on the secret code-breaking computers used by the Navy in WWII; the Atlas was 38 feet long, 20 feet wide, and used 2,700 vacuum tubes.

1951 - S. A. Lebedev developed the MESM computer in Russia

1951 - Remington Rand successfully tested UNIVAC on March 30, 1951, and announced its sale to the Census Bureau to the public on June 14, 1951; it was the first commercial computer to feature a magnetic tape storage system, the eight UNISERVO tape drives that stood separate from the CPU and control console on the other side of a garage-size room. Each tape drive was six feet high and three feet wide, used 1/2-inch metal tape of nickel-plated bronze 1,200 feet long, and recorded data on eight channels at 100 inches per second with a transfer rate of 7,200 characters per second. The complete UNIVAC system weighed 29,000 pounds and included 5,200 vacuum tubes and an offline typewriter-printer, the UNIPRINTER, with an attached metal tape drive. Later, a punched card-to-tape machine was added to read IBM 80-column and Remington Rand 90-column cards.

1952 - Remington Rand bought ERA in Dec. 1951 and merged it with the UNIVAC product line in 1952: the ERA 1101 computer became the UNIVAC 1101. The UNIVAC I was used in November to calculate the presidential election returns and successfully predict the winner, although the prediction was not trusted by the TV networks, who refused to use it.

1954 - The SAGE aircraft-warning system was the largest vacuum tube computer system ever built. It began in 1954 at MIT's Lincoln Lab with funding from the Air Force. The first of 23 Direction Centers went online in Nov. 1956, and the last in 1962. Each Center had two 55,000-tube computers built by IBM, MIT and Bell Labs. The 275-ton computers, known as "Clyde", were based on Jay Forrester's Whirlwind I and had magnetic core memory, magnetic drum and magnetic tape storage. The Centers were connected by an early network, and pioneered development of the modem and graphics display.

2. Second Generation Computers (1954-1959) - transistor

1950 - The National Bureau of Standards (NBS) introduced its Standards Eastern Automatic Computer (SEAC) with 10,000 newly developed germanium diodes in its logic circuits, and the first magnetic disk drive, designed by Jacob Rabinow.

1953 - Tom Watson, Jr., led IBM to introduce the model 604 computer, its first with transistors, which became the basis of the model 608 of 1957, the first solid-state computer for the commercial market. Transistors were expensive at first, costing $8 vs. $0.75 for a vacuum tube. But Watson was impressed with the new transistor radios and gave them to his engineers to study. IBM also developed the 650 Magnetic Drum Calculator, the first IBM machine to use magnetic drum memory rather than punched cards, and began shipment of the 701 scientific "Defense Calculator", the first of the Model 700 line that dominated mainframe computers for the next decade.

1955 - IBM introduced the 702 business computer; Watson appeared on the cover of Time magazine on March 28.

1956 - The Bendix G-15, a small business computer, sold for only $45,000; it was designed by Harry Huskey of NBS.

1959 - General Electric Corporation delivered its Electronic Recording Machine Accounting (ERMA) computing system to the Bank of America in California; based on a design by SRI, the ERMA system employed Magnetic Ink Character Recognition (MICR) to capture data from checks, and introduced automation in banking that continued with ATMs in 1974.

3. Third Generation Computers (1959-1971) - IC

1959 - Jack Kilby of Texas Instruments patented the first integrated circuit in Feb. 1959; Kilby had made his first germanium IC in Oct. 1958.

Robert Noyce at Fairchild used the planar process to make connections of components within a silicon IC in early 1959. The first commercial product using an IC was a hearing aid, in Dec. 1963; General Instrument made an LSI chip (100+ components) for Hammond organs in 1968.

1964 - IBM produced SABRE, the first airline reservation tracking system, for American Airlines; IBM announced the System/360 all-purpose computer, using an 8-bit character word length (a "byte") that was pioneered in the 7030 of April 1961, which grew out of the Air Force contract of Oct. 1958, following Sputnik, to develop transistor computers for BMEWS.

1968 - DEC introduced the first "mini-computer", the PDP-8, named after the mini-skirt; DEC was founded in 1957 by Kenneth H. Olsen, who came from the SAGE project at MIT and began sales of the PDP-1 in 1960.

1969 - Development began on ARPAnet, funded by the DOD

1971 - Intel produced large scale integrated (LSI) circuits that were used in the digital delay line, the first digital audio device

4. Fourth Generation (1971-Present) – microprocessor

1971 - Gilbert Hyatt at Micro Computer Co. patented the microprocessor; Ted Hoff at Intel in February introduced the 4-bit 4004, a chip of 2,300 transistors, for the Japanese company Busicom, to create a single chip for a calculator; IBM introduced the first 8-inch "memory disk", as it was called then, later the "floppy disk"; Hoffmann-La Roche patented the passive LCD display for calculators and watches; in November Intel announced the first microcomputer, the MCS-4; Nolan Bushnell designed the first commercial arcade video game, "Computer Space".

1972 - Intel made the 8-bit 8008 and 8080 microprocessors; Gary Kildall wrote his Control Program/Microprocessor (CP/M) disk operating system to provide instructions for floppy disk drives to work with the 8080 processor. He offered it to Intel but was turned down, so he sold it on his own, and soon CP/M was the standard operating system for 8-bit microcomputers; Bushnell created Atari and introduced the successful "Pong" game.

1973 - IBM developed the first true sealed hard disk drive, called the "Winchester" after the rifle company, using two 30 MB platters; Robert Metcalfe at Xerox PARC created Ethernet as the basis for a local area network, and later founded 3COM.

1974 - Xerox developed the Alto workstation at PARC, with a monitor, a graphical user interface, a mouse, and an Ethernet card for networking.

1975 - The Altair personal computer was sold in kit form, and influenced Steve Jobs and Steve Wozniak.

1976 - Jobs and Wozniak developed the Apple personal computer; Alan Shugart introduced the 5.25-inch floppy disk

1977 - Nintendo in Japan began to make computer games that stored the data on chips inside a game cartridge that sold for around $40 but cost only a few dollars to manufacture. It introduced its most popular game, "Donkey Kong", in 1981, and Super Mario Bros in 1985.
1978 - VisiCalc spreadsheet software was written by Daniel Bricklin and Bob Frankston.
1979 - MicroPro released WordStar, which set the standard for word processing software.

1980 - IBM signed a contract with the Microsoft Co. of Bill Gates and Paul Allen and Steve Ballmer to supply an operating system for IBM's new PC model. Microsoft paid $25,000 to Seattle Computer for the rights to QDOS that became Microsoft DOS, and Microsoft began its climb to become the dominant computer company in the world.

1984 - Apple Computer introduced the Macintosh personal computer January 24.

1987 - Bill Atkinson of Apple Computer created a software program called HyperCard that was bundled free with all Macintosh computers. This program for the first time made hypertext popular and usable to a wide number of people. Ted Nelson had coined the terms "hypertext" and "hypermedia" in 1965, based on the pre-computer ideas of Vannevar Bush published in his "As We May Think" article in the July 1945 issue of The Atlantic Monthly.

5. Fifth Generation (Present and Beyond)

1991 - World-Wide Web (WWW) was developed by Tim Berners-Lee and released by CERN.

1993 - The first Web browser, called Mosaic, was created by student Marc Andreessen and programmer Eric Bina at NCSA in the first three months of 1993. The beta version 0.5 of X Mosaic for UNIX was released on Jan. 23, 1993 and was an instant success. The PC and Mac versions of Mosaic followed quickly in 1993.

Mosaic was the first software to interpret the new IMG tag and to display graphics along with text. Berners-Lee objected to the IMG tag, considering it frivolous, but image display became one of the most used features of the Web. The Web grew fast because the infrastructure was already in place: the Internet, desktop PCs, and home modems connected to online services such as AOL and CompuServe.

1994 - Netscape Navigator 1.0 was released in Dec. 1994 and was given away free, soon gaining 75% of the world browser market.

1996 - Microsoft had failed to recognize the importance of the Web, but finally released the much-improved Internet Explorer 3.0 browser in the summer.

CHAPTER-3 RESEARCH METHODOLOGY

Research methodology is the process used to collect information and data for the purpose of making business decisions. The methodology may include publication research, interviews, surveys and other research techniques, and could include both present and historical information.

Research methodology is a way to find the result of a given problem on a specific matter or problem, which is also referred to as the research problem. In methodology, the researcher uses different criteria for solving or investigating the given research problem. Different sources use different types of methods for solving the problem. If we think about the word "methodology", it is the way of searching for or solving the research problem.

In research methodology, the researcher always tries to investigate the given question systematically in his or her own way and to find all the answers up to the conclusion. If the researcher does not work systematically on the problem, there is less possibility of finding the final result. In finding or exploring research questions, a researcher faces many problems, which can be effectively resolved by using the correct research methodology.

OBJECTIVES OF THE STUDY

• To gather knowledge about how much computers have evolved from the 20th century till today.
• To learn about the programming codes and programming languages used in computers.
• To know the advantages and disadvantages of computers in our lives.
• To understand how computers play an important role in the growth of society.

SCOPE OF THE STUDY

This study comprises in-depth coverage of the history of computers and the evolution of computer technology through various generations (Gen-1 to Gen-5). A little bit of programming languages and programming codes has also been covered. However, this study is limited to the extent that it only covers the history of computers, the evolution of computers through various generations, and a little bit of programming languages and codes. This study does not focus on the wide-spectrum impact which computers have on our social lives and businesses. There is no doubt in saying that technology is changing our way of living, but side by side it has some side effects, or we can say bad effects, especially on our children and society, and this can't be neglected.

DATA COLLECTION

The various types of research design used in the collection of data are:

1. Exploratory Research Design
2. Descriptive Research Design
3. Diagnostic Research Design
4. Experimental Research Design

This is an exploratory study: the type of research design used in this study is the exploratory research design.

Depending on the source, statistical data are classified under two categories:

Primary Data: Primary data are obtained by a study specifically designed to fulfill the data needs of the problem at hand. Such data are original in character and are generated in large numbers by surveys conducted with a sample.

Secondary Data: These data are not originally collected but rather obtained from published or unpublished sources. The source of data collection used in this study is secondary data.

The various sources which I used for collecting data for this study include:

• The use of the internet
• Books
• Newspapers
• Magazines

These sources helped me a lot in gathering knowledge about computers, which enabled me to complete my project on the topic "Evolution of Computers" successfully.

Limitations of the Study

• This report is related only to the evolution of computers.
• It is a one-sided study, covering basically how computers emerged, developed and evolved from the 20th century till today, and what their scope is in the future.
• This study does not focus on the impact of computers on society: how much computers benefit society and what their harmful side is.

CHAPTER-4 DATA ANALYSIS AND INTERPRETATION

COMPUTER INTRODUCTION

An amazing machine! We are living in the computer age today and most of our day-to-day activities cannot be accomplished without using computers. Sometimes knowingly and sometimes unknowingly we use computers. The computer has become an indispensable and multipurpose tool. We are breathing in the computer age, and gradually the computer has become such a necessity of life that it is difficult to imagine life without it.

DEFINITION

For most people, the computer is a machine used for calculation or computation, but actually it is much more than that. Precisely, "A computer is an electronic device for performing arithmetic and logical operations," or, "A computer is a device or a flexible machine which processes data and converts it into information." To understand the complete process of how a computer works, we will have to come across various terms such as Data, Processing and Information. First of all, we have to understand these terms in the true sense.

DATA

'Data' is nothing but a mere collection of basic facts and figures without any sequence. When the data is collected as facts and figures, it has no meaning at that time; for example, names of students, names of employees, etc.

PROCESSING

'Processing' is the set of instructions given by the user to operate on the related data and output meaningful information which can be used by the user. The work of processing may be calculation, comparison or decision-taking by the computer.

INFORMATION

'Information' is the end point, or the final output, of any processed work. When the output data is meaningful, it is called information.
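A tiny worked example (with made-up student marks) may make the data-processing-information chain above concrete:

```python
# Data -> Processing -> Information, using hypothetical student marks.

data = [("Amit", 72), ("Priya", 85), ("Rahul", 64)]  # raw facts and figures

# Processing: instructions applied to the data (here, an average).
average = sum(marks for _, marks in data) / len(data)

# Information: the meaningful output of the processing.
print(f"Class average: {average:.1f}")  # Class average: 73.7
```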

DEVELOPMENT OF COMPUTER

Actually speaking, electronic data processing does not go back more than about half a century, i.e. computers have been in existence merely from the early 1940s. In the early days, when our ancestors used to reside in caves, counting was a problem. When they started using stones to count their animals or possessions, they never knew that this would one day lead to the computer. People then started following a set of procedures to perform calculations with these stones, which later led to the creation of a digital counting device, the predecessor of the first calculating device invented, known as the ABACUS.

1. THE ABACUS

The abacus is known to be the first mechanical calculating device, which was used to perform addition and subtraction easily and speedily. This device was first developed by the Egyptians in the 10th century B.C., but it was given its final shape in the 12th century A.D. by Chinese educationists. The abacus is made up of a wooden frame in which rods are fitted across, with round beads sliding on the rods. It is divided into two parts called 'Heaven' and 'Earth': Heaven is the upper part and Earth the lower one. Thus any number can be represented by placing the beads in the proper places.

2. NAPIER'S BONES

As necessity demanded, scientists started inventing better calculating devices. In this process, John Napier of Scotland invented a calculating device in the year 1617 called Napier's Bones. In the device, Napier used bone rods for the counting purpose, with some numbers printed on these rods. Using these rods, one can do addition, subtraction, multiplication and division easily.
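The working of Napier's Bones can be sketched in code. The version below is a simplification (the physical rods are read along diagonals; here each rod is just a table of multiples), intended only to show how the device reduced multiplication to table lookups and additions.

```python
# Each "rod" holds the multiples of one digit; multiplying a number by
# a single digit becomes looking up rods and summing with place value.

rods = {d: [d * m for m in range(10)] for d in range(10)}

def napier_multiply(number, digit):
    total, place = 0, 1
    for d in map(int, reversed(str(number))):
        total += rods[d][digit] * place  # read the rod for this digit
        place *= 10
    return total

print(napier_multiply(425, 6))  # 2550
```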

3. PASCAL'S CALCULATOR

In the year 1642, Blaise Pascal, a French scientist, invented an adding machine called Pascal's calculator, which represents the position of digits with the help of gears in it.

4. LEIBNIZ CALCULATOR

In the year 1671, a German mathematician, Gottfried Leibniz, modified Pascal's calculator and developed a machine which could perform various calculations based on multiplication and division as well.

5. ANALYTICAL ENGINE

In the year 1833, a scientist from England known as Charles Babbage invented such a machine as could keep our data safely. This device was called the Analytical Engine, and it is deemed the first mechanical computer. It included such features as are used in today's computers. For this great invention of the computer, Charles Babbage is also known as the father of the computer.

GENERATION OF COMPUTER

As time passed, a more suitable and reliable machine was needed which could perform our work more quickly. During this time, in the year 1946, the first successful electronic computer, called ENIAC, was developed, and it was the starting point of the current generations of computers.

1) FIRST GENERATION

ENIAC was the world's first successful electronic computer, developed by the two scientists J. P. Eckert and J. W. Mauchly. It was the beginning of first generation computers. ENIAC stands for "Electronic Numerical Integrator and Computer". ENIAC was a very huge computer and its weight was 30 tons. It could store only a limited or small amount of information. Initially, in the first generation of computers, the concept of vacuum tubes was used. A vacuum tube was an electronic component which had very low work efficiency, so it could not work properly, and it required a large cooling system.

2) SECOND GENERATION

As development moved further, second generation computers knocked on the door. In this generation, transistors were used as the electronic component instead of vacuum tubes. A transistor is much smaller in size than a vacuum tube. As the size of electronic components decreased from the vacuum tube to the transistor, the size of the computer also decreased, and it became much smaller than the earlier computers.

3) THIRD GENERATION

Third generation computers were introduced in the year 1964. In this generation of computers, ICs (integrated circuits) were used as the electronic component. The development of the IC gave birth to the new field of microelectronics. The main advantage of the IC is not only its small size but also its superior performance and reliability compared with the previous circuits. The IC was first developed by Jack Kilby. This generation of computers had huge storage capacity and higher calculating speed.

4) FOURTH GENERATION

This is the generation in which we are working today. The computers which we see around us belong to the fourth generation. The 'microprocessor' is the main concept behind this generation of computers. A microprocessor is a single chip (an LSI circuit) which is used in a computer for any arithmetical or logical functions to be performed in any program. The honor of developing the microprocessor goes to Ted Hoff of the U.S.A. He developed the first microprocessor, the Intel 4004, while he was working for Intel Corporation. With the use of the microprocessor in fourth generation computers, computers became very fast and efficient.

It is evident that the next generation of computers, i.e. the fifth generation, will be developed soon. In that generation, computers will possess artificial intelligence and will be able to take decisions by themselves like a human being.

SCOPE OF COMPUTER

Certain characteristics of computer interaction can make computers well suited for distance learning. The features listed below make the prospect of computer use look more promising:

• Access to expert and respected peers.
• One-to-one and group communication.
• Active learner participation.
• Linking of new learning to concrete on-the-job problems.
• Follow-up, feedback and implementation support from peers or experts.
• Self-directed control over the stop or start, time, pace and place of learning or communication activity.

USES OF A COMPUTER Computers are used in all spheres of human life. They have revolutionized all phases of human activity. The most important uses are given as follows: a. Routine job handling Routine clerical and stenotype jobs: calculating and formatting bills, salaries, updating stocks, tax returns, reservation records and information.

b. Traffic control Computers control traffic and traffic lights. Television cameras are used to maintain the traffic light routine.

c. Electronic money Automatic teller machines (ATMs) are very common in banks. You can deposit and withdraw money with an ATM.

d. Electronic office All types of information are stored, manipulated and utilized in electronic form. A document can be sent to different places by FAX, internet and e-mail.

e. Industrial Application The computer plays an important role in production control. It is bringing efficiency to trade and industry.

f. Telephones With the help of computerized telephony through satellites, STD and ISD services have been introduced. The computer maintains the record of calls and does the billing for you.

g. Trade Computers are used successfully in every type of trade. They are used in banks and stock exchanges to control stocks and accounts.

h. Scientific research In every science, research work becomes economical from the point of view of time, energy and money, since large amounts of data are analyzed very quickly.

i. Medicine Computers are widely used in medical science, e.g. ECG, CAT scan and ultrasound. Proper and accurate diagnosis is done with the help of computers, and much medical apparatus is computer controlled.

j. Space Science Satellites in space are controlled with the help of computers, and the information from space satellites is collected using computers.

k. Publication Composing work is done speedily and economically with the help of computers. Designing work is also done by computer, and quality in publication is maintained by computer.

l. Communications The computer is used for sending messages, for example by printer, FAX, e-mail and the Internet. Import and export work is also done on the internet.

m. Film industry Computers have influenced the film industry, for example in animation and titling. A multimedia approach is used in film production with the help of computers, and cartoon films are developed by computers.

n. Education The computer is widely used in the field of education, and an independent field of study, computer science, has developed which is popular these days. At every stage the computer is compulsory. Distance education uses computers for instructional purposes through a multimedia approach. The computer makes the teaching–learning process effective by involving the audio and visual senses of learners.

Real life uses of Computers in Education

Teaching Learning process

Instructions:
• Instructing the students using PowerPoint slides, Word documents or Web pages, and using hyperlinks for better concept clarity.
• Helping to improve the pronunciation of students by using microphones, headphones, speakers, specially prepared software and dedicated websites.
• Video conferencing, chat and email help in better communication, hence better concept clarity. Also, the concept of the E-tutor has given students instant access to teachers and given teachers a better chance to earn.
• The current syllabus can be viewed through the website of the concerned school board, made available to students if the teacher has made a website and uploaded it using the Internet, and updated easily using the web.
• Inspiring students to express their imagination using Paint Brush.
• Encouraging the students to surf web pages and gather relevant, detailed information through web pages.
• Readymade software can give practice material to students.

Learning:
• Collecting notes/pictures/videos from web pages for detailed information and projects/assignments.
• Saving documents as soft copies for future use.
• Learning through animations, as they are much closer to the students.
• E-books/online libraries/online encyclopedias help to guide in minutes and save precious time and resources.
• Creating videos using images and albums for better PowerPoint slides.
• Simulated learning gives them an idea of the real situation.
• Publication of pamphlets/brochures for awareness within the institution and among community members.

Testing and Evaluation process
• Keeping records of students' academic scores.
• Keeping records in relation to personal history.
• Creating question banks for students.
• Using computers for testing by asking questions from the question bank.
• Online testing and evaluation.
• Analysis and interpretation of the data.
• Previous years' question papers and sample papers through web sites.

Guidance purposes
• Referring to collective records of the students, maintained year-wise and stored in computers.
• Testing for aptitude, interest and psychology using computer databases and the internet.

Library
• Documents stored as soft copies for use by students and faculty members.
• Online magazines, journals, brochures and research articles.
• Records of the books maintained using special library software.
• Records of the issues and returns of the books.

School Administration
• Records of students (personal, academic, financial).
• Records of employees of the school.
• Accounts of the institution.
• Decision making process.
• Aid to memory with minimum paper work.
• An eye on current regulations of the government, affiliating school boards and related authorities.
• School canteen billing.
• Fees collection and maintenance of fee records.
• Circulation of instructions/notices and getting them in printed form.
• Preparation of the school magazine.

LANGUAGES OF COMPUTER A language is defined as a medium for the expression of thoughts. All the human beings in this world communicate with each other through a language. Similarly, a computer also needs some medium of expression to communicate with others. A computer follows the instructions given by the programmer to perform a specific job. To perform a particular task, the programmer prepares a sequence of instructions, known as a program. A program written for a computer is known as software. The program is stored in RAM. The CPU takes one instruction of the program at a time from RAM and executes it. The instructions are executed one by one in sequence and finally produce the desired result. The journey of computer software from machine language to high level languages to modern 4GL/5GL languages is an interesting one. Let us talk about this in detail.

FIRST GENERATION LANGUAGES 1GLs (Machine language) When human beings started programming the computer, the instructions were given to it in a language that it could easily understand, and that language was machine language. The binary language, a language of 1s and 0s, is known as machine language. Any instruction in this language is given in the form of a string of 1s and 0s, where the symbol 1 stands for the presence of an electrical pulse and 0 stands for the absence of an electrical pulse. A set of 1s and 0s such as 11101101 has a specific meaning to a computer, even though it appears as a mere binary number to us. Writing a program in machine language is very cumbersome and complicated, and this was accomplished by experts only. All the instructions and input data are fed to the computer in numeric form, specifically a binary form.
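To make this idea concrete, the short Python sketch below treats an 8-bit string as a machine instruction. The 3-bit opcodes and their meanings are invented purely for illustration; every real machine defines its own binary encoding.

    # A purely illustrative "machine language" decoder. The 3-bit opcode
    # table below is invented for this example; every real CPU defines
    # its own binary encoding.
    OPCODES = {
        "000": "LOAD",   # load a value into the accumulator
        "001": "ADD",    # add a value to the accumulator
        "010": "STORE",  # store the accumulator into memory
        "011": "HALT",   # stop execution
    }

    def decode(instruction: str) -> str:
        """Split an 8-bit string into a 3-bit opcode and a 5-bit operand."""
        opcode, operand = instruction[:3], instruction[3:]
        return f"{OPCODES[opcode]} {int(operand, 2)}"

    # "00100101" means little to us at a glance, but under this
    # encoding it is the instruction ADD 5.
    print(decode("00100101"))  # -> ADD 5

This shows why machine language was left to experts: the same eight symbols carry a precise meaning to the machine but very little to the human reader.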

SECOND GENERATION LANGUAGES 2GLs (Assembly Language) Many efforts were made during the last 50 years to overcome the difficulties faced in using machine language. The first language similar to English was developed in 1950, which was known as Assembly Language or Symbolic Programming Language. After 1960, the High Level Languages were developed, which brought the common man closer to the computer. And this was the main reason for the tremendous growth in the computer industry. The high level languages are also known as Procedure Oriented Languages.

THIRD GENERATION LANGUAGES (3GLs) (High Level Languages) Assembly language was easier to use compared with machine language, as it relieved the programmer of the burden of remembering the operation codes and addresses of memory locations. Even though the assembly languages proved to be a great help to the programmer, the search continued for still better languages nearer to conventional English. The languages developed around 1960 which were nearer to the English language, for the purpose of writing programs, were known as High Level Languages. The different high level languages which can be used by the common user are FORTRAN, COBOL, BASIC, PASCAL, PL/1 and many others. Each high level language was developed to fulfill some basic requirements for a particular type of problem. But further developments were made in each language to widen its utility for different purposes.

FOURTH GENERATION LANGUAGES (4GLs) The 3GLs are procedural in nature, i.e., the HOW of the problem gets coded: the procedures require the knowledge of how the problem will be solved. Contrary to them, 4GLs are non-procedural: only the WHAT of the problem is coded, i.e., only 'what is required' is specified, and the rest gets done on its own. Thus a big program in a 3GL may be replaced by a single statement in a 4GL. The main aim of 4GLs is to cut down development and maintenance time and make programming easier for users.
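The difference between coding the HOW and stating the WHAT can be shown in miniature. In the Python sketch below (an analogy, not an actual 4GL), the first version spells out the procedure step by step in the 3GL style, while the second simply states what is required:

    # 3GL style: the programmer codes HOW -- every step of the procedure.
    total = 0
    for n in range(1, 1001):
        total += n

    # 4GL spirit: the programmer states WHAT is required and the
    # system works out the procedure on its own.
    total_declarative = sum(range(1, 1001))

    assert total == total_declarative == 500500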

GUI BASED LANGUAGES With the invention and popularity of GUI based interfaces, GUI based languages evolved. GUI based languages include: 1. TCL/Tk 2. Visual Basic 3. Visual C++ 4. C# (pronounced as C sharp) 5. Visual Basic .NET 6. Visual Basic 2005

Early computer characteristics

Defining characteristics of some early digital computers of the 1940s (in the history of computing hardware):

Name | First operational | Numeral system | Computing mechanism | Programming | Turing complete
Zuse Z3 (Germany) | May 1941 | Binary floating point | Electro-mechanical | Program-controlled by punched 35 mm film stock (but no conditional branch) | In theory (1998)
Atanasoff–Berry Computer (US) | 1942 | Binary | Electronic | Not programmable—single purpose | No
Colossus Mark 1 (UK) | February 1944 | Binary | Electronic | Program-controlled by patch cables | No
Harvard Mark I – IBM ASCC (US) | May 1944 | Decimal | Electro-mechanical | Program-controlled by 24-channel punched paper tape (but no conditional branch) | Debatable
Colossus Mark 2 (UK) | June 1944 | Binary | Electronic | Program-controlled by patch cables and switches | In theory (2011)
Zuse Z4 (Germany) | March 1945 | Binary floating point | Electro-mechanical | Program-controlled by punched 35 mm film stock | Yes
ENIAC (US) | July 1946 | Decimal | Electronic | Program-controlled by patch cables and switches | Yes
Manchester Small-Scale Experimental Machine (Baby) (UK) | June 1948 | Binary | Electronic | Stored-program in Williams cathode ray tube memory | Yes
Modified ENIAC (US) | September 1948 | Decimal | Electronic | Read-only stored programming mechanism using the Function Tables as program ROM | Yes
Manchester Mark 1 (UK) | April 1949 | Binary | Electronic | Stored-program in Williams cathode ray tube memory and magnetic drum memory | Yes
EDSAC (UK) | May 1949 | Binary | Electronic | Stored-program in mercury delay line memory | Yes
CSIRAC (Australia) | November 1949 | Binary | Electronic | Stored-program in mercury delay line memory | Yes

Programs The defining feature of modern computers which distinguishes them from all other machines is that they can be programmed. That is to say that some type of instructions (the program) can be given to the computer and it will process them. Modern computers based on the von Neumann architecture often have machine code in the form of an imperative programming language.

In practical terms, a computer program may be just a few instructions or extend to many millions of instructions, as do the programs for word processors and web browsers for example. A typical modern computer can execute billions of instructions per second (gigaflops) and rarely makes a mistake over many years of operation. Large computer programs consisting of several million instructions may take teams of programmers years to write, and due to the complexity of the task almost certainly contain errors.

Stored program architecture

Replica of the Small-Scale Experimental Machine (SSEM), the world's first stored-program computer, at the Museum of Science and Industry in Manchester, England

This section applies to most common RAM machine-based computers. In most cases, computer instructions are simple: add one number to another, move some data from one location to another, send a message to some external device, etc. These instructions are read from the computer's memory and are generally carried out (executed) in the order they were given. However, there are usually specialized instructions to tell the computer to jump ahead or backwards to some other place in the program and to carry on executing from there. These are called "jump" instructions (or branches). Furthermore, jump instructions may be made to happen conditionally so that different sequences of instructions may be used depending on the result of some previous calculation or some external event. Many computers directly support subroutines by providing a type of jump that "remembers" the location it jumped from and another instruction to return to the instruction following that jump instruction.

Program execution might be likened to reading a book. While a person will normally read each word and line in sequence, they may at times jump back to an earlier place in the text or skip sections that are not of interest. Similarly, a computer may sometimes go back and repeat the instructions in some section of the program over and over again until some internal condition is met. This is called the flow of control within the program and it is what allows the computer to perform tasks repeatedly without human intervention.

Comparatively, a person using a pocket calculator can perform a basic arithmetic operation such as adding two numbers with just a few button presses. But to add together all of the numbers from 1 to 1,000 would take thousands of button presses and a lot of time, with a near certainty of making a mistake. On the other hand, a computer may be programmed to do this with just a few simple instructions. For example:

            mov #0, sum      ; set sum to 0
            mov #1, num      ; set num to 1
    loop:   add num, sum     ; add num to sum
            add #1, num      ; add 1 to num
            cmp num, #1000   ; compare num to 1000
            ble loop         ; if num <= 1000, go back to 'loop'
            halt             ; end of program. stop running

Once told to run this program, the computer will perform the repetitive addition task without further human intervention. It will almost never make a mistake and a modern PC can complete the task in about a millionth of a second.
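For comparison, the same repetitive addition written in a high-level language takes only a few lines. The Python sketch below mirrors the assembly program above step for step:

    # High-level equivalent of the assembly program above.
    sum_ = 0                # mov #0, sum
    num = 1                 # mov #1, num
    while num <= 1000:      # cmp num, #1000 / ble loop
        sum_ += num         # add num, sum
        num += 1            # add #1, num
    print(sum_)             # -> 500500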

Machine code In most computers, individual instructions are stored as machine code with each instruction being given a unique number (its operation code or opcode for short). The command to add two numbers together would have one opcode; the command to multiply them would have a different opcode, and so on. The simplest computers are able to perform any of a handful of different instructions; the more complex computers have several hundred to choose from, each with a unique numerical code. Since the computer's memory is able to store numbers, it can also store the instruction codes. This leads to the important fact that entire programs (which are just lists of these instructions) can be represented as lists of numbers and can themselves be manipulated inside the computer in the same way as numeric data. The fundamental concept of storing programs in the computer's memory alongside the data they operate on is the crux of the von Neumann, or stored program, architecture. In some cases, a computer might store some or all of its program in memory that is kept separate from the data it operates on. This is called the Harvard architecture after the Harvard Mark I computer. Modern von Neumann computers display some traits of the Harvard architecture in their designs, such as in CPU caches.

While it is possible to write computer programs as long lists of numbers (machine language), and while this technique was used with many early computers, it is extremely tedious and potentially error-prone to do so in practice, especially for complicated programs. Instead, each basic instruction can be given a short name that is indicative of its function and easy to remember – a mnemonic such as ADD, SUB, MULT or JUMP. These mnemonics are collectively known as a computer's assembly language. Converting programs written in assembly language into something the computer can actually understand (machine language) is usually done by a computer program called an assembler.
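A minimal sketch of what an assembler does is given below. The mnemonics and the numeric opcodes assigned to them are invented for illustration, but the principle, translating human-readable names into the numbers the machine actually stores, is the same one real assemblers follow:

    # A toy assembler: it translates mnemonics into numeric opcodes.
    # The opcode numbers here are invented for illustration only.
    OPCODES = {"ADD": 0x01, "SUB": 0x02, "MULT": 0x03, "JUMP": 0x04}

    def assemble(source: str) -> list[int]:
        """Turn lines like 'ADD 7' into [opcode, operand] number pairs."""
        machine_code = []
        for line in source.strip().splitlines():
            mnemonic, operand = line.split()
            machine_code += [OPCODES[mnemonic], int(operand)]
        return machine_code

    print(assemble("ADD 7\nMULT 3"))  # -> [1, 7, 3, 3]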

A 1970s punched card containing one line from a FORTRAN program. The card reads: “Z (1) = Y + W (1)” and is labeled “PROJ039” for identification purposes.

Program design Program design of small programs is relatively simple and involves the analysis of the problem, collection of inputs, using the programming constructs within languages, devising or using established procedures and algorithms, providing data for output devices and solutions to the problem as applicable. As problems become larger and more complex, features such as subprograms, modules, formal documentation, and new paradigms such as object-oriented programming are encountered. Large programs involving thousands of lines of code and more require formal software methodologies. The task of developing large software systems presents a significant intellectual challenge. Producing software with an acceptably high reliability within a predictable schedule and budget has historically been difficult; the academic and professional discipline of software engineering concentrates specifically on this challenge.

Bugs

The actual first computer bug: a moth found trapped on a relay of the Harvard Mark II computer

Errors in computer programs are called “bugs.” They may be benign and not affect the usefulness of the program, or have only subtle effects. But in some cases, they may cause the program or the entire system to “hang,” becoming unresponsive to input such as mouse clicks or keystrokes, to completely fail, or to crash. Otherwise benign bugs may sometimes be harnessed for malicious intent by an unscrupulous user writing an exploit, code designed to take advantage of a bug and disrupt a computer's proper execution. Bugs are usually not the fault of the computer. Since computers merely execute the instructions they are given, bugs are nearly always the result of programmer error or an oversight made in the program's design. Admiral Grace Hopper, an American computer scientist and developer of the first compiler, is credited for having first used the term “bugs” in computing after a dead moth was found shorting a relay in the Harvard Mark II computer in September 1947.

Components

Video demonstrating the standard components of a "slimline" computer

A general purpose computer has four main components: the arithmetic logic unit (ALU), the control unit, the memory, and the input and output devices (collectively termed I/O). These parts are interconnected by buses, often made of groups of wires. Inside each of these parts are thousands to trillions of small electrical circuits which can be turned off or on by means of an electronic switch. Each circuit represents a bit (binary digit) of information so that when the circuit is on it represents a “1”, and when off it represents a “0” (in positive logic representation). The circuits are arranged in logic gates so that one or more of the circuits may control the state of one or more of the other circuits.

The control unit, ALU, and registers are collectively known as a central processing unit (CPU). Early CPUs were composed of many separate components but since the mid-1970s CPUs have typically been constructed on a single integrated circuit called a microprocessor.

Control unit

Diagram showing how a particular MIPS architecture instruction would be decoded by the control system

The control unit (often called a control system or central controller) manages the computer's various components; it reads and interprets (decodes) the program instructions, transforming them into control signals that activate other parts of the computer. Control systems in advanced computers may change the order of execution of some instructions to improve performance. A key component common to all CPUs is the program counter, a special memory cell (a register) that keeps track of which location in memory the next instruction is to be read from.

The control system's function is as follows (note that this is a simplified description, and some of these steps may be performed concurrently or in a different order depending on the type of CPU):

1. Read the code for the next instruction from the cell indicated by the program counter.
2. Decode the numerical code for the instruction into a set of commands or signals for each of the other systems.
3. Increment the program counter so it points to the next instruction.
4. Read whatever data the instruction requires from cells in memory (or perhaps from an input device). The location of this required data is typically stored within the instruction code.
5. Provide the necessary data to an ALU or register.
6. If the instruction requires an ALU or specialized hardware to complete, instruct the hardware to perform the requested operation.
7. Write the result from the ALU back to a memory location or to a register or perhaps an output device.
8. Jump back to step (1).

Since the program counter is (conceptually) just another set of memory cells, it can be changed by calculations done in the ALU. Adding 100 to the program counter would cause the next instruction to be read from a place 100 locations further down the program. Instructions that modify the program counter are often known as “jumps” and allow for loops (instructions that are repeated by the computer) and often conditional instruction execution (both examples of control flow). The sequence of operations that the control unit goes through to process an instruction is in itself like a short computer program, and indeed, in some more complex CPU designs, there is another yet smaller computer called a microsequencer, which runs a microcode program that causes all of these events to happen.
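The cycle just described can be sketched in a few lines of Python. The three-instruction machine below is hypothetical, but the loop follows the steps listed above: read the cell the program counter points at, advance the counter, then decode and execute:

    # A toy fetch-decode-execute loop. Instructions are (opcode, operand)
    # pairs held in memory; this three-instruction machine is hypothetical.
    memory = [("LOAD", 7), ("ADD", 5), ("HALT", 0)]
    accumulator = 0
    program_counter = 0

    while True:
        opcode, operand = memory[program_counter]  # step 1: fetch
        program_counter += 1                       # step 3: advance counter
        if opcode == "LOAD":                       # steps 2/6: decode, execute
            accumulator = operand
        elif opcode == "ADD":
            accumulator += operand
        elif opcode == "HALT":
            break

    print(accumulator)  # -> 12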

Arithmetic logic unit (ALU) The ALU is capable of performing two classes of operations: arithmetic and logic. The set of arithmetic operations that a particular ALU supports may be limited to addition and subtraction, or might include multiplication, division, trigonometry functions such as sine, cosine, etc., and square roots. Some can only operate on whole numbers (integers) whilst others use floating point to represent real numbers, albeit with limited precision. However, any computer that is capable of performing just the simplest operations can be programmed to break down the more complex operations into simple steps that it can perform. Therefore, any computer can be programmed to perform any arithmetic operation—although it will take more time to do so if its ALU does not directly support the operation. An ALU may also compare numbers and return boolean truth values (true or false) depending on whether one is equal to, greater than or less than the other (“is 64 greater than 65?”).

Logic operations involve Boolean logic: AND, OR, XOR and NOT. These can be useful for creating complicated conditional statements and processing boolean logic. Superscalar computers may contain multiple ALUs, allowing them to process several instructions simultaneously. Graphics processors and computers with SIMD and MIMD features often contain ALUs that can perform arithmetic on vectors and matrices.

Memory

Magnetic core memory was the computer memory of choice throughout the 1960s, until it was replaced by semiconductor memory.

A computer's memory can be viewed as a list of cells into which numbers can be placed or read. Each cell has a numbered “address” and can store a single number. The computer can be instructed to “put the number 123 into the cell numbered 1357” or to “add the number that is in cell 1357 to the number that is in cell 2468 and put the answer into cell 1595.” The information stored in memory may represent practically anything. Letters, numbers, even computer instructions can be placed into memory with equal ease. Since the CPU does not differentiate between different types of information, it is the software's responsibility to give significance to what the memory sees as nothing but a series of numbers.

In almost all modern computers, each memory cell is set up to store binary numbers in groups of eight bits (called a byte). Each byte is able to represent 256 different numbers (2^8 = 256), either from 0 to 255 or −128 to +127. To store larger numbers, several consecutive bytes may be used (typically two, four or eight). When negative numbers are required, they are usually stored in two's complement notation. Other arrangements are possible, but are usually not seen outside of specialized applications or historical contexts. A computer can store any kind of information in memory if it can be represented numerically. Modern computers have billions or even trillions of bytes of memory.

The CPU contains a special set of memory cells called registers that can be read and written to much more rapidly than the main memory area. There are typically between two and one hundred registers depending on the type of CPU. Registers are used for the most frequently needed data items to avoid having to access main memory every time data is needed. As data is constantly being worked on, reducing the need to access main memory (which is often slow compared to the ALU and control units) greatly increases the computer's speed.

Computer main memory comes in two principal varieties: random-access memory or RAM and read-only memory or ROM. RAM can be read and written to anytime the CPU commands it, but ROM is preloaded with data and software that never changes, so the CPU can only read from it. ROM is typically used to store the computer's initial start-up instructions. In general, the contents of RAM are erased when the power to the computer is turned off, but ROM retains its data indefinitely. In a PC, the ROM contains a specialized program called the BIOS that orchestrates loading the computer's operating system from the hard disk drive into RAM whenever the computer is turned on or reset. In embedded computers, which frequently do not have disk drives, all of the required software may be stored in ROM. Software stored in ROM is often called firmware, because it is notionally more like hardware than software. Flash memory blurs the distinction between ROM and RAM, as it retains its data when turned off but is also rewritable. It is typically much slower than conventional ROM and RAM, however, so its use is restricted to applications where high speed is unnecessary. In more sophisticated computers there may be one or more RAM cache memories, which are slower than registers but faster than main memory. Generally computers with this sort of cache are designed to move frequently needed data into the cache automatically, often without the need for any intervention on the programmer's part.
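The byte arithmetic described above is easy to verify. The short Python sketch below shows how one byte can hold 256 different values and how a negative number is kept in two's complement form:

    # One byte = 8 bits, so it can hold 2**8 = 256 distinct values.
    print(2 ** 8)                          # -> 256

    # Two's complement: -1 stored in a single byte is the bit pattern
    # 11111111, i.e. the unsigned value 255 (256 - 1).
    stored = -1 % 256                      # how the byte actually holds it
    print(stored, format(stored, "08b"))   # -> 255 11111111

    # Reading the byte back as a signed number recovers -1.
    signed = stored - 256 if stored > 127 else stored
    print(signed)                          # -> -1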

Input/output (I/O)

Hard disk drives are common storage devices used with computers.

I/O is the means by which a computer exchanges information with the outside world. Devices that provide input or output to the computer are called peripherals. On a typical personal computer, peripherals include input devices like the keyboard and mouse, and output devices such as the display and printer. Hard disk drives, floppy disk drives and optical disc drives serve as both input and output devices. Computer networking is another form of I/O. I/O devices are often complex computers in their own right, with their own CPU and memory. A graphics processing unit might contain fifty or more tiny computers that perform the calculations necessary to display 3D graphics. Modern desktop computers contain many smaller computers that assist the main CPU in performing I/O.

1. Multitasking While a computer may be viewed as running one gigantic program stored in its main memory, in some systems it is necessary to give the appearance of running several programs simultaneously. This is achieved by multitasking, i.e. having the computer switch rapidly between running each program in turn. One means by which this is done is with a special signal called an interrupt, which can periodically cause the computer to stop executing instructions where it was and do something else instead. By remembering where it was executing prior to the interrupt, the computer can return to that task later. If several programs are running “at the same time,” then the interrupt generator might be causing several hundred interrupts per second, causing a program switch each time. Since modern computers typically execute instructions several orders of magnitude faster than human perception, it may appear that many programs are running at the same time even though only one is ever executing in any given instant. This method of multitasking is sometimes termed “time-sharing” since each program is allocated a “slice” of time in turn. Before the era of cheap computers, the principal use for multitasking was to allow many people to share the same computer. Seemingly, multitasking would cause a computer that is switching between several programs to run more slowly, in direct proportion to the number of programs it is running, but most programs spend much of their time waiting for slow input/output devices to complete their tasks. If a program is waiting for the user to click on the mouse or press a key on the keyboard, then it will not take a “time slice” until the event it is waiting for has occurred. This frees up time for other programs to execute, so that many programs may be run simultaneously without unacceptable speed loss.
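Time-slicing can be imitated with Python generators: each "program" below does a small step of work and then yields control, while a simple loop plays the role of the interrupt that switches between them. This is only a sketch of the idea, not how a real operating system is written:

    # Imitating time-sharing: each "program" is a generator that does a
    # little work, then yields so the next program gets a time slice.
    def program(name, steps):
        for i in range(1, steps + 1):
            yield f"{name}: step {i}"

    # A round-robin "scheduler": keep switching until every program ends.
    running = [program("A", 3), program("B", 2)]
    while running:
        for task in running[:]:            # iterate over a copy
            try:
                print(next(task))          # give this program one slice
            except StopIteration:
                running.remove(task)       # this program has finished
    # The output interleaves A and B, so they appear to run "at once".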

2. Multiprocessing

Cray designed many supercomputers that used multiprocessing heavily.

Some computers are designed to distribute their work across several CPUs in a multiprocessing configuration, a technique once employed only in large and powerful machines such as supercomputers, mainframe computers and servers. Multiprocessor and multi-core (multiple CPUs on a single integrated circuit) personal and laptop computers are now widely available, and are being increasingly used in lower-end markets as a result. Supercomputers in particular often have highly unique architectures that differ significantly from the basic stored-program architecture and from general purpose computers. They often feature thousands of CPUs, customized high-speed interconnects, and specialized computing hardware. Such designs tend to be useful only for specialized tasks due to the large scale of program organization required to successfully utilize most of the available resources at once. Supercomputers usually see usage in large-scale simulation, graphics rendering, and cryptography applications, as well as with other so-called “embarrassingly parallel” tasks.

Computer architecture paradigms

While a single computer can be a useful tool, its benefits are amplified when computers are tied together in a network, which in turn may be tied into the Internet.

There are many types of computer architectures:
• Quantum computer vs Chemical computer
• Scalar processor vs Vector processor
• Non-Uniform Memory Access (NUMA) computers
• Register machine vs Stack machine
• Harvard architecture vs von Neumann architecture
• Cellular architecture

Of all these abstract machines, a quantum computer holds the most promise for revolutionizing computing. Logic gates are a common abstraction which can apply to most of the above digital or analog paradigms. The ability to store and execute lists of instructions called programs makes computers extremely versatile, distinguishing them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a minimum capability (being Turing complete) is, in principle, capable of performing the same tasks that any other computer can perform. Therefore any type of computer (netbook, supercomputer, cellular automaton, etc.) is able to perform the same computational tasks, given enough time and storage capacity.

Misconceptions

Women as computers in the NACA High Speed Flight Station "Computer Room"

A computer does not need to be electronic, nor even have a processor, nor RAM, nor even a hard disk. While popular usage of the word “computer” is synonymous with a personal electronic computer, the modern definition of a computer is literally “A device that computes, especially a programmable [usually] electronic machine that performs high-speed mathematical or logical operations or that assembles, stores, correlates, or otherwise processes information.” Any device which processes information qualifies as a computer, especially if the processing is purposeful. Even a human is a computer, in this sense.

Unconventional computing Historically, computers evolved from mechanical computers and eventually from vacuum tubes to transistors. However, conceptually, computational systems as flexible as a personal computer can be built out of almost anything. For example, a computer can be made out of billiard balls (the billiard ball computer), an often quoted example. More realistically, modern computers are made out of transistors made of photolithographed semiconductors.

Future There is active research to make computers out of many promising new types of technology, such as optical computers, DNA computers, neural computers, and quantum computers. Most computers are universal and are able to calculate any computable function, limited only by their memory capacity and operating speed. However, different designs of computers can give very different performance for particular problems; for example, quantum computers can potentially break some modern encryption algorithms (by quantum factoring) very quickly.

Artificial intelligence A computer will solve problems in exactly the way it is programmed to, without regard to efficiency, alternative solutions, possible shortcuts, or possible errors in the code. Computer programs that learn and adapt are part of the emerging field of artificial intelligence and machine learning.

Hardware The term hardware covers all of those parts of a computer that are tangible objects. Circuits, displays, power supplies, cables, keyboards, printers and mice are all hardware.

History of computing hardware

First generation (mechanical/electromechanical)
  Calculators: Pascal's calculator, Arithmometer, Difference engine, Quevedo's analytical machines
  Programmable devices: Jacquard loom, Analytical engine, IBM ASCC/Harvard Mark I, Harvard Mark II, IBM SSEC, Z1, Z2, Z3

Second generation (vacuum tubes)
  Calculators: Atanasoff–Berry Computer, IBM 604, UNIVAC 60, UNIVAC 120
  Programmable devices: Colossus, ENIAC, Manchester Small-Scale Experimental Machine, EDSAC, Manchester Mark 1, Ferranti Pegasus, Ferranti Mercury, CSIRAC, EDVAC, UNIVAC I, IBM 701, IBM 702, IBM 650, Z22

Third generation (discrete transistors and SSI, MSI, LSI integrated circuits)
  Mainframes: IBM 7090, IBM 7080, IBM System/360, BUNCH
  Minicomputer: PDP-8, PDP-11, IBM System/32, IBM System/36

Fourth generation (VLSI integrated circuits)
  Minicomputer: VAX, IBM System i
  4-bit microcomputer: Intel 4004, Intel 4040
  8-bit microcomputer: Intel 8008, Intel 8080, Motorola 6800, Motorola 6809
  16-bit microcomputer: Intel 8088, Zilog Z8000, WDC 65816/65802
  32-bit microcomputer: Intel 80386, Pentium, Motorola 68000, ARM
  64-bit microcomputer: Alpha, MIPS, PA-RISC, PowerPC, SPARC, x86-64, ARMv8-A
  Embedded computer: Intel 8048, Intel 8051
  Personal computer: Desktop computer, Home computer, Laptop computer, Personal digital assistant (PDA), Portable computer, Tablet PC, Wearable computer

Theoretical/experimental: Quantum computer, Chemical computer, DNA computing

Other hardware topics

Peripheral device (input/output)
  Input: Mouse, keyboard, joystick, image scanner, webcam, graphics tablet, microphone
  Output: Monitor, printer, loudspeaker
  Both: Floppy disk drive, hard disk drive, optical disc drive, teleprinter

Computer buses
  Short range: RS-232, SCSI, PCI, USB
  Long range (computer networking): Ethernet, ATM, FDDI

Software Software refers to parts of the computer which do not have a material form, such as programs, data, protocols, etc. When software is stored in hardware that cannot easily be modified (such as BIOS ROM in an IBM PC compatible), it is sometimes called “firmware.”

Operating system
  Unix and BSD: UNIX System V, IBM AIX, HP-UX, Solaris (SunOS), IRIX, List of BSD operating systems
  GNU/Linux: List of Linux distributions, Comparison of Linux distributions
  Microsoft Windows: Windows 95, Windows 98, Windows NT, Windows 2000, Windows Me, Windows XP, Windows Vista, Windows 7, Windows 8
  DOS: 86-DOS (QDOS), IBM PC DOS, MS-DOS, DR-DOS, FreeDOS
  Mac OS: Mac OS classic, Mac OS X
  Embedded and real-time: List of embedded operating systems
  Experimental: Amoeba, Oberon/Bluebottle, Plan 9 from Bell Labs

Library
  Multimedia: DirectX, OpenGL, OpenAL
  Programming library: C standard library, Standard Template Library

Data
  Protocol: TCP/IP, Kermit, FTP, HTTP, SMTP
  File format: HTML, XML, JPEG, MPEG, PNG

User interface
  Graphical user interface (WIMP): Microsoft Windows, GNOME, KDE, QNX Photon, CDE, GEM, Aqua
  Text-based user interface: Command-line interface, Text user interface

Application
  Office suite: Word processing, Desktop publishing, Presentation program, Database management system, Scheduling & time management, Spreadsheet, Accounting software
  Internet access: Browser, E-mail client, Web server, Mail transfer agent, Instant messaging
  Design and manufacturing: Computer-aided design, Computer-aided manufacturing, Plant management, Robotic manufacturing, Supply chain management
  Graphics: Raster graphics editor, Vector graphics editor, 3D modeler, Animation editor, 3D computer graphics, Video editing, Image processing
  Audio: Digital audio editor, Audio playback, Mixing, Audio synthesis, Computer music
  Software engineering: Compiler, Assembler, Interpreter, Debugger, Text editor, Integrated development environment, Software performance analysis, Revision control, Software configuration management
  Educational: Edutainment, Educational game, Serious game, Flight simulator
  Games: Strategy, Arcade, Puzzle, Simulation, First-person shooter, Platform, Massively multiplayer, Interactive fiction
  Misc.: Artificial intelligence, Antivirus software, Malware scanner, Installer/package management systems, File manager

Languages There are thousands of different programming languages—some intended to be general purpose, others useful only for highly specialized applications.

Programming languages

Lists of programming languages: Timeline of programming languages, List of programming languages by category, Generational list of programming languages, List of programming languages, Non-English-based programming languages
Commonly used assembly languages: ARM, MIPS, x86
Commonly used high-level programming languages: Ada, BASIC, C, C++, C#, COBOL, Fortran, Java, Lisp, Pascal, Object Pascal
Commonly used scripting languages: Bourne script, JavaScript, Python, Ruby, PHP, Perl

Professions and organizations As the use of computers has spread throughout society, there is an increasing number of careers involving computers.

Computer-related professions

Hardware-related: Electrical engineering, Electronic engineering, Telecommunications engineering, Computer engineering, Optical engineering, Nanoengineering
Software-related: Computer science, Computer engineering, Desktop publishing, Human–computer interaction, Information technology, Information systems, Computational science, Software engineering, Video game industry, Web design

The need for computers to work well together and to be able to exchange information has spawned the need for many standards organizations, clubs and societies of both a formal and informal nature.

Organizations

Standards groups: ANSI, IEC, IEEE, IETF, ISO, W3C
Professional societies: ACM, AIS, IET, IFIP, BCS
Free/open source software groups: Free Software Foundation, Mozilla Foundation, Apache Software Foundation

Advantages of Computer Computers have made a vital impact on society. They have changed the way of life. The use of computer technology has affected every field of life. People are using computers to perform different tasks quickly and easily. The use of computers makes different tasks easier. It also saves time and effort and reduces the overall cost to complete a particular task.

Many organizations are using computers for keeping the records of their customers. Banks are using computers for maintaining accounts and managing financial transactions. The banks are also providing the facility of online banking. The customers can check their account balance using the internet. They can also make financial transactions online. The transactions are handled easily and quickly with computerized systems.

People are using computers for paying their bills, managing their home budgets, or simply taking a break by watching a movie, listening to songs or playing computer games. Online services like Skype or social media websites are used for communication and information sharing purposes.

The computer can be used as a great educational tool. Students can have access to all sorts of information on the internet. Some great websites like Wikipedia, Khan Academy, Codecademy and Byte-Notes provide free resources for students and professionals. Moreover, the computer is being used in every field of life, such as medicine, business, industry, airlines and weather forecasting.

Disadvantages of computer The use of computers has also created some problems in society, which are as follows:

Unemployment Different tasks are performed automatically by using computers. This reduces the need for people and increases unemployment in society.

Wastage of time and energy Many people use computers without a positive purpose. They play games and chat for long periods of time. This causes wastage of time and energy. The young generation is now spending more time on social media websites like Facebook and Twitter, or texting their friends all night on smartphones, which is bad for both their studies and their health, and it also has adverse effects on their social life.

Data Security The data stored on a computer can be accessed by unauthorized persons through networks. This has created serious problems for data security.

Computer Crimes People use the computer for negative activities. They hack people's credit card numbers and misuse them, or they steal important data from big organizations.

Privacy violation Computers are used to store the personal data of people. The privacy of a person can be violated if the personal and confidential records are not protected properly.

Health risks The improper and prolonged use of a computer can result in injuries or disorders of the hands, wrists, elbows, eyes, neck and back. Users can avoid health risks by using the computer in a proper position. They must also take regular breaks when using the computer for long periods of time. It is recommended to take a couple of minutes' break after every 30 minutes of computer usage.

Impact on Environment Computer manufacturing processes and computer waste are polluting the environment. Discarded computer parts can release dangerous toxic materials. Green computing is a method to reduce the electricity consumed and the environmental waste generated when using a computer. It includes recycling and regulating manufacturing processes. Used computers must be donated or disposed of properly.

CHAPTER-5 CONCLUSION & SUGGESTIONS

CONCLUSION 1) Computers have evolved from vacuum tubes to transistors, from transistors to integrated circuits, and from integrated circuits to microprocessors, and they are still evolving day by day. There are five generations of computers, from the first generation to the fifth, which show us how much computers have evolved from the first generation till today. And we cannot say that only computers have evolved: our lives have also evolved with the evolution in computers and technology. Computers have really made our lives much easier, and they have been contributing a lot to the development of our society and businesses. We cannot deny the fact that where we are standing today in this universe is only because of the modern technology we have, and we cannot even imagine our lives without technology in today's world. 2) The programs, data and information stored in a computer are in binary form (0, 1), and it is very difficult for a common person to understand this concept; it can only be understood by experts. So, to make it easy and understandable for us, programming languages and codes are used to make computers more and more user-friendly. The programming languages used in a computer are machine language (first generation language), assembly language (second generation language) and high level language (third generation language). In almost every computer in today's world, high level languages are used to make it user-friendly. 3) It is a fact that computers play a very important role in our lives, but still computers have some disadvantages too, such as:

• Unemployment
• Wastage of time and energy
• Data Security
• Computer Crimes
• Privacy violation
• Health risks
• Impact on Environment

4) Computers play an important role in the growth of our society. In the future we will experience technology which is beyond our imagination right now. Computers are very important and useful today. Some of the uses of computers are as follows:
a) Computers aid at education
b) Computers in our health and medicine
c) Aid of computers at financial institutions
d) Computers for our pastime
e) Computers are a part of our transport system
f) Inevitable use of computers in business and corporate stages
g) Wonders of computers in e-commerce
h) Computers at our defense
i) The computer is today's designer

SUGGESTION Rather than containing only basic knowledge about computers, their evolution and their generations, this project also provides in-depth detail about the programming codes and programming languages used in computers to make them more user-friendly. One can also learn about the impact of computers on society and the various professions related to computers.

This study contains all the essential details which a person might need to fully understand computers, and even covers the different professional fields which a person can choose as a career opportunity.

CHAPTER-6 BIBLIOGRAPHY

BIBLIOGRAPHY

TEXTBOOKS:
• Fuegi, J. and Francis, J. "Lovelace & Babbage and the creation of the 1843 'notes'". IEEE Annals of the History of Computing 25 No. 4 (October–December 2003): Digital Object Identifier.
• Kempf, Karl (1961). "Historical Monograph: Electronic Computers Within the Ordnance Corps". Aberdeen Proving Ground (United States Army).
• Phillips, Tony (2000). "The Antikythera Mechanism I". American Mathematical Society. Retrieved 5 April 2006.
• Shannon, Claude Elwood (1940). "A symbolic analysis of relay and switching circuits". Massachusetts Institute of Technology.
• Digital Equipment Corporation (1972). PDP-11/40 Processor Handbook (PDF). Maynard, MA: Digital Equipment Corporation.
• Verma, G.; Mielke, N. (1988). "Reliability performance of ETOX based flash memories". IEEE International Reliability Physics Symposium.

WEBSITES:
• http://newint.org/books/reference/world-development/casestudies/2013/03/14/computers-cellphones-in-developing-world/
• http://www.youthkiawaaz.com/2011/06/computers-in-rural-india/
• http://wikieducator.org/History_of_Computer_Development_%26_Generation_of_Computer
• http://www.byte-notes.com/advantages-and-disadvantages-computers
• http://www.informationq.com/uses-of-computers-in-different-fields-areas-sectors-industries-education/
