
Computer Science

In the 19th century, the term computer referred to people who performed mathematical computations. Mechanical tabulating machines and calculators began to appear in the late 19th and early 20th centuries, and in 1946 the engineers J. Presper Eckert (1919-95) and John Mauchly (1907-80) built one of the first modern electronic computers, known as the Electronic Numerical Integrator and Computer (ENIAC). ENIAC was an important advance, but it had some disadvantages - it was the size of a room, ran slowly by today's standards, and often suffered failures in its electrical components. Since the 1940s, however, computers have evolved into fast and efficient machines that fill almost every niche in today's society.

The expanding role of computers has begun to encroach on tasks that require substantial thought - at least for a person. For example, in 1997, a computer called Deep Blue defeated Garry Kasparov, the reigning World Chess Champion at the time, in a chess match. Chess-playing computer programs have been routinely defeating novice chess players since the 1970s, but Deep Blue beat one of the best.

No one is certain how much more powerful - and possibly intelligent - computers will become in the 21st century. Computer Science, one volume of the multivolume Frontiers of Science set, explores six prominent topics in computer science research that address issues concerning the capacity of computers and their applications.

Although a computer may perform intelligent tasks, the performance of most machines today reflects the skill of computer engineers and programmers. None of the applications mentioned above would have been possible without the efforts of computer engineers who design the machines, and computer programmers who write the programs to provide the necessary instructions. Most computers today perform a series of simple steps, and must be told exactly which steps to perform and in what order. Deep Blue, for example, did not think as a person does, but instead ran a program to search for the best move, as determined by complicated formulas. A fast computer such as Deep Blue can zip through these instructions so quickly that it is capable of impressive feats of "intelligence."
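Deep Blue's actual program was vastly more elaborate, but the core idea - looking ahead through possible moves and scoring the resulting positions with a formula - can be sketched in a few lines of Python. The sketch below is only an illustration: the "game" is a hand-made tree of scores rather than chess, and the leaf scores stand in for Deep Blue's complicated evaluation formulas.

    # A minimal sketch of game-tree search (minimax), the general idea behind
    # chess-playing programs. Real programs add much more: sophisticated board
    # evaluation, alpha-beta pruning, opening books, and enormous speed.
    # Here a "position" is either a numeric score (a leaf) or a list of the
    # positions reachable in one move -- a toy stand-in for a chess position.

    def minimax(position, maximizing):
        """Return the best achievable score if both sides play perfectly."""
        if isinstance(position, (int, float)):   # leaf: score from an evaluation formula
            return position
        scores = [minimax(child, not maximizing) for child in position]
        return max(scores) if maximizing else min(scores)

    # A small hand-made game tree: the first player chooses a branch, the
    # opponent replies, and the leaves hold the evaluation scores.
    tree = [[3, 12], [2, 4], [14, 1]]
    print(minimax(tree, maximizing=True))   # prints 3, the best guaranteed outcome

A real chess program applies this same principle to billions of positions for every move, which is why raw speed matters so much.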

But some computer scientists are working on making computers smarter - and more like humans. The human brain consists of complex neural networks that process sensory information, extract important features, and solve problems.
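Artificial "neural networks" borrow this idea only loosely. A single artificial neuron - the building block of such networks - simply weights its inputs, adds them up, and compares the total to a threshold. The Python sketch below is a bare-bones illustration, not a model of any real brain circuit; the weights are chosen by hand so the unit behaves like a logical AND.

    # A minimal sketch of one artificial "neuron": weight the inputs, sum them,
    # and fire (output 1) only if the total clears a threshold. Networks of many
    # such units, with weights learned from data rather than set by hand, can
    # extract features and recognize patterns.

    def neuron(inputs, weights, bias):
        total = sum(x * w for x, w in zip(inputs, weights)) + bias
        return 1 if total > 0 else 0

    # Hand-picked weights that make the neuron act like a logical AND gate.
    weights, bias = [1.0, 1.0], -1.5
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", neuron([a, b], weights, bias))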

Speedy computations are essential in many of these operations, and fast computers can find solutions to complicated problems. Deep Blue's program, for instance, churned through millions of instructions every second to find the optimal chess move. But certain kinds of problems have remained intractable, even with the fastest computers. Many of these problems, such as factoring large integers or finding the shortest route through a set of cities, have important practical applications for engineering and science, as well as for computer networks and economics. People can program computers to solve these problems on a small scale - factoring a small number such as 20, or finding a route with only three cities to visit - but larger instances require far too much time.
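Both small examples are easy to work out by brute force. The Python sketch below does exactly that - trial division to factor 20, and a check of every possible ordering of three cities (with invented distances) to find the shortest round trip. The same code slows to a crawl when the number grows to hundreds of digits or the route grows to dozens of cities, because the amount of work explodes.

    # A minimal sketch of brute force on two small instances: factoring an
    # integer by trial division, and trying every ordering of a few cities to
    # find the shortest round trip. Both finish instantly here, but the work
    # grows explosively for larger numbers or longer city lists.
    from itertools import permutations

    def factor(n):
        """Return the prime factors of n, found by trial division."""
        factors, d = [], 2
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)
        return factors

    def shortest_route(distances):
        """Try every ordering of the cities and return the cheapest round trip."""
        cities = list(distances)
        best = None
        for order in permutations(cities):
            length = sum(distances[order[i]][order[(i + 1) % len(order)]]
                         for i in range(len(order)))
            if best is None or length < best[0]:
                best = (length, order)
        return best

    print(factor(20))                  # [2, 2, 5]
    # Invented distances between three cities, for illustration only.
    distances = {"A": {"B": 5, "C": 9}, "B": {"A": 5, "C": 3}, "C": {"A": 9, "B": 3}}
    print(shortest_route(distances))   # a round trip of length 17, such as ('A', 'B', 'C')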

An efficient method of solving these problems, if one is ever found, would have a tremendous impact, especially on the Internet. Personal and confidential information, such as credit card numbers, passes from computer to computer every day on the Internet, and it must be protected by making it unreadable to everyone except the intended recipient. The science of writing and reading secret messages is called cryptology, and many of the techniques in use today depend on the difficulty of problems such as factoring - given an efficient method of solving them, these codes could be broken and their secrets exposed.
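The cipher below is only a toy - a simple letter-shifting scheme far weaker than anything that actually protects credit card numbers - but it shows the two sides of cryptology in miniature: making a message unreadable without the key, and breaking a weak scheme by simply trying every key.

    # A minimal sketch of a toy (Caesar) cipher: shift every letter of an
    # uppercase message forward by a secret key. For illustration only;
    # the cryptography that protects real Internet traffic is vastly stronger.

    def shift(text, key):
        return "".join(chr((ord(c) - 65 + key) % 26 + 65) if c.isalpha() else c
                       for c in text)

    secret = shift("MEET AT NOON", key=3)
    print(secret)                 # PHHW DW QRRQ -- unreadable without the key
    print(shift(secret, key=-3))  # the intended recipient recovers MEET AT NOON

    # An eavesdropper can break this scheme by trying all 26 possible keys.
    for key in range(26):
        print(key, shift(secret, -key))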

One of the most important human senses is vision. Images provide a wealth of information that is difficult or cumbersome to put into words. These days, images are often processed in digital form - arrays of numbers that computers can store and process. As computers have become faster and smarter, people have started using these machines to perform functions similar to human vision, such as reading.
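A digital image really is nothing more than a grid of numbers, and even simple arithmetic on that grid counts as image processing. The Python sketch below invents a tiny grayscale "image" (0 is black, 255 is white) and separates its bright region from its dark one - a crude first step of the kind real vision systems build on.

    # A minimal sketch of an image as an array of numbers: a tiny grayscale
    # picture, thresholded into a black-and-white map of bright versus dark.

    image = [
        [ 10,  12, 200, 210],
        [  9,  11, 205, 220],
        [  8,  10, 198, 215],
    ]

    threshold = 128
    binary = [[1 if pixel > threshold else 0 for pixel in row] for row in image]
    for row in binary:
        print(row)   # the right half of this tiny image is the "bright" region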

Searching for patterns is an integral part of many computer applications - for example, looking for clues to crack a secret message, or sifting through the features of an image to find a specific object. Biologists have recently amassed a huge quantity of data involving genetics. Patterns in this kind of information contain vital clues about how organisms develop, what traits they have, and how certain diseases arise and progress. Overwhelmed by the sheer size of these data, which is the equivalent of thousands of encyclopedia volumes, biologists have turned to computer science for help.
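At its simplest, searching genetic data means scanning an enormous string of the letters A, C, G, and T for a short pattern, or motif. The Python sketch below does this for a made-up sequence and motif; real genomic searches run over billions of letters and rely on far more sophisticated algorithms and statistics.

    # A minimal sketch of pattern searching in DNA: report every position where
    # a short motif occurs in a sequence. The sequence and motif are invented.

    def find_motif(sequence, motif):
        positions = []
        for i in range(len(sequence) - len(motif) + 1):
            if sequence[i:i + len(motif)] == motif:
                positions.append(i)
        return positions

    dna = "ATGCGATATAGCGATATATTGA"
    print(find_motif(dna, "GATATA"))   # [4, 12] -- the two places the motif appears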

Computers have made life easier in many ways, relieving people of boring and time-consuming tasks, but computers have also made life more complicated, forcing people to keep up with technological developments.

A fundamental element of research in computer science is the computer itself. Despite the efficiency of today's machines, the computer remains a frontier of science. The reason is the same today as it was in the early years of computational technology: the need for ever more speed and efficiency.

In 1790, marshals of the newly formed government of the United States set out on horseback to perform the important mission of counting the country's population. Taking an accurate census was essential in order to apportion the number of congressional delegates for each district, as specified by the U.S. Constitution. According to the U.S. Census Bureau, the census-takers manually compiled a list of 3,929,214 people in less than a year. Officials took another census each decade, and by 1880 the population had grown to 50,155,783. But census-takers had reached the breaking point - it took them almost the whole decade to finish tabulating the 1880 census, and the country continued to grow at an astonishing rate. Government officials feared that the 1890 census would not be completed before they had to begin the 1900 census.

The solution to this problem was automation. In response to a competition sponsored by the Bureau of the Census, Herman Hollerith (1860-1929), a young engineer, designed an automatic "census counting machine." Census personnel collected data - the plural of the Latin word datum, meaning a piece of information - and encoded the information in the positions of holes punched in cards. These cards were the same size as dollar bills of the time, so a stack of them fit conveniently into boxes used by the Treasury Department. When operators inserted the cards into the machine, an electromechanical process automatically tabulated the population figures. Using Hollerith's machines, the 1890 census of 62,979,766 people was counted within a few months, and the statistical tables were completed two years later.

Hollerith formed a company, the Tabulating Machine Company, in 1896. In 1911 it merged with several other firms to form the Computing-Tabulating-Recording Company, which changed its name in 1924 to International Business Machines (IBM) Corporation. IBM thrived, and is presently one of the world's largest companies.

Computational machines have also thrived. The need for speed and efficiency - the same needs of the 1890 census - motivated the development of computers into the ubiquitous machines they are today. Computers are in homes, offices, cars, and even spacecraft, and people carry portable computers known as notebooks or laptops whenever they travel. Yet the evolution of computers is by no means finished. One of the most active frontiers of computer science is the development of faster and more efficient computers, which may eventually transform the world as drastically as their predecessors did.
