

IBM Model 1401 computer (Inventions)

The invention: A relatively small, simple, and inexpensive computer that is often credited with having launched the personal computer age.

The people behind the invention: 

Howard H. Aiken (1900-1973), an American mathematician
Charles Babbage (1792-1871), an English mathematician and inventor
Herman Hollerith (1860-1929), an American inventor

Computers: From the Beginning

Computers evolved into their modern form over a period of thousands of years as a result of humanity’s efforts to simplify the process of counting. Two counting devices that are considered to be very simple, early computers are the abacus and the slide rule. These calculating devices are representative of digital and analog computers, respectively: an abacus counts discrete numbers of things, while a slide rule represents numbers as continuous lengths.
The first modern computer, which was planned by Charles Babbage in 1833, was never built. It was intended to perform complex calculations with a data processing/memory unit that was controlled by punched cards. In 1944, Harvard University’s Howard H. Aiken and the International Business Machines (IBM) Corporation built such a computer—the huge, punched-tape-controlled Automatic Sequence Controlled Calculator, or Mark I ASCC, which could perform complex mathematical operations in seconds. During the next fifteen years, computer advances produced digital computers that used binary arithmetic for calculation, incorporated simplified components that decreased the sizes of computers, had much faster calculating speeds, and were transistorized.
Although practical computers had become much faster than they had been only a few years earlier, they were still huge and extremely expensive. In 1959, however, IBM introduced the Model 1401 computer. Smaller, simpler, and much cheaper than the multimillion-dollar computers that were available, the IBM Model 1401 computer was also relatively easy to program and use. Its low cost, simplicity of operation, and very wide use have led many experts to view the IBM Model 1401 computer as beginning the age of the personal computer.

Computer Operation and IBM’s Model 1401

Modern computers are essentially very fast calculating machines that are capable of sorting, comparing, analyzing, and outputting information, as well as storing it for future use. Many sources credit Aiken’s Mark I ASCC as being the first modern computer to be built. This huge, five-ton machine used thousands of relays to perform complex mathematical calculations in seconds. Soon after its introduction, other companies produced computers that were faster and more versatile than the Mark I. The computer development race was on.
All these early computers utilized the decimal system for calculations until it was found that binary arithmetic, whose numbers are combinations of the binary digits 1 and 0, was much more suitable for the purpose. The advantage of the binary system is that the electronic switches that make up a computer (tubes, transistors, or chips) can be either on or off; in the binary system, the on state can be represented by the digit 1, the off state by the digit 0. Strung together correctly, binary numbers, or digits, can be inputted rapidly and used for high-speed computations. In fact, the computer term bit is a contraction of the phrase “binary digit.”
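To make the on/off idea concrete, here is a minimal Python sketch (not part of the original article) that prints a few decimal numbers as strings of binary digits and labels each bit as a switch that is either on or off:

```python
# Minimal sketch: a decimal value expressed as binary digits ("bits"),
# each corresponding to a switch that is either on (1) or off (0).

def to_bits(value, width=8):
    """Return the binary digits of `value`, most significant bit first."""
    return [(value >> i) & 1 for i in reversed(range(width))]

for n in (5, 12, 255):
    bits = to_bits(n)
    switches = ["on" if b else "off" for b in bits]
    print(f"{n:3d} -> {''.join(map(str, bits))}  ({', '.join(switches)})")
```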
A computer consists of input and output devices, a storage device (memory), arithmetic and logic units, and a control unit. In most cases, a central processing unit (CPU) combines the logic, arithmetic, memory, and control aspects. Instructions are loaded into the memory via an input device, processed, and stored. Then, the CPU issues commands to the other parts of the system to carry out computations or other functions and output the data as needed. Most output is printed as hard copy or displayed on cathode-ray tube monitors, or screens.
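As a rough illustration of that load/process/output cycle, the following toy Python sketch steps a "control unit" through a tiny program held in "memory." The three-instruction machine language is invented purely for this example and does not correspond to any real computer's instruction set:

```python
# Toy sketch of the load/process/output cycle described above.
# The instruction set here is hypothetical, for illustration only.

memory = [
    ("LOAD", 7),     # place the value 7 in the accumulator
    ("ADD", 5),      # add 5 to the accumulator
    ("PRINT", None), # send the accumulator to the "output device"
]

accumulator = 0
for opcode, operand in memory:      # the control unit steps through memory
    if opcode == "LOAD":
        accumulator = operand       # store a value
    elif opcode == "ADD":
        accumulator += operand      # the arithmetic unit computes
    elif opcode == "PRINT":
        print(accumulator)          # output the result (prints 12)
```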
The early modern computers—such as the Mark I ASCC—were huge because their information circuits were large relays or tubes. Computers became smaller and smaller as the tubes were replaced— first with transistors, then with simple integrated circuits, and then with silicon chips. Each technological changeover also produced more powerful, more cost-effective computers.
In the 1950's, with reliable transistors available, IBM began the development of two types of computers that were completed by about 1959. The larger version was the Stretch computer, which was advertised as the most powerful computer of its day. Customized for each individual purchaser (for example, the Atomic Energy Commission), a Stretch computer cost $10 million or more. Some innovations in Stretch computers included semiconductor circuits, new switching systems that quickly converted various kinds of data into one language that was understood by the CPU, rapid data readers, and devices that seemed to anticipate future operations.
 

Consequences

The IBM Model 1401 was the first computer sold in very large numbers. It led IBM and other companies to seek to develop less expensive, more versatile, smaller computers that would be sold to small businesses and to individuals. Six years after the development of the Model 1401, other IBM models—and those made by other companies—became available that were more compact and had larger memories. The search for compactness and versatility continued. A major development was the invention of integrated circuits by Jack S. Kilby of Texas Instruments; these integrated circuits became available by the mid-1960's. They were followed by even smaller “microprocessors” (computer chips) that became available in the 1970's. Computers continued to become smaller and more powerful.
Input and storage devices also decreased rapidly in size. At first, the punched cards invented by Herman Hollerith, founder of the Tabulation Machine Company (which later became IBM), were read by bulky readers. In time, less bulky magnetic tapes and more compact readers were developed, after which magnetic disks and compact disc drives were introduced.
Many other advances have been made. Modern computers can talk, create art and graphics, compose music, play games, and operate robots. Further advancement is expected as societal needs change. Many experts believe that it was the sale of large numbers of IBM Model 1401 computers that began the trend.
See also Apple II computer; BINAC computer; Colossus computer; ENIAC computer; Personal computer; Supercomputer; UNIVAC computer.

 Via[what-when-how]

 

Fiber optics (Inventions)

The invention: The application of glass fibers to electronic communications and other fields to carry large volumes of information quickly, smoothly, and cheaply over great distances.

The people behind the invention: 

Samuel F. B. Morse (1791-1872), the American artist and inventor who developed the electromagnetic telegraph system
Alexander Graham Bell (1847-1922), the Scottish American inventor and educator who invented the telephone and the photophone
Theodore H. Maiman (1927- ), the American physicist and engineer who invented the solid-state laser
Charles K. Kao (1933- ), a Chinese-born electrical engineer
Zhores I. Alferov (1930- ), a Russian physicist and mathematician

The Singing Sun

In 1844, Samuel F. B. Morse, inventor of the telegraph, sent his famous message, “What hath God wrought?” by electrical impulses traveling at the speed of light over a 66-kilometer telegraph wire strung between Washington, D.C., and Baltimore. Ever since that day, scientists have worked to find faster, less expensive, and more efficient ways to convey information over great distances.
At first, the telegraph was used to report stock-market prices and the results of political elections. The telegraph was quite important in the American Civil War (1861-1865). The first transcontinental telegraph message was sent by Stephen J. Field, chief justice of the California Supreme Court, to U.S. president Abraham Lincoln on October 24, 1861. The message declared that California would remain loyal to the Union. By 1866, telegraph lines had reached all across the North American continent and a telegraph cable had been laid beneath the Atlantic Ocean to link the Old World with the New World.

Zhores I. Alferov

To create a telephone system that transmitted with light, perfecting fiber-optic cables was only half the solution. There also had to be a small, reliable, energy-efficient light source. In the 1960's, engineers realized that lasers were the best candidate. However, early gas lasers were bulky, and semiconductor lasers, while small, were temperamental and had to be cooled in liquid nitrogen. Nevertheless, the race was on to devise a semiconductor laser that produced a continuous beam and did not need to be cooled. The race was between a Bell Labs team in the United States and a Russian team led by Zhores I. Alferov, neither of which knew much about the other.
Alferov was born in 1930 in Vitebsk, Byelorussia, then part of the Soviet Union. He earned a degree in electronics from the V. I. Ulyanov (Lenin) Electrotechnical Institute in Leningrad (now St. Petersburg). As part of his graduate studies, he became a researcher at the A. F. Ioffe Physico-Technical Institute in the same city, receiving a doctorate in physics and mathematics in 1970. By then he was one of the world’s leading experts in semiconductor lasers.
Alferov found that he could improve the laser’s performance by sandwiching very thin layers of gallium arsenide and metal, insulated in silicon, in such a way that electrons flowed only along a 0.03 millimeter strip, producing light in the process. This double heterojunction narrow-stripe laser was the answer, producing a steady beam at room temperature. Alferov published his results a month before the American team came up with almost precisely the same solution.
The question of who was first was not settled until much later, during which time both Bell Labs and Alferov’s institute went on to further refinements of the technology. Alferov rose to become a dean at the St. Petersburg Technical University and vice-president of the Russian Academy of Sciences. In 2000 he shared the Nobel Prize in Physics.
Another American inventor made the leap from the telegraph to the telephone. Alexander Graham Bell, a teacher of the deaf, was interested in the physical way speech works. In 1875, he started experimenting with ways to transmit sound vibrations electrically. He realized that an electrical current could be adjusted to resemble the vibrations of speech. Bell patented his invention on March 7, 1876. On July 9, 1877, he founded the Bell Telephone Company.
In 1880, Bell invented a device called the “photophone.” He used it to demonstrate that speech could be transmitted on a beam of light. Light is a form of electromagnetic energy. It travels in a vibrating wave. When the amplitude (height) of the wave is adjusted, a light beam can be made to carry messages. Bell’s invention included a thin mirrored disk that converted sound waves directly into a beam of light. At the receiving end, a selenium resistor connected to a headphone converted the light back into sound. “I have heard a ray of sun laugh and cough and sing,” Bell wrote of his invention.
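The following short Python sketch mimics the photophone's principle: the intensity of a light beam is varied in step with a sound wave, and the receiver recovers the original waveform. It is an illustration only; the 440 Hz tone, sampling rate, and modulation depth are arbitrary choices, and NumPy is assumed to be available:

```python
# Minimal sketch of the photophone idea: modulate the intensity of a
# light beam with a sound wave, then recover the sound at the receiver.
# All numbers below are illustrative assumptions.

import numpy as np

rate = 8000                                   # samples per second
t = np.arange(0, 0.01, 1 / rate)              # 10 ms of signal
audio = np.sin(2 * np.pi * 440 * t)           # a 440 Hz tone standing in for speech

baseline = 1.0                                # steady light intensity
depth = 0.5                                   # modulation depth
light = baseline * (1 + depth * audio)        # intensity-modulated beam

recovered = (light / baseline - 1) / depth    # receiver (e.g. a selenium cell) undoes it
print(np.allclose(audio, recovered))          # True: the tone survives the trip
```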
Although Bell proved that he could transmit speech over distances of several hundred meters with the photophone, the device was awkward and unreliable, and it never became popular as the telephone did. Not until one hundred years later did researchers find important practical uses for Bell’s idea of talking on a beam of light.
Two other major discoveries needed to be made first: development of the laser and of high-purity glass. Theodore H. Maiman, an American physicist and electrical engineer at Hughes Research Laboratories in Malibu, California, built the first laser. The laser produces an intense, narrowly focused beam of light that can be adjusted to carry huge amounts of information. The word itself is an acronym for “light amplification by the stimulated emission of radiation.”
It soon became clear, though, that even bright laser light can be broken up and absorbed by smog, fog, rain, and snow. So in 1966, Charles K. Kao, an electrical engineer at the Standard Telecommunications Laboratories in England, suggested that glass fibers could be used to transmit message-carrying beams of laser light without disruption from weather.

Fiber Optics Are Tested

Optical glass fiber is made from common materials, mostly silica, soda, and lime. The inside of a delicate silica glass tube is coated with a hundred or more layers of extremely thin glass. The tube is then heated to 2,000 degrees Celsius and collapsed into a thin glass rod, or preform. The preform is then pulled into thin strands of fiber. The fibers are coated with plastic to protect them from being nicked or scratched, and then they are covered in flexible cable.

Fiber optic strands. (PhotoDisc)

The earliest glass fibers contained many impurities and defects, so they did not carry light well. Signal repeaters were needed every few meters to energize (amplify) the fading pulses of light. In 1970, however, researchers at the Corning Glass Works in New York developed a fiber pure enough to carry light at least one kilometer without amplification.
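A rough way to see why purity matters: fiber attenuation is quoted in decibels per kilometer, and the distance between repeaters is set by how many decibels of loss the link can tolerate. The loss figures and the 30-decibel budget in this Python sketch are illustrative assumptions, not measurements from the article:

```python
# Rough illustration: repeater spacing is set by how many decibels of
# loss the link can tolerate. Loss figures and the 30 dB budget are
# assumptions for illustration only.

def max_span_km(loss_db_per_km, loss_budget_db=30.0):
    """Distance light can travel before losing `loss_budget_db` decibels."""
    return loss_budget_db / loss_db_per_km

for label, loss in [("very impure early fiber", 1000.0),
                    ("1970-era low-loss fiber", 20.0),
                    ("modern telecom fiber", 0.2)]:
    print(f"{label:24s}: ~{max_span_km(loss):7.2f} km between repeaters")
```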

The telephone industry quickly became involved in the new fiber-optics technology. Researchers believed that a bundle of optical fibers as thin as a pencil could carry several hundred telephone calls at the same time. Optical fibers were first tested by telephone companies in big cities, where the great volume of calls often overloaded standard underground phone lines.
On May 11, 1977, American Telephone & Telegraph Company (AT&T), along with Illinois Bell Telephone, Western Electric, and Bell Telephone Laboratories, began the first commercial test of fiber-optics telecommunications in downtown Chicago. The system consisted of a 2.4-kilometer cable laid beneath city streets. The cable, only 1.3 centimeters in diameter, linked an office building in the downtown business district with two telephone exchange centers. Voice and video signals were coded into pulses of laser light and transmitted through the hair-thin glass fibers. The tests showed that a single pair of fibers could carry nearly six hundred telephone conversations at once very reliably and at a reasonable cost.
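As a rough sanity check on that figure, the arithmetic below assumes the standard 64 kbit/s rate for one digitized voice channel (a figure not given in the article); six hundred simultaneous calls then amount to only a few tens of megabits per second, well within a fiber's capacity:

```python
# Back-of-the-envelope check. The 64 kbit/s per digitized call is an
# assumed figure, not one quoted from the Chicago test.
calls = 600
kbits_per_call = 64
total_mbits_per_s = calls * kbits_per_call / 1000
print(f"{calls} calls x {kbits_per_call} kbit/s = {total_mbits_per_s:.1f} Mbit/s")
```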
Six years later, in October, 1983, Bell Laboratories succeeded in transmitting the equivalent of six thousand telephone signals through an optical fiber cable that was 161 kilometers long. Since that time, countries all over the world, from England to Indonesia, have developed optical communications systems.

Consequences

Fiber optics has had a great impact on telecommunications. A single fiber can now carry thousands of conversations with no electrical interference. These fibers are less expensive, weigh less, and take up much less space than copper wire. As a result, people can carry on conversations over long distances without static and at a low cost.
One of the first uses of fiber optics and perhaps its best-known application is the fiberscope, a medical instrument that permits internal examination of the human body without surgery or X-ray techniques. The fiberscope, or endoscope, consists of two fiber bundles. One of the fiber bundles transmits bright light into the patient, while the other conveys a color image back to the eye of the physician. The fiberscope has been used to look for ulcers, cancer, and polyps in the stomach, intestine, and esophagus of humans. Medical instruments, such as forceps, can be attached to the fiberscope, allowing the physician to perform a range of medical procedures, such as clearing a blocked windpipe or cutting precancerous polyps from the colon.
See also Cell phone; Community antenna television; Communications satellite; FM radio; Laser; Long-distance radiotelephony; Long-distance telephone; Telephone switching.

  Via[what-when-how]

 

Hard disk (Inventions)

The invention: A large-capacity, permanent magnetic storage device built into most personal computers.

The people behind the invention:

Alan Shugart (1930- ), an engineer who first developed the floppy disk
Philip D. Estridge (1938?-1985), the director of IBM’s product development facility
Thomas J. Watson, Jr. (1914-1993), the chief executive officer of IBM

The Personal Oddity 

When the International Business Machines (IBM) Corporation introduced its first microcomputer, called simply the IBM PC (for “personal computer”), the occasion was less a dramatic invention than the confirmation of a trend begun some years before. A number of companies had introduced microcomputers before IBM; one of the best known at that time was Apple Computer’s Apple II, for which software for business and scientific use was quickly developed. Nevertheless, the microcomputer was quite expensive and was often looked upon as an oddity, not as a useful tool.
Under the leadership of Thomas J. Watson, Jr., IBM, which had previously focused on giant mainframe computers, decided to develop the PC. A design team headed by Philip D. Estridge was assembled in Boca Raton, Florida, and it quickly developed its first, pacesetting product. It is an irony of history that IBM anticipated selling only one hundred thousand or so of these machines, mostly to scientists and technically inclined hobbyists. Instead, IBM’s product sold exceedingly well, and its design parameters, as well as its operating system, became standards.
The earliest microcomputers used a cassette recorder as a means of mass storage; a floppy disk drive capable of storing approximately 160 kilobytes of data was initially offered only as an option. While home hobbyists were accustomed to using a cassette recorder for storage purposes, such a system was far too slow and awkward for use in business and science. As a result, virtually every IBM PC sold was equipped with at least one 5.25-inch floppy disk drive.

 Memory Requirements 

All computers require memory of two sorts in order to carry out their tasks. One type of memory is main memory, or random access memory (RAM), which is used by the computer’s central processor to store data it is using while operating. The memory used for this function is typically built of silicon-based integrated circuits, which have the advantage of speed (allowing the processor to fetch or store data quickly) but the disadvantage of losing or “forgetting” data when the electric current is turned off. Further, such memory is generally relatively expensive.
To reduce costs, another type of memory—long-term storage memory, known also as “mass storage”—was developed. Mass storage devices include magnetic media (tape or disk drives) and optical media (such as the compact disc read-only memory, or CD-ROM). While the speed with which data may be retrieved from or stored in such devices is rather slow compared to the central processor’s speed, a disk drive—the most common form of mass storage used in PCs—can store relatively large amounts of data quite inexpensively.
Early floppy disk drives (so called because the magnetically treated material on which data are recorded is made of a very flexible plastic) held 160 kilobytes of data using only one side of the magnetically coated disk (about eighty pages of normal, double-spaced, typewritten information). Later developments increased storage capacities to 360 kilobytes by using both sides of the disk and later, with increasing technological ability, 1.44 megabytes (millions of bytes). In contrast, mainframe computers, which are typically connected to large and expensive tape drive storage systems, could store gigabytes (thousands of megabytes) of information.
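The “about eighty pages” figure implies roughly two kilobytes per double-spaced page. Using that assumption (chosen here only to match the article’s estimate), the quoted capacities translate into page counts as in this small Python sketch:

```python
# Rough page-count arithmetic for the capacities quoted above.
# The 2 KB-per-page figure is an assumption chosen to match the
# article's "about eighty pages" for a 160-kilobyte disk.

bytes_per_page = 2 * 1024
capacities = {
    "single-sided floppy (160 KB)": 160 * 1024,
    "double-sided floppy (360 KB)": 360 * 1024,
    "high-density floppy (1.44 MB)": 1_440 * 1024,
    "PC XT hard disk (10 MB)": 10 * 1024 * 1024,
}
for name, size_bytes in capacities.items():
    print(f"{name:30s}: ~{size_bytes // bytes_per_page:5d} pages of text")
```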
While such capacities seem large, the needs of business and scientific users soon outstripped available space. Since even the mailing list of a small business or a scientist’s mathematical model of a chemical reaction easily could require greater storage potential than early PCs allowed, the need arose for a mass storage device that could accommodate very large files of data.
The answer was the hard disk drive, also known as a “fixed disk drive,” reflecting the fact that the disk itself is not only rigid but also permanently installed inside the machine. In 1955, IBM had envisioned the notion of a fixed, hard magnetic disk as a means of storing computer data, and, under the direction of Alan Shugart in the 1960's, the floppy disk was developed as well.
As the engineers of IBM’s facility in Boca Raton refined the idea of the original PC to design the new IBM PC XT, it became clear that chief among the needs of users was the availability of large-capacity storage devices. The decision was made to add a 10-megabyte hard disk drive to the PC. On March 8, 1983, less than two years after the introduction of its first PC, IBM introduced the PC XT. Like the original, it was an evolutionary design, not a revolutionary one. The inclusion of a hard disk drive, however, signaled that mass storage devices in personal computers had arrived.

Consequences

Above all else, any computer provides a means for storing, ordering, analyzing, and presenting information. If the personal computer is to become the information appliance some have suggested it will be, the ability to manipulate very large amounts of data will be of paramount concern. Hard disk technology was greeted enthusiastically in the marketplace, and the demand for hard drives has seen their numbers increase as their quality increases and their prices drop.
It is easy to understand one reason for such eager acceptance: convenience. Floppy-bound computer users find themselves frequently changing (or “swapping”) their disks in order to allow programs to find the data they need. Moreover, there is a limit to how much data a single floppy disk can hold. The advantage of a hard drive is that it allows users to keep seemingly unlimited amounts of data and programs stored in their machines and readily available.
Also, hard disk drives are capable of finding files and transferring their contents to the processor much more quickly than a floppy drive. A user may thus create exceedingly large files, keep them on hand at all times, and manipulate data more quickly than with a floppy. Finally, while a hard drive is a slow substitute for main memory, it allows users to enjoy the benefits of larger memories at significantly lower cost.
The introduction of the PC XT with its 10-megabyte hard drive was a milestone in the development of the PC. Over the next two decades, the size of computer hard drives increased dramatically. By 2001, few personal computers were sold with hard drives with less than three gigabytes of storage capacity, and hard drives with more than thirty gigabytes were becoming the standard. Indeed, for less money than a PC XT cost in the mid-1980's, one could buy a fully equipped computer with a hard drive holding sixty gigabytes—a storage capacity equivalent to six thousand 10-megabyte hard drives.

See also Bubble memory; Compact disc; Computer chips; Floppy disk; Optical disk; Personal computer.

  Via[what-when-how]
