
ESSAY

The Comparative Analysis of the History of Computer Science and Computer Engineering in the USA and Ukraine


USA.

HOWARD H. AIKEN AND THE COMPUTER

Howard Aiken's contributions to the development of the computer - notably the Harvard Mark I (IBM ASCC) machine and its successor, the Mark II - are often excluded from the mainstream history of computers on two technicalities. The first is that the Mark I and Mark II were electromechanical rather than electronic; the second is that Aiken was never convinced that computer programs should be treated as data in what has come to be known as the von Neumann concept, or the stored program.

It is not proposed to discuss here the origins and significance of the stored program. Nor do I wish to deal with the related problem of whether the machines before the stored program were or were not "computers". This subject is complicated by the confusion in the actual names given to machines. For example, the ENIAC, which did not incorporate a stored program, was officially named a computer: Electronic Numerical Integrator And Computer. But the first stored-program machine to be put into regular operation was Maurice Wilkes's EDSAC: Electronic Delay Storage Automatic Calculator. It seems rather senseless to deny the many truly significant innovations (by H.H. Aiken and by Eckert and Mauchly) that played an important role in the history of computers, on the arbitrary ground that they did not incorporate the stored-program concept. Additionally, in the case of Aiken, it is significant that there is a current computer technology that does not incorporate stored programs and that is designated (at least by Texas Instruments) as "Harvard architecture", though it should more properly be called "Aiken architecture". In this technology the program is fixed and not subject to any alteration save by intent - as in some computers used for telephone switching and in ROM.
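To make the distinction concrete, the sketch below (in Python, purely illustrative and not a model of any actual machine; the names and "instructions" are invented) contrasts the two organizations: in a stored-program machine, instructions and data share one memory, so a program can be rewritten like any other data, while in a Harvard-style machine the program lives in a separate, effectively read-only store, as in ROM-based controllers.

    # Illustrative sketch only: the "instructions" here are invented tuples,
    # not a real instruction set.

    # Stored-program (von Neumann) style: one memory holds both
    # instructions and data, so the program itself can be overwritten.
    unified_memory = [
        ("LOAD", 8),    # cell 0: load the value at cell 8
        ("ADD", 9),     # cell 1: add the value at cell 9
        ("STORE", 10),  # cell 2: store the result into cell 10
        ("HALT", 0),    # cell 3: stop
        None, None, None, None,
        5, 7, 0,        # cells 8-10: data
    ]
    unified_memory[1] = ("SUB", 9)   # instructions are ordinary data cells

    # Harvard style: program and data occupy separate stores; the program
    # is fixed (here, an immutable tuple) and only the data store changes.
    program_rom = (("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", 0))
    data_ram = [5, 7, 0]
    data_ram[2] = data_ram[0] + data_ram[1]   # data changes; program_rom cannot

The contrast is purely structural: whether or not the program is reachable as data is what separates the stored-program concept from the Harvard (or, as argued above, "Aiken") organization.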

OPERATION OF THE ENIAC.

Aiken was a visionary, a man ahead of his time. Grace Hopper and others remember his prediction in the late 1940s, even before the vacuum tube had been wholly replaced by the transistor, that the time would come when a machine even more powerful than the giant machines of those days could be fitted into a space as small as a shoe box.

Some weeks before his death Aiken made another prediction. He pointed out that hardware considerations alone did not give a true picture of computer costs. As hardware has become cheaper, software has been apt to get more expensive. And then he gave us his final prediction: "The time will come," he said, "when manufacturers will give away hardware in order to sell software." Time alone will tell whether or not this was his final look ahead into the future.


THE DEVELOPMENT OF COMPUTERS IN THE USA

In the early 1960s, when computers were hulking mainframes that took up entire rooms, engineers were already toying with the then-extravagant notion of building a computer intended for the sole use of one person. By the early 1970s, researchers at Xerox's Palo Alto Research Center (Xerox PARC) had realized that the pace of improvement in the technology of semiconductors - the chips of silicon that are the building blocks of present-day electronics - meant that sooner or later the PC would be extravagant no longer. They foresaw that computing power would someday be so cheap that engineers would be able to afford to devote a great deal of it simply to making non-technical people more comfortable with these new information-handling tools. In their labs, they developed or refined much of what constitutes PCs today, from "mouse" pointing devices to software "windows".

Although the work at Xerox PARC was crucial, it was not the spark that took PCs out of the hands of experts and into the popular imagination. That happened inauspiciously in January 1975, when the magazine Popular Electronics put a new kit for hobbyists, called the Altair, on its cover. For the first time, anybody with $400 and a soldering iron could buy and assemble his own computer. The Altair inspired Steve Wozniak and Steve Jobs to build the first Apple computer, and a young college dropout named Bill Gates to write software for it. Meanwhile, the person who deserves the credit for inventing the Altair, an engineer named Ed Roberts, left the industry he had spawned to go to medical school. Now he is a doctor in a small town in central Georgia.

To this day, researchers at Xerox and elsewhere pooh-pooh the Altair as too primitive to have made use of the technology they felt was needed to bring PCs to the masses. In a sense, they are right. The Altair incorporated one of the first single-chip microprocessors - a semiconductor chip that contained all the basic circuits needed to do calculations - called the Intel 8080. Although the 8080 was advanced for its time, it was far too slow to support the mouse, windows, and elaborate software Xerox had developed. Indeed, it wasn't until 1984, when Apple Computer's Macintosh burst onto the scene, that PCs were powerful enough to fulfill the original vision of the researchers. "The kind of computing that people are trying to do today is just what we made at PARC in the early 1970s," says Alan Kay, a former Xerox researcher who jumped to Apple in the early 1980s.

MACINTOSH PERFORMA 6200/6300

Researchers today are proceeding in the same spirit that motivated Kay

and his Xerox PARC colleagues in the 1970s: to make information more

accessible to ordinary people. But a look into today’s research labs

reveals very little that resembles what we think of now as a PC. For one

thing, researchers seem eager to abandon the keyboard and monitor that are

the PC’s trademarks. Instead they are trying to devise PCs with

interpretive powers that are more humanlike - PCs that can hear you and see

you, can tell when you’re in a bad mood and know to ask questions when they

don’t understand something.

It is impossible to predict the invention that, like the Altair, will crystallize new approaches in a way that captures people's imagination.

Top 20 computer systems


From soldering irons to SparcStations, from MITS to Macintosh, personal computers have evolved from do-it-yourself kits for electronics hobbyists into machines that practically leap out of the box and set themselves up. What enabled them to get from there to here? Innovation and determination. Here are the top 20 systems that made that rapid evolution possible.

. MITS Altair 8800

There once was a time when you could buy a top-of-the-line computer for $395. The only catch was that you had to build it yourself. Although the Altair 8800 wasn't actually the first personal computer (Scelbi Computer Consulting's 8008-based Scelbi-8H kit probably took that honor in 1973), it grabbed attention. MITS sold 2000 of them in 1975 - more than any single computer before it.

Based on Intel's 8-bit 8080 processor, the Altair 8800 kit included 256 bytes of memory (upgradable, of course) and a toggle-switch-and-LED front panel. For amenities such as keyboards, video terminals, and storage devices, you had to go to one of the companies that sprang up to support the Altair with expansion cards. In 1975, MITS offered 4- and 8-KB Altair versions of BASIC, the first product developed by Bill Gates' and Paul Allen's new company, Microsoft.

If the personal computer hobbyist movement was simmering, 1975 saw it come to a boil with the introduction of the Altair 8800.

. Apple II

Those of you who think of the IBM PC as the quintessential business computer may be in for a surprise: the Apple II (together with VisiCalc) was what really made people look at personal computers as business tools, not just toys.

The Apple II debuted at the first West Coast Computer Faire in San Francisco in 1977. With a built-in keyboard, graphics display, eight readily accessible expansion slots, and BASIC built into ROM, the Apple II was actually easy to use. Some of its innovations, like built-in high-resolution color graphics and a high-level language with graphics commands, are still extraordinary features in desktop machines.

With a 6502 CPU, 16 KB of RAM, a 16-KB ROM, a cassette interface that never really worked well (most Apple IIs ended up with the floppy drive that was announced in 1978), and color graphics, the Apple II sold for $1298.

. Commodore PET

Also introduced at the first West Coast Computer Faire, Commodore's PET (Personal Electronic Transactor) started a long line of inexpensive personal computers that brought computers to the masses. (The VIC-20 that followed was the first computer to sell 1 million units, and the Commodore 64 after that was the first to offer a whopping 64 KB of memory.)

The keyboard and small monochrome display both fit in the same one-piece unit. Like the Apple II, the PET ran on MOS Technology's 6502. Its $795 price, key to the PET's popularity, supplied only 4 KB of RAM but included a built-in cassette tape drive for data storage and an 8-KB version of Microsoft BASIC in its 14-KB ROM.

. Radio Shack TRS-80

Remember the Trash 80? Sold at local Radio Shack stores in your choice

of color (Mercedes Silver), the TRS-80 was the first ready-to-go computer

to use Zilog's Z80 processor.

The base unit was essentially a thick keyboard with 4 KB of RAM and 4

KB of ROM (which included BASIC). An optional expansion box that connected

by ribbon cable allowed for memory expansion. A Pink Pearl eraser was

standard equipment to keep those ribbon cable connections clean.

Much of the first software for this system was distributed on

audiocassettes played in from Radio Shack cassette recorders.

. Osborne 1 Portable

By the end of the 1970s, garage start-ups were passé. Fortunately, there were other entrepreneurial possibilities. Take Adam Osborne, for example.

He sold Osborne Books to McGraw-Hill and started Osborne Computer. Its

first product, the 24-pound Osborne 1 Portable, boasted a low price of

$1795.

More important, Osborne established the practice of bundling software

- in spades. The Osborne 1 came with nearly $1500 worth of programs:

WordStar, SuperCalc, BASIC, and a slew of CP/M utilities.

Business was looking good until Osborne preannounced its next version while sitting on a warehouse full of Osborne 1s. Oops. Reorganization under Chapter 11 followed soon thereafter.

. Xerox Star

This is the system that launched a thousand innovations in 1981. The

work of some of the best people at Xerox PARC (Palo Alto Research Center)

went into it. Several of these - the mouse and a desktop GUI with icons -

showed up two years later in Apple's Lisa and Macintosh computers. The Star

wasn’t what you would call a commercial success, however. The main problem

seemed to be how much it cost. It would be nice to believe that someone

shifted a decimal point somewhere: The pricing started at $50,000.

. IBM PC

Irony of ironies: someone at mainframe-centric IBM recognized the business potential in personal computers. The result was the landmark 1981 announcement of the IBM PC. Thanks to an open architecture, IBM's clout, and Lotus 1-2-3 (announced one year later), the PC and its progeny made business micros legitimate and transformed the personal computer world.

The PC used Intel's 16-bit 8088, and for $3000, it came with 64 KB of RAM and a 5 1/4-inch floppy drive. The printer adapter and monochrome monitor were extras, as was the color graphics adapter.

. Compaq Portable

Compaq’s Portable almost single-handedly created the PC clone market.

Although that was about all you could do with it single-handedly - it

weighed a ton. Columbia Data Products just preceded Compaq that year with

the first true IBM PC clone but didn’t survive. It was Compaq’s quickly

gained reputation for engineering and quality, and its essentially 100

percent IBM compatibility (reverse-engineering, of course), that

legitimized the clone market. But was it really designed on a napkin?

. Radio Shack TRS-80 Model 100

Years before PC-compatible subnotebook computers, Radio Shack came out

with a book-size portable with a combination of features, battery life,

weight, and price that is still unbeatable. (Of course, the Z80-based Model

100 didn’t have to run Windows.)

The $800 Model 100 had only an 8-row by 40-column reflective LCD

(large at the time) but supplied ROM-based applications (including text

editor, communications program, and BASIC interpreter), a built-in modem,

I/O ports, nonvolatile RAM, and a great keyboard. Weighing under 4 pounds,

and with a battery life measured in weeks (on four AA batteries), the Model

100 quickly became the first popular laptop, especially among journalists.

With its battery-backed RAM, the Model 100 was always in standby mode,

ready to take notes, write a report, or go on-line. NEC's PC 8201 was essentially the same Kyocera-manufactured system.

. Apple Macintosh

Whether you saw it as a seductive invitation to personal computing or a cop-out to wimps who were afraid of a command line, Apple's Macintosh and its GUI generated even more excitement than the IBM PC. Apple's R&D people were inspired by critical ideas from Xerox PARC (and practiced on Apple's Lisa) but added many of their own ideas to create a polished product that changed the way people use computers.

The original Macintosh used Motorola’s 16-bit 68000 microprocessor. At

$2495, the system offered a built-in high-resolution monochrome display,

the Mac OS, and a single-button mouse. With only 128 KB of RAM, the Mac was

underpowered at first. But Apple included some key applications that made

the Macintosh immediately useful. (It was MacPaint that finally showed

people what a mouse is good for.)

. IBM AT

George Orwell didn’t foresee the AT in 1984. Maybe it was because Big

Blue, not Big Brother, was playing its cards close to its chest. The IBM AT

set new standards for performance and storage capacity. Intel's blazingly

fast 286 CPU running at 6 MHz and 16-bit bus structure gave the AT several

times the performance of previous IBM systems. Hard drive capacity doubled

from 10 MB to 20 MB (41 MB if you installed two drives - just don't ask how

they did the math), and the cost per megabyte dropped dramatically.

New 16-bit expansion slots meant new (and faster) expansion cards but

maintained downward compatibility with old 8-bit cards. These hardware

changes and new high-density 1.2-MB floppy drives meant a new version of PC-

DOS (the dreaded 3.0).

The price for an AT with 512 KB of RAM, a serial/parallel adapter, a

high-density floppy drive, and a 20-MB hard drive was well over $5000 - but

much less than what the pundits expected.

. Commodore Amiga 1000

The Amiga introduced the world to multimedia. Although it cost only

$1200, the 68000-based Amiga 1000 did graphics, sound, and video well

enough that many broadcast professionals adopted it for special effects.

Its sophisticated multimedia hardware design was complex for a personal

computer, as was its multitasking, windowing OS.

. Compaq Deskpro 386

While IBM was busy developing (would "wasting time on" be a better phrase?) its proprietary Micro Channel PS/2 systems, clone vendors ALR and Compaq wrested away control of the x86 architecture and introduced the first 386-based systems, the Access 386 and the Deskpro 386. Both systems maintained backward compatibility with the 286-based AT.

Compaq’s Deskpro 386 had a further performance innovation in its Flex

bus architecture. Compaq split the x86 external bus into two separate

buses: a high-speed local bus to support memory chips fast enough for the

16-MHz 386, and a slower I/O bus that supported existing expansion cards.

. Apple Macintosh II

When you first looked at the Macintosh II, you may have said, "But it looks just like a PC." You would have been right. Apple decided it was

wiser to give users a case they could open so they could upgrade it

themselves. The monitor in its 68020-powered machine was a separate unit

that typically sat on top of the CPU case.

. NeXT NeXTstation

UNIX had never been so easy to use, and only now, 10 years later, are we getting back to that level. Unfortunately, Steve Jobs's cube never developed the software base it needed for long-term survival. Nonetheless, it survived as an inspiration for future workstations.

Priced at less than $10,000, the elegant Nextstation came with a 25-

MHz 68030 CPU, a 68882 FPU, 8 MB of RAM, and the first commercial magneto-

optical drive (256-MB capacity). It also had a built-in DSP (digital signal

processor). The programming language was object-oriented C, and the OS was

a version of UNIX, sugarcoated with a consistent GUI that rivaled Apple's.

. NEC UltraLite

NEC's UltraLite is the portable that put "subnotebook" into the lexicon.

Like Radio Shack’s TRS-80 Model 100, the UltraLite was a 4-pounder ahead of

its time. Unlike the Model 100, it was expensive (starting price, $2999),

but it could run MS-DOS. (The burden of running Windows wasn’t yet thrust

upon its shoulders.)

Fans liked the 4.4-pound UltraLite for its trim size and portability,

but it really needed one of today’s tiny hard drives. It used battery-

backed DRAM (1 MB, expandable to 2 MB) for storage, with ROM-based

Traveling Software's LapLink to move stored data to a desktop PC.

Foreshadowing PCMCIA, the UltraLite had a socket that accepted credit-

card-size ROM cards holding popular applications like WordPerfect or Lotus

1-2-3, or a battery-backed 256-KB RAM card.

. Sun SparcStation 1

It wasn't the first RISC workstation, nor even the first Sun system to use Sun's new SPARC chip. But the SparcStation 1 set a new standard for price/performance, churning out 12.5 MIPS at a starting price of only $8995 - about what you might spend for a fully configured Macintosh. Sun sold lots of systems and made the words SparcStation and workstation synonymous in many people's minds.

The SparcStation 1 also introduced S-Bus, Sun’s proprietary 32-bit

synchronous bus, which ran at the same 20-MHz speed as the CPU.

. IBM RS/6000

Sometimes, when IBM decides to do something, it does it right. (Other times... well, remember the PCjr?) The RS/6000 allowed IBM to enter the workstation market. The RS/6000's RISC processor chip set (RIOS) racked up speed records and introduced many to the term superscalar. But its price was

more than competitive. IBM pushed third-party software support, and as a

result, many desktop publishing, CAD, and scientific applications ported to

the RS/6000, running under AIX, IBM’s UNIX.

A shrunken version of the multichip RS/6000 architecture serves as the

basis for the single-chip PowerPC, the non-x86-compatible processor with

the best chance of competing with Intel.

. Apple Power Macintosh

Not many companies have made the transition from CISC to RISC this well. The Power Macintosh represents Apple's well-planned and successful leap to bridge two disparate hardware platforms. Older Macs run Motorola's 680x0 CISC line, which is running out of steam; the Power Macs run existing 680x0-based applications yet provide PowerPC performance, a combination that sold over a million systems in a year.

. IBM ThinkPad 701C

It is not often anymore that a new computer inspires gee-whiz sentiment, but IBM's Butterfly subnotebook does, with its marvelous expanding keyboard. The 701C's two-part keyboard solves the last major piece in the puzzle of building a usable subnotebook: how to provide comfortable touch-typing. (OK, so the floppy drive is still external.)

With a full-size keyboard and a 10.4-inch screen, the 4.5-pound 701C

compares favorably with full-size notebooks. Battery life is good, too.


THE DEVELOPMENT OF COMPUTERS IN UKRAINE AND THE FORMER USSR

The government and the authorities paid serious attention to the development of the computer industry right after the Second World War. The leading bodies considered this task to be one of the principal ones for the national economy.

Up to the beginning of the 1950s there were only small production facilities, which specialized in producing accounting and punched-card machines. Electronic numerical computer engineering was only just arising, and the production capacities for it were close to nought.

The first serious steps in developing the production base were made in the late 1950s, when work on the first industrial models of electronic computing machines was completed and the M-20, "Ural-1", and "Minsk-1" were created. Together with their semiconductor successors of the 1960s (the M-220, "Ural-11"-"Ural-14", "Minsk-22", and "Minsk-32"), these remained the main machines in the USSR until third-generation computers were put into serial production, that is, until the early 1970s.

In the 1960s the research and assembly base was enlarged. As a result of these measures, almost all the work connected with creating semiconductor electronic computing machines and putting them into serial production was finished. That made it possible to stop production of first-generation machines from 1964 onwards.

Over the next decades a whole branch of computer engineering was created, and important steps were taken to widen the production capacities for third-generation machines.


KIEV

THE HOME CITY OF THE MESM

The MESM was conceived by S.A. Lebedev as a model of a Big Electronic Computing Machine (BESM). At first it was called the Model of the Big Electronic Computing Machine, but later, in the process of its creation, it became clearly expedient to transform it into a small computer. For that reason input-output devices and magnetic drum storage were added, the register capacity was enhanced, and the word "Model" was changed to "Malaya" (small).

S.A. Lebedev was invited to head the Institute of Energetics in Kiev. A year later, when the Institute was divided into two parts - the electrotechnical department and the department of heat-and-power engineering - Lebedev became the director of the first one. He also added his laboratory of analogue computation to the already existing electrotechnical laboratories. He at once began to work on computer science instead of the usual, routine research into engineering means of stabilization and the structures of automated devices. Lebedev was awarded the State Prize of the USSR. From the autumn of 1948 Lebedev directed his laboratory towards creating the MESM. The most difficult part of the work was the practical construction of the MESM. It must have been only the many-sided experience of the researchers that allowed the scientist to fulfill the task so well; one inaccuracy, however, was made: a hall on the ground floor of a two-storey building was assigned for the MESM, and when at last the machine was assembled and switched on, its 6,000 red-hot vacuum tubes turned the hall into the "tropics", so part of the ceiling had to be removed to lower the temperature.

In the autumn of 1951 the machine executed a complex program quite stably.

THE MESM WITH SOME OF THE PERSONNEL (KIEV, 1951)

Finally all the tests were over, and on December 15 the MESM was put into operation.

If we recall how short a time it took to design, assemble, and debug the MESM - two years - and take into consideration that only 12 people (including Lebedev), helped by 15 engineers, took part in its creation, we shall see that S.A. Lebedev and his team accomplished a feat (200 engineers and many workers, besides the 13 main leaders, took part in the creation of the first American computer, the ENIAC).

As life has shown, the foundations of computer building laid down by Lebedev are used in modern computers without any fundamental changes. Nowadays they are well known (a small illustrative sketch follows this list):

. such devices as arithmetic, memory, input-output, and control units should be part of a computer's architecture;

. the computing program is encoded and stored in the memory as numbers;

. the binary system should be used for encoding numbers and commands;

. the computations should be made automatically, on the basis of the program stored in the memory and of operations over commands;

. besides arithmetic, logical operations are used: comparison, conjunction, disjunction, and negation;

. a hierarchical memory organization is used;

. numerical methods are used for solving problems.
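Taken together, these principles can be illustrated by the toy stored-program machine sketched below in Python (the numeric encoding and the opcodes are invented for illustration and do not describe the MESM or any real instruction set): the program is kept in the same memory as the data, every command is encoded as a number, and a fetch-execute loop carries out the computation automatically.

    # A toy stored-program machine (illustrative only).
    # Each command is encoded as a single number: opcode * 100 + address,
    # and stored in the same memory array as the data it operates on.

    HALT, LOAD, ADD, STORE, JUMP_IF_ZERO = 0, 1, 2, 3, 4

    def run(memory):
        acc = 0   # accumulator for arithmetic results
        pc = 0    # program counter
        while True:
            opcode, address = divmod(memory[pc], 100)  # fetch and decode a numeric command
            pc += 1
            if opcode == LOAD:             # acc <- memory[address]
                acc = memory[address]
            elif opcode == ADD:            # acc <- acc + memory[address]
                acc += memory[address]
            elif opcode == STORE:          # memory[address] <- acc
                memory[address] = acc
            elif opcode == JUMP_IF_ZERO:   # a logical comparison steering control flow
                if acc == 0:
                    pc = address
            elif opcode == HALT:
                return memory

    # Program and data share one memory: cells 0-3 hold commands, cells 4-6 hold data.
    # The program computes memory[6] = memory[4] + memory[5].
    memory = [
        1 * 100 + 4,   # LOAD 4
        2 * 100 + 5,   # ADD 5
        3 * 100 + 6,   # STORE 6
        0,             # HALT
        30, 12, 0,     # data cells 4-6
    ]
    print(run(memory)[6])   # prints 42

Because the commands are ordinary numbers in the same store as the data, the machine that adds data words could, in principle, also rewrite its own program - which is exactly the stored-program property discussed in the first part of this essay.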

The main fault of the 70s

or

the years of “might-have-been hopes”


The great accumulated experience in creating computers, and the thorough comparison of our domestic achievements with the new examples of foreign computer technology, convinced the scientists that it was possible to create computing equipment of a new generation that would meet world standards. Many outstanding Ukrainian scientists of that time - Lebedev, Dorodnitsin, Glushkov, and others - were of that opinion. They proceeded from quite a favorable situation in the country.

The computerization of the national economy was considered one of the most essential tasks. A decision was taken to create the Unified System of computers - machines of a new generation built on integrated circuits.

The USA was the first to create families of computers. In 1963-64 the IBM Company worked out the IBM-360 system. It comprised models of different capacities, for which a wide range of software was created.

A decision concerning the third generation of computers (their

structure and architecture) was to be made in the USSR in the late 60s.

But instead of making a decision on scientific grounds concerning the future Unified System of computers, the Ministry of the Electronic Industry issued an administrative order to copy the IBM-360 system. The leaders of the Ministry did not take into consideration the opinion of the country's leading scientists.

Despite the fact that there were enough grounds for expecting the 70s to bring great new progress, those years turned out to be a step back, owing to the faulty course dictated from above by the highest authorities.


THE COMPARISON OF THE COMPUTER DEVELOPMENT

IN THE USA AND UKRAINE

At the time when computer science was just arising, these two countries were among the most noticeably influential. There were a lot of talented scientists and inventors in both of them. But the situation in Ukraine (which at that time was one of the 15 republics of the former USSR) was complicated, on the one hand, by the consequences of the Second World War and, on the other hand, by the fact that for a certain period cybernetics and computer science were not acknowledged. Of course, this later became a thing of the past, but it nevertheless had a negative effect on the development of Ukrainian computing.

It should also be noticed that in America more attention was paid to the development of computers for civil and, later, personal use, whereas in Ukraine attention was mainly focused on military and industrial needs.

Another interesting aspect of Ukrainian computer development was the course taken in the 70s, when the "sovietizing" of the IBM-360 system became the first step towards weakening the positions achieved by Soviet machine building in the first two decades of its development. The next step, which led to a further lag, was the mindless copying and putting into production, by the Soviet Ministry of the Electronic Industry, of subsequent American developments in the field of microprocessor equipment.

The natural final stage has been the purchase of foreign computers in enormous quantities in recent years, pushing our domestic research and development, and the computer-building industry as a whole, into the deep background.



CONCLUSION

Having analyzed the development of computer science in the two countries, I have found some similar and some distinctive features in the emergence of computers.

First of all, I would like to say that at the first stages the two countries rubbed shoulders with each other. But then, at a certain stage, the USSR made a sad mistake in copying the out-of-date IBM-360 technology. Assessing, from today's point of view, the discussion of possible ways of developing computer technology in the former USSR in the late 1960s and early 1970s, it can be seen that we chose one of the worse ways, if not the worst one. The only progressive way was to build on our domestic research and to collaborate with West European companies in working out the new generation of machines. Then we would have reached the world level of production, and we would have had a real base for further development together with the leading European companies.

Unfortunately, the last twenty years may be called the years of "unrealized possibilities". Today it is still possible to change the situation, but tomorrow it will be too late.

Will the new times come? Will there be a new renaissance of science, engineering, and the national economy such as there was in the post-war period? Only one thing remains for us: to wait, to hope, and to do our best to reach the final goal.

Bibliography:

1. B.M. Malinovsky, The History of Computing Technology in Persons (in Ukrainian), Kiev, 1995.

2. Stephen G. Nash “A History of Scientific Computing”, ACM Press

History Series, New York, 1990.

3. Encyclopedia of Cybernetics (in Ukrainian), Kiev, 1985.

4. The America House ProQuest Database: Byte Magazine, September 1995.

5. William Aspray, Charles Babbage Institute Reprint Series in the

History of Computing 7, Los Angeles, 1985.

6. D.J.Frailey “Computer Architecture” in Encyclopedia of Computer

Science.

7. Stan Augarten “Bit by Bit: An Illustrated History of Computers”, New

York, 1984.

8. Michael R. Williams “A History of Computing Technology”, Englewood

Cliffs, New Jersey, 1985.

9. From BESM to Super-Computers: Pages of the History of the S.O. Lebedev Institute of ITM and OT of the Academy of Sciences of the Ukrainian SSR in the Recollections of Its Staff (in Ukrainian), edited by G.G. Ryabova, 1988.

