Uday Ramteerthkar

Computer History As I Have Experienced It


In the fast-paced world of computers and AI, it is hard to believe that my first encounter with a computer was in the form of a small minicomputer – only a server unit with paper tape as its input! This was in the early 1980s, while I was pursuing my engineering degree. It took assembly instructions on a paper tape as input and gave its output on paper tape as well. There was no display, only a few red LEDs blinking very fast to show that some vigorous computing was being done! Obviously, we lost interest soon enough after trying out a few add/subtract types of programs. However, the final year of engineering turned out to be more interesting, since we got to experiment with microprocessors; they were the real introduction to minicomputers and laid the base for trying out many things. As our final-year project, we developed an EPROM programmer using a Zilog microprocessor kit.

In 1982, I joined IIT Bombay for my MTech. There I got a lot more exposure to computing as well as to large computers. The computer science department had the EC-1030 Russian computer – third-generation LSI technology (equivalent to the IBM 360). I still remember that this huge computer used to be in a caged room where only a few privileged people (admins) were allowed to enter. We could hardly see this computer from outside – we were allowed only up to the reception room, where we used to submit our card decks as programs (in Fortran). We would come to know whether the program had compiled and executed successfully only a few days later, as there used to be a long queue. So, it was very important to review the code (the punched instructions on the cards) carefully so that we would not have to make many visits to get it executed successfully! In fact, we used to feel embarrassed to even visit twice to execute a program successfully.

Such were the days; life was tough. It is really astounding that computers have come such a long way and have shrunk into laptops and iPads, and I realise how lucky the current generation is to view their programs and outputs on a nice GUI!

After completing my MTech, I had a choice to join either a software or a hardware company. My role model advised me to choose software, as he envisioned more opportunities and a successful future in it. Obviously, I joined a software company, with the hope that I would learn both hardware (on my own!) and software. The initial days were hard and not very encouraging, because I had to work with the same type of computers with a card-deck interface. But once we were allocated to the system support group, we had an opportunity to work with sophisticated systems and software. We were using Burroughs computers with the DMSII network database and ALGOL as the programming language. We were supposed to maintain and enhance ALGOL-based generators for report and edit programs. It was an extremely challenging assignment, as the generator software in ALGOL and macros ran into 4,000-5,000 lines and was tough to understand. But the concept of specification-driven generation was intriguing, and it remained my source of inspiration throughout the rest of my 35+ year software career!
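
For readers who have not come across the idea, here is a minimal sketch of what specification-driven generation means – in Python rather than the original ALGOL, and with a purely hypothetical report specification and field names: a small declarative spec drives a generator that emits the report program, instead of the program being written by hand.

    # Minimal sketch of specification-driven generation (illustrative only:
    # the original generators were written in ALGOL plus macros, and the
    # report/field names below are hypothetical).

    REPORT_SPEC = {
        "title": "Customer Balance Report",
        "fields": [
            {"name": "cust_id", "label": "Customer", "width": 10},
            {"name": "balance", "label": "Balance",  "width": 12},
        ],
    }

    def generate_report_program(spec):
        """Emit the source code of a report program from the declarative spec."""
        header = " | ".join(f["label"].ljust(f["width"]) for f in spec["fields"])
        row_fmt = " | ".join(
            "{%s:<%d}" % (f["name"], f["width"]) for f in spec["fields"]
        )
        return "\n".join([
            "def print_report(records):",
            "    print(%r)" % spec["title"],
            "    print(%r)" % header,
            "    for rec in records:",
            "        print(%r.format(**rec))" % row_fmt,
        ])

    if __name__ == "__main__":
        # The generator's output is itself a program, ready to be saved and run.
        print(generate_report_program(REPORT_SPEC))

The real generators, of course, emitted ALGOL report and edit programs from far richer specifications, but the principle was the same: change the specification, regenerate, and the code follows.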

In 1986-87, I got an opportunity to work on a state-of-the-art platform and software. Our team developed a CASE (Computer Aided Software Engineering) tool on the IBM 3090 with DB2 – one of the first few commercial relational databases at the time. In fact, this tool had already been developed on the Burroughs platform with the DMSII network database. Our job was to migrate it to the IBM platform and DB2, with many more enhancements. I would say this software was well ahead of its time. It had support for many modelling techniques (data modelling, data flow diagramming, an extensible modelling framework, etc.), coupled with reverse engineering modules to create models from legacy code. Later on, this became a foundation for many tools for model-driven development and maintenance as well as for reverse engineering platforms. In fact, we could see its major utility in resolving Y2K problems at the turn of the century.

In the 1990s, personal computers and desktops (on the Intel 386, 486, and Pentium) arrived in a big way and shifted power to individual programmers. Object-oriented programming and graphical user interfaces became the way to develop applications of greater complexity while keeping the user interfaces smooth and easy to use. I remember that the super minis/micros (AS/400, DEC, etc.) were getting replaced by Intel-based PCs and servers with more computing power.



Migrating to Windows, Unix, and GUI-based applications, with C, C++, and Java as the application programming languages, became the main theme for developers around the world. While we continued to enhance meta-model-based tools and program generation for Java, GUIs, and object-relational database access layers, we could see many 3GL and 4GL tools mushrooming in the market to improve developer productivity. The next 10 years saw a focus on developing large and medium-sized applications to address most domains. Subsequently, the development pace decreased, while the focus shifted to maintaining and enhancing existing applications in a robust and consistent way. Huge data centres with all kinds of applications were set up for large organizations, and software services companies started focusing on providing solutions to manage these centres efficiently and cost-effectively.

Meanwhile, in parallel during the 1990s, Artificial Intelligence (AI) also emerged as a stream for handling complex reasoning problems. Prolog- and Lisp-based modules were developed to handle rule-based functionality. Somehow this theme did not catch on fast and remained dormant for some time. But now that compute capacity is increasing multi-fold, handling large data (Big Data) and the associated analytics and reasoning using neural networks has become very much possible. AI has picked up significantly, and machine learning and deep learning have become the state-of-the-art technologies for resolving problems and recommending solutions in many domains – such as health/medicine, space research, utilities, infrastructure, and finance. For tackling large data as well as complex rules, parallel processing and GPUs are also being tried by multiple organizations.

In recent times (2015-20), I had an opportunity to work on a framework that combines model-based generation, analytics, and AI techniques to automate many of the mundane tasks, as well as to forecast/predict behavioural issues and recommend appropriate solutions.

Currently, the cloud computing era is in vogue. Organizations are moving to the cloud for agile, cost-effective operations, as well as to avail themselves of the latest tools, software, large data-processing capacity, etc. The future of computing holds immense possibilities. AI, data science, and cloud computing will definitely be the mainstay at least through this decade.

As I look back, I consider myself very lucky to have traversed and experienced all types of computing systems, databases, programming techniques, and new paradigms such as model-driven systems and AI. It has been a very intriguing, interesting, and enlightening journey all along!


