COMPUTE! ISSUE 123 / NOVEMBER 1990

PC VIEW

CLIFTON KARNES

On June 23, 1969, IBM announced that it would no longer sell its mainframe hardware and software together as a package. Until that time, machine makers like IBM supplied all the software their customers needed. This "unbundling" marked the beginning of the software industry.

For the next decade, software sales for the mainframe and minicomputer markets grew steadily. But with the introduction of the IBM PC in 1981, the software scene erupted. In a short time, there were millions of hungry PCs waiting to be fed. Competition among software companies became fierce, and the power and quality of software products improved dramatically.

This improvement in software has come at a price, however. For every enhancement in program design, there's been an increase in program size.

Take our operating system, DOS. The first version of DOS—version 1.0, released in 1981—used 13,312 bytes for the three files that make up the operating system kernel—IBMBIO, IBMDOS, and COMMAND.COM. By 1983, DOS 2.0 was soaking up 40,960 bytes. In 1984, DOS 3.0 clocked in at 60,416 bytes. And DOS 4.0, released in 1988, used a whopping 108,270 bytes! In seven years, DOS grew to more than eight times its original size.

You see this hyper-growth in program size everywhere. My first C compiler fit on one 360K floppy disk along with an editor and source files. This wasn't unusual for software in 1983. Microsoft's new C 6.0, with all the bells and whistles installed, uses about 14MB of hard disk space. That's almost a fortyfold increase in program size in seven years.

It's certainly true that both DOS and C have matured and become much more powerful, but are they eight and forty times more powerful?
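For the curious, here's a quick back-of-the-envelope check of those figures, written in C. The sizes are the ones cited above; the only assumptions are that 360K means 360 x 1024 bytes and 14MB means 14 x 1024 x 1024 bytes.

/* growth.c -- checks the growth figures cited in this column.
   Each result is the later size divided by the original size. */
#include <stdio.h>

static void report(const char *what, double before, double after)
{
    /* Growth expressed as a multiple of the original size. */
    printf("%s: %.0f -> %.0f bytes, %.1f times the original\n",
           what, before, after, after / before);
}

int main(void)
{
    report("DOS kernel, 1.0 to 4.0", 13312.0, 108270.0);
    report("C compiler, 1983 to 1990", 360.0 * 1024, 14.0 * 1024 * 1024);
    return 0;
}

Run it and you get about 8.1 times for DOS and about 39.8 times for the compiler—the "eight and forty times" above.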

Software's size and complexity are simply straining the hardware engine. If you're running applications that use megabytes of hard disk space, you need a huge hard disk. A 20MB disk isn't enough. And 40MB is too small if you use more than a couple of programs. (And who doesn't?)

The rule of thumb I follow is that any program that uses more than ten percent of your hard disk real estate is too big for your system. By that standard, C 6.0 requires a 140MB hard disk not to cramp your style.
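The rule reduces to one multiplication, sketched below in C. The 14MB footprint is C 6.0's figure from above; everything else is just the rule as stated.

/* rule.c -- the ten-percent rule of thumb: a program that takes
   more than a tenth of your hard disk is too big for the system. */
#include <stdio.h>

int main(void)
{
    double install_mb = 14.0;               /* e.g., C 6.0, full install */
    double min_disk_mb = install_mb * 10.0; /* disk needed to stay under 10% */

    printf("A %.0fMB install wants at least a %.0fMB hard disk.\n",
           install_mb, min_disk_mb);
    return 0;
}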

Beyond the sheer size of the installed program, more and more software is demanding extra memory beyond the conventional 640K. Many programs require a megabyte or more of extended or expanded memory. And this trend is sure to continue.

And because the size of programs is soaring, installing new software has changed from a simple matter of copying the files from one or two disks to an event that can literally take hours and can easily involve 20 or more disks.

It's important to remember that it doesn't have to be this way. Software developers have gradually replaced assembly language, which creates the smallest, fastest programs, with C and Pascal, languages that are easier to program in and maintain but that produce code that's larger and slower.

There are certain trends, however, that may hold back the ever-increasing bloat in program size. One is the laptop market. There are 2 million laptops out there, and three-fourths of them are floppy-only systems. If the laptop market grows and floppy-based machines remain a large share of it, the software industry will certainly rise to serve them.

Another important factor in this equation is the home market. If home computer use explodes with machines like IBM's new 10-MHz 286-based PS/1 and Tandy's new 10-MHz 8086-based 1000 RL, software developers will encounter another huge market of machines that are physically incapable of running most of today's software.

If this market is large enough—and it has the potential to be—the resulting competition to serve it could create a renaissance in lean, fast applications. If that happens, all PC users will benefit.