I’ve spent the last 20 years developing software, managing software development, and doing software systems analysis full-time. I’ve been programming since I was 16. The first time I went to college, my major was computer science. Since then, I’ve seen one major revolution in the computer industry, which essentially blew up everything that came before it.
That first revolution was the advent of the PC. I started college in 1982, when there were very, very few PCs. Our computer lab had a couple of Apple IIs, Osbornes, and TRS-80s. We played with them. They were cute. But we did our business on a DEC 1133 mainframe with 50 dumb terminals and an old IBM 64 that only took punch card input. Programming was done by an elite group of specialists who logged into the mainframe and wrote in plain text, or who pulled up to a massive IBM card punch machine and typed in their programs one line at a time, punching a card for each line of code.
The punch cards were the worst. At least on the DEC, you could compile your program while writing it. With the punch cards, you'd stack them up all in order—being very careful to write the card number on the back in magic marker, in case you dropped your card stack and had to re-sort the cards—and turn them in at the computing center. Then you'd go back the next day to learn that you'd forgotten to type a comma in card 200, and you had to re-type the card. You'd turn your stack in for another overnight wait, only to learn you'd missed a comma in card 201.
It was worse than being stabbed.
The PC killed all that. It killed the idea of programmers being this small cadre of elite specialists. And once PCs could talk to each other via a network, they killed the idea that software development had to take a long time, with shadowy specialists toiling away on one large machine in the bowels of the building. By 1995, self-taught amateurs were programming database applications to handle their small business inventory in FoxPro. Corporations employed hordes of programmers to build complicated database applications.
In fact, for all the snide attitudes they get, the people at Microsoft pretty much single-handedly created the worldwide software development community we have today. The Visual Basic for Applications programming interface for Microsoft Office, Visual Basic, and, eventually, the .NET Framework allowed millions of people to learn programming and create applications. Practically every corporation with more than 100 people has its own programming team, building custom desktop software for the organization. Millions of people are employed as freelance software developers. Yes, there are other programming technologies out there, but none of them—none—have had the impact on democratizing software development that Microsoft's has had.
There are still some mainframe computers, of course. IBM makes 90% of them. It’s a small, mostly irrelevant market, except for governments, universities, and very large corporations. Every computer used to be a mainframe. Now, almost none of them are.
We’ve gotten used to this software development landscape. Even the advent of the Internet—while revolutionary in many ways—has not been particularly revolutionary in terms of software development. Until now, the changes to software development caused by the internet have been evolutionary—building on existing technologies. In fact, in some ways, the internet has been devolutionary. Thirty years ago, workers logged in to dumb terminals with no processing power, using a mainframe to do all the work. Similarly, using the internet until now has mainly meant opening up a dumb web browser like Firefox or Internet Explorer to talk to a web server where all the magic happens. The browser has really been just a display device. The web server takes a request from a browser, retrieves information, gets database data, formats it, and sends it back in simple text so the browser can display it.
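The cycle described above—browser sends a request, server fetches the data, formats it, and returns plain text for the browser to display—can be sketched in a few lines. This is a minimal illustration, not any particular product's code; the inventory data and handler names are made up for the example.

```python
# Minimal sketch of the "dumb browser, smart server" cycle:
# the server takes a request, looks up data, formats it as
# simple HTML text, and the browser merely displays it.
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for a database lookup (hypothetical data).
INVENTORY = {"widgets": 42, "gadgets": 7}

def render_page(data):
    """Format query results as simple HTML text for the browser."""
    rows = "".join(f"<tr><td>{k}</td><td>{v}</td></tr>"
                   for k, v in data.items())
    return f"<html><body><table>{rows}</table></body></html>"

class InventoryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 1. take the request  2. get the data
        # 3. format it         4. send text back to display
        body = render_page(INVENTORY).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), InventoryHandler).serve_forever()
```

All the "magic" happens in `render_page` on the server; the browser contributes nothing but the display, exactly like a dumb terminal talking to a mainframe.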
Web development is still in its infancy. All of the smooth developer productivity tools we’ve become used to aren’t there yet. It’s still a bit of a pain to program, and it takes longer than we’ve been used to. Indeed, it’s very much like programming was back in the 1990s, when the developer had to code everything.
For instance, in Microsoft’s .NET development environment today, developers have become used to just dragging a text box or check box onto a form and having the software development environment write hundreds of lines of code for them. In a Windows application, a data-driven form that connects to a SQL Server database can be created almost entirely through a drag-and-drop interface, where the development environment writes thousands of lines of code behind the scenes. The developer only has to write about 15 lines of code to finish the form so it works.
This is the second revolution in computing that will change everything we’ve become used to. Right now, a software application generally has to be installed on your computer. A software application designed for Windows won’t install on a Mac or Linux machine. Well, that’s all going away. Browser-based software is platform independent, which is to say, it doesn’t matter what your computer’s operating system is. Do you need an office suite? Google already has a browser-based one online, and there’s probably someone, somewhere in the world, working on a desktop-based version right now. Need to access your database to crunch some data? No need for Access or FileMaker Pro. We can do that in the browser, too. In fact, we’re pretty much at the point where there is no commonly-done task that can’t be done in the browser.
We can now make almost—almost—any software application you can think of, and anyone, anywhere in the world, can run it, no matter what computer they’ve got. This is the second revolution in computing that I’ve seen in my lifetime, and it’s going to change everything about how applications are developed. Software that is dependent on the operating system is essentially as dead as the mainframe. Microsoft’s advantage in building a software development community? That’s dead now. Desktop applications? Dead. In 5 years there’ll be nothing that you can’t do in a browser. On a phone. I think that means the days of Windows having a natural monopoly on corporate computing are over, too.
There’ll still be desktop systems, of course. And, as long as your system can act as a web server—and pretty much any desktop or laptop can—you’ll still have software that you run on the desktop. After all, you may want to work locally instead of on a big server run by someone like Google or Microsoft, who’ll be doing God knows what with any data you store on their servers. But your choice of computer or operating system will not be driven by whether the software you want to use is available for it. In fact, in 10 years, when you think of a desktop, you may just be thinking of a keyboard and display monitor that you plug your phone into to work more conveniently. Assuming you don’t just talk into it to get things done.
If you’re working in software development and aren’t embracing the coming wave of platform-independent, browser-based programming, you’re not doing yourself any favors. It may take another 10 years or so, but the technology you’re working on right now is dying. For someone like me, who’s invested decades in Windows applications development, it’s a bit sad to see all that accumulated knowledge and experience passing away. It’s not easy to move into an entirely new development technology and go through all of its growing pains. But I don’t see any choice.
Thirty years ago, everything I learned in college about how to work with computers got tossed out the window. All of those hours struggling to write RPG programs on IBM punch cards, learning about mainframes…all of it was utterly useless within a few years when the PC came out. Now it’s happening again.
I remember how, back in the 90s, all the old mainframe Unix guys were grumpy about having to kiss their Unix machines good-bye. In 10 years, I’m not going to be one of those old Unix guys.