Free Markets, Free People

Welcome to the future

I’ve spent the last 20 years developing software, managing software development, and doing software systems analysis full-time. I’ve been programming since I was 16. The first time I went to college, my major was computer science.  Since then, I’ve seen one major revolution in the computer industry, which essentially blew up everything that came before it.

That first revolution was the advent of the PC. I started college in 1982, when there were very, very few PCs. Our computer lab had a couple of Apple IIs, Osbornes, and TRS-80s. We played with them. They were cute. But we did our business on a Digital DEC 1133 mainframe with 50 dumb terminals and an old IBM 64 that only took punch card input. Programming was done by an elite group of specialists who logged into the mainframe and wrote in plain text, or who pushed up to a massive IBM card punch machine and typed in their programs one line at a time, punching a card for each line of code.

The punch cards were the worst. At least on the DEC, you could compile your program while writing it. With the punch cards, you’d stack them up all in order—being very careful to write the card number on the back in magic marker, in case you dropped your card stack and had to re-sort the cards—and turn them in at the computing center. Then you’d go back the next day to learn that you’d forgotten to type a comma in card 200, and you had to re-type the card. You’d turn your stack in for another overnight wait, only to learn you’d missed a comma in card 201.

It was worse than being stabbed.

The PC killed all that. It killed the idea of programmers being this small cadre of elite specialists. And once PCs could talk to each other via a network, they killed the idea that software development had to take a long time, with shadowy specialists toiling away on one large machine in the bowels of the building.  By 1995, self-taught amateurs were programming database applications to handle their small business inventory in FoxPro.  Corporations employed hordes of programmers to build complicated database applications.

In fact, for all the snide attitudes they get, the people at Microsoft pretty much single-handedly created the worldwide software development community we have today. The Visual Basic for Applications programming interface for Microsoft Office, Visual Basic, and, eventually, the .NET Framework allowed millions of people to learn programming and create applications. Practically every corporation with more than 100 people has its own programming team, building custom desktop software for the organization. Millions of people are employed as freelance software developers. Yes, there are other programming technologies out there, but none of them—none—have had the impact on democratizing software development that Microsoft’s has had.

There are still some mainframe computers, of course. IBM makes 90% of them. It’s a small, mostly irrelevant market, except for governments, universities, and very large corporations. Every computer used to be a mainframe. Now, almost none of them are.

We’ve gotten used to this software development landscape. Even the advent of the Internet—while revolutionary in many ways—has not been particularly revolutionary in terms of software development. Until now, the changes to software development caused by the internet have been evolutionary—building on existing technologies. In fact, in some ways, the internet has been devolutionary. Thirty years ago, workers logged in to dumb terminals with no processing power, using a mainframe to do all the work.  Similarly, using the internet until now has mainly meant opening up a dumb web browser like Firefox or Internet Explorer to talk to a web server where all the magic happens. The browser has really been just a display device. The web server takes a request from a browser, retrieves information, gets database data, formats it, and sends it back in simple text so the browser can display it.

Until now.

In the past year or so, a number of technologies have hit the market that make the browser the center of processing instead of the web server. Now, instead of running programming code on the server to do all the hard processing work, the browser can run JavaScript code and do all the processing locally. This has long been theoretically possible, but it was…difficult. Now, an entirely new kind of programming is possible, with JavaScript tools like Node.js, Google’s AngularJS, and database technology that departs from the traditional relational database, like MongoDB. (Though maybe not MongoDB itself, for a variety of reasons.)
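The shift described above is that the server can now send raw data and let the browser do the work. A minimal sketch, with hypothetical data and function names; in a real application the `records` array would arrive from the server as JSON:

```javascript
// Raw data as the server might send it -- no HTML, no formatting.
const records = [
  { name: "Widget", qty: 3, price: 2.5 },
  { name: "Gadget", qty: 0, price: 10 },
  { name: "Gizmo",  qty: 7, price: 1.25 },
];

// The filtering and computation that used to happen server-side
// now runs as JavaScript in the browser, with no extra round trip.
function inStockTotals(items) {
  return items
    .filter(item => item.qty > 0)
    .map(item => ({ name: item.name, total: item.qty * item.price }));
}

console.log(inStockTotals(records));
```

Once the data is local, re-filtering or re-sorting it is instant; the server is reduced to a data source rather than the place where everything happens.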

It’s still in its infancy. All of the smooth developer productivity tools we’ve become used to aren’t there yet. It’s still a bit of a pain to program, and it takes longer than we’ve been used to. Indeed, it’s very much like programming was back in the 1990s, when the developer had to code everything.

For instance, in Microsoft’s .NET development environment today, developers have become used to just dragging a text box or check box onto a form and having the software development environment write hundreds of lines of code for them. In a Windows application, a data-driven form that connects to a SQL Server database can be created almost entirely through a drag and drop interface, where the development environment writes thousands of lines of code behind the scenes. The developer has to actually write about 15 lines of code to finish up the form so it works.

We don’t have that yet with the current generation of JavaScript tools. We have to type a lot more code to do fairly simple things. So, for someone like me, who’s lived in the .NET world since 2001, it’s a bit of a pain. But it’s a thousand times better than it was just two years ago. Five years from now, the tools for rapid application development for browser-based applications will be everywhere.

This is the second revolution in computing that will change everything we’ve become used to. Right now, a software application generally has to be installed on your computer. A software application designed for Windows won’t install on a Mac or Linux machine. Well, that’s all going away. Browser-based software is platform independent, which is to say, it doesn’t matter what your computer’s operating system is. Do you need an office suite? Google already has a browser-based one online, and there’s probably someone, somewhere in the world, working on a desktop-based version right now.  Need to access your database to crunch some data? No need for Access or FileMaker Pro. We can do that in the browser, too. In fact, we’re pretty much at the point where there is no commonly-done task that can’t be done in the browser.

We can now make almost—almost—any software application you can think of, and anyone, anywhere in the world, can run it, no matter what computer they’ve got. This is the second revolution in computing that I’ve seen in my lifetime, and it’s going to change everything about how applications are developed. Software that is dependent on the operating system is essentially as dead as the mainframe. Microsoft’s advantage in building a software development community? That’s dead now. Desktop applications? Dead. In 5 years there’ll be nothing that you can’t do in a browser. On a phone. I think that means the days of Windows having a natural monopoly on corporate computing are now dead, too.

There’ll still be desktop systems, of course. And, as long as your system can act as a web server—and pretty much any desktop or laptop can—you’ll still have software that you run on the desktop. After all, you may want to work locally instead of on a big server run by someone like Google or Microsoft, who’ll be doing God knows what with any data you store on their servers. But your choice of computer or operating system will not be driven by whether the software you want to use is available for it. In fact, in 10 years, when you think of a desktop, you may just be thinking of a keyboard and display monitor that you plug your phone into to work more conveniently. Assuming you don’t just talk into it to get things done.

If you’re working in software development and aren’t embracing the coming wave of platform independent, browser-based programming, you’re not doing yourself any favors. It may take another 10 years or so, but the technology you’re working on right now is dying. For someone like me, who’s invested decades in Windows applications development, it’s a bit sad to see all that accumulated knowledge and experience passing away. It’s not easy to move into an entirely new development technology and go through all of its growing pains. But I don’t see any choice.

Thirty years ago, everything I learned in college about how to work with computers got tossed out the window. All of those hours struggling to write RPG programs on IBM punch cards, learning about mainframes…all of it was utterly useless within a few years when the PC came out. Now it’s happening again.

I remember how, back in the 90s, all the old mainframe Unix guys were grumpy about having to kiss their Unix machines good-bye. In 10 years, I’m not going to be one of those old Unix guys.

17 Responses to Welcome to the future

  • RunBasic is not .js, but it lets you do the same sort of things.
    I think if that guy got his toolset to generate .js, that would be cool.
    And would suddenly raise his sales.

  • Remember, you have to have it run in the cars you test.  😉

  • I am making my third transition in programming. PL/1 and Fortran on mainframes to C++/X on Unix or Java on PC, and now to HTML5 and Javascript.
    I really like my IDEs and was wondering if there is an HTML5 WYSIWYG and Javascript IDE combination that people like. (Free would be good, as I am currently self-unemployed)

    • Well, I don’t know of any really good free ones, although Microsoft Visual Studio Express for Web works in a pinch, as it does JavaScript code completion. The hot new JavaScript IDE that all the cool kids are using these days is called WebStorm, from JetBrains. It costs $50 for a license, but it’s a pretty comprehensive JavaScript code-writing solution.

      • Mwaaahahahaha – I’m going to ride the COBOL/C/(TAL)  horse all the way to retirement!
        Dead language programming for major corporations who shall not be named.

  • I’ve worked for nearly four decades on embedded systems.
    Most of that time, continuing into today, was writing high performance assembly code in medical equipment, imaging systems and telephone infrastructure. We are a small lot but the rest of computing has not really affected us, except to make more demands for even higher performance.

    • Yes, embedded coding close to the metal is not going away anytime soon. Certainly not going to be replaced by the web-based languages.
      I’d also argue that high performance tools, like database engines for example, will never be written in the web-based languages and the OS will still matter.

      • I’d agree with both of those points, neither of which matter to the desktop user.

      • Heh, try explaining to the young turks why XML for proprietary msg exchange is a huge waste of cycles.  They just keep assuming the hardware will (soon! soon!) overcome their horrendously inefficient behaviors.
        I admit the hardware has saved them, and I further admit that a user on the downstream display doesn’t perceive the difference between a response in 5 tenths of a second and one in 15 tenths.
        Still.   🙂   No point in wasting bandwidth and cycles in my opinion.   Why…..that power could be used for more movie and music streaming and better texting at work!

        • Well, the XML for message passing is not too bad (big/bloated/bandwidth-hog though) … it’s the processing/queries you have to run to do anything. If you can’t do it with SAX (serially) I just say “no”. At least it’s more self-describing than EDI though…

  • “Mainframe UNIX guys?”
    I’m older than you and out of the software biz for quite a while, but my experience on UNIX was DEC PDP 11/44s, then Sun boxes, HP/UX, IBM AIX, VAX UNIX, and this version of BSD running underneath the MAC/OS user interface I’m typing through — that is, nothing I would call a ‘mainframe’.  It’s interesting how terms change meaning.
    Oh, and nobody would ever need more than 640K memory, right? 🙂

    • Yeah, mainframes were big iron IBM, BURROUGHS, Sperry/Univacs.    Mini-computers – DEC VAX/PDP, WANG, DG, TANDEM, STRATUS. (and I forgot…Prime!) and then there were them weird ‘micro’ computer thingies which were just a foolish crazy pipe dream that never caught on!
      I still laugh when people call Tandem “mainframe” and think of my IBM ‘Roadrunner’ first boss spinning in his grave at the idea.  We were essentially an intelligent pre processing tape drive for the mainframe.

  • MS’s Visual ___ lineup stole heavily from Borland’s lineup. In fact, the right-click context menu used everywhere was a Borland invention.

  • My first engineering management job began in 1983. It was the Gulf Range Drone Control Upgrade System (GRDCUS). We used 6 Vax 11/785s connected to three shared memory units directly with PC11B busses. We had a custom operating system – the Eglin Real Time OS – written by a local GS14. Our requirement was to fly 4 QF100 target drones over the horizon in formations pulling 6 Gs in missile breaks trying to dodge AIM 120 AMRAAMs.
    The software (75,000 lines) was written in COBOL and Fortran by IBMers who wrote the Space Shuttle software.  They were pretty POed that we didn’t buy hardware from Big Blue.  We finished on time but $12M over budget. Fortunately for me, I was a USAF major instead of a civilian manager.
    Ironically, the Mac laptop I’m using has more computing horsepower than the whole GRDCUS system.

    • That’s a cool story.  I’m guessing the iPhone has more computing power as well.

  • I wrote my first computer program in 1967, and have worked in both enterprise and embedded real-time environments since. I too have been through at least three revolutions, and expect to live through another one or two since I have no particular desire to retire from the field. Interpretive languages (Javascript, Python, Ruby, Scheme, VB etc.) definitely offer some conveniences to folks who don’t have demanding computational and data exchange requirements. As others have noted in this thread, where there are requirements for signal processing, parallelism, rigorous timelines, challenging size/weight/power constraints and the like, there will be a need for people who can go back to first principles and address the complexity of those environments.