The Future of Personal Computing, Part 1

By James Kwak

This week, Apple passed Microsoft to become the most valuable technology company in the world (measured by the market value of its stock).* I’ve been wondering about Apple and, in particular, why “apps” — which at first glance struck me as a giant step backward in computing technology — have gotten so much buzz in the media. Then I bought an iPad, and while I understand apps a little better, I’m still perplexed. But since this isn’t a particularly technology-savvy audience, this is going to take some setting up. The background is here in Part 1; Part 2 will be coming shortly.

(Note that here I’m talking about personal computing, which is what people like you and I do on our own; enterprise computing is something very different that I’ve written about before, and still largely takes place on mainframe computers.)

A Little Background

Rather than recap the entire history of computing (hilarious synopsis here, hat tip Brad DeLong), I’ll start in the early 1990s. At this point, many people had personal computers, but for the most part they weren’t connected to anything except maybe a printer. (Actually, in the early 1980s my father brought home one of those primitive modems where you actually placed your phone receiver into a socket to communicate, so we could log into the mainframe at his university, but that was the exception.)

A personal computer has an operating system (Windows, OS X, Linux, etc.). This isn’t quite correct, but you can think of the OS as the software that manages the physical parts of a computer: it runs the internal parts, like the CPU and the hard disk drive, and it controls the interface to the parts that you interact with, like the keyboard and the screen. There are also applications that run on a computer (Excel, Photoshop, Half-Life, etc.). These applications don’t directly manage the physical parts of the computer; instead, they talk to the operating system, which in turn talks to the physical parts. They do this via the application programming interface, or API, that is published (made accessible) by the operating system.

For our purposes, there are two important features of this structure. First, each operating system has a different API, so you have to write programs differently for each OS. That doesn’t mean every line of code has to be different, but the way you call lower-level functions will differ across operating systems. On top of this, each OS developer (Microsoft, Apple, etc.) provides a different set of tools that you use to write programs for its OS. Software developers tend to become better at using one set of tools than another, and hence more likely to write programs for one OS than another.

Second, programs that can access the operating system’s API can do a lot of different things to your computer — this is what makes software powerful. At the same time, that means they can do damage to you.

So in the early to mid-1990s, we had self-contained personal computers (Windows or Mac) that ran programs that were written specifically for the operating systems they ran on. (A given program, like Excel, might exist in both Windows and Mac versions, but those were two completely different pieces of software that just looked the same on the outside.) Microsoft dominated this world for a couple of reasons, most importantly that many more programs were being written for Windows than for Mac. I believe this is partly because it was easier to write programs for Windows (Microsoft did a better job providing tools for developers), and partly because the Windows installed base was a lot bigger than the Mac installed base, so a new Windows application had a lot more potential buyers. The Windows installed base was bigger, in turn, because of Microsoft’s business model: it licensed Windows to any hardware manufacturer who wanted it, and therefore you had more diversity, more innovation, and lower price points for Windows PCs than for Macs. There were other factors as well, but those are the basics.

The Internet

Then Tim Berners-Lee gave us the Internet, and Marc Andreessen gave us the browser, and everything changed.

Ever since the mid-1990s, the Internet has played a bigger and bigger role in our daily computing. And so the most important application of all became the Internet browser (Internet Explorer, Netscape, Firefox, Safari, Chrome). This is an application that has the ability to find, display, and interact with resources on the Internet. Like all applications, it talks to the operating system via its API. But it’s special in a few respects.

  • One is simply that many people spend more time in their browsers than in all their other applications put together.
  • Another is that the Internet is largely built around a few basic standards, like HTML (a language that web pages are written in). All browsers have to be able to interpret those standards. So if you build web pages using those standards, you know that all browsers will be able to access them; you don’t have to worry about what operating system your visitor’s computer is running.**
  • A third is that the browser can be designed in such a way as to minimize risk to the computer it is running on. Ordinarily, browsers do not have the ability to modify data on your filesystem. This is for security reasons; the goal is to prevent web sites from automatically launching attacks on your computer. Of course, web sites are constantly asking if you want to save files to your computer, and then you’re on your own. And there are technologies that can be added to a browser, like ActiveX, that give programs on web sites the ability to get at your hard drive. But in principle, it is harder for a program that lives on a web site and runs inside a browser to do damage than for a program that you install on your computer and that has direct access to the operating system via the API.

The result was the golden age of web-based computing. Around a decade ago, during the Internet boom, the idea became popular in the technology community that all computing would move “to the Web.” That is, instead of installing standalone applications that ran directly on our computers and accessed the operating system’s API, the interesting software would live on web sites on the Internet, would conform to Internet standards, and would therefore run properly in any browser. This was supposed to have several benefits:

  • Computing would be safer, since our computers would be protected by our browsers.***
  • People wouldn’t have to worry about installing and updating software — just about keeping track of their bookmarks.
  • Programs would be easier to learn and use for ordinary people, since browsers offer a consistent and intuitive way of interacting with programs.
  • We wouldn’t have to worry about carrying our data around, backing it up, and syncing it between computers, because it would all be on the Internet.
  • Developers would only have to write each program once, because then it would automatically work on all browsers (assuming everyone conformed to standards) and hence on all operating systems.
  • As a corollary, the Age of Microsoft would come to an end, since one pillar of its dominance — the huge community of developers writing for Windows — would now be irrelevant.

To some degree, this has happened. I’m writing this post using Firefox. The computers in my house have three different operating systems and I use three different browsers (Firefox, Safari, and Chrome), which I keep synchronized using Xmarks. I spend the vast majority of my computer time in a browser, and not just for consuming information: besides the blog (WordPress), my email, tasks, calendar, and contacts all belong to Google; I try to do most of my lightweight work in Google Documents; I share photos using Flickr; etc. Much of the modern, interactive computing that people do (like Facebook) is done in a browser.

This is, roughly speaking, what Google is all about: a world where the OS and the browser don’t matter because they are just tools to get us onto the Internet, where we keep our data and do all our work. It’s why Google is writing two operating systems, Android and Chrome OS, that will both be free, and is developing a suite of Web-based “productivity” applications; they want to cripple Microsoft’s business model by giving away their versions of the two things that make Microsoft so profitable: Windows and Office.

Microsoft is still a big, profitable company, because PCs will be around for a long time, most companies use Windows, Office, and other Microsoft products for networking, email, etc., and those products can be very sticky, especially in a corporate environment. But the world is moving away from the 1990s model. Microsoft recognizes this, of course. This is why they fought so hard to crush Netscape in the 1990s — they wanted control of the browser. And it’s why they’ve spent so much money — Hotmail, MSN, .NET, Windows Live, Bing — trying to establish a presence on the Internet. But they just haven’t been very good at it.

So at a high level, this is the story of personal computing over the past fifteen years. But recently there has been a new plot twist, which will be the subject of Part 2.

* Great quote by Steve Ballmer in the New York Times story: “Windows phone – boom! We have to deliver devices with our partners this Christmas.” Does he realize that he talks like Ari Gold on Entourage?

** This can be thought of as a kind of isolation layer. With Windows, software developers don’t need to worry about whether the customer has a Dell, HP, or Acer computer; as long as it has Windows, it will behave in a predictable way. With Internet standards, now you don’t need to worry about what OS the customer has, just what browser she has.

*** Yes, browsers have security flaws, so this isn’t a perfect system.

38 thoughts on “The Future of Personal Computing, Part 1”

  1. One of the strengths of Microsoft has been its development platform, which allows Windows developers to develop new and exciting solutions. Just look at Visual Studio 2010.

    They’ve been a bit slow with internet search and social networking, but they’ve got some cool development projects:

    Live Mesh – syncs information between devices, including mobile devices and the cloud.

    StreamInsight – pretty cool tech to perform real-time analysis and monitoring.

    And others.

  2. Tim Berners-Lee gave us the World Wide Web… not the Internet. I think Al Gore did that one (smile)

  3. We need to understand that the hardware part of the computer “revolution” is, by and large, close to completion. There is not much more you can squeeze from chips as quantum effects become prominent.

    I am not sure that the software part of the revolution is close to completion, but applications have stabilized a lot. For example, Windows 7 is definitely a “good enough” desktop. Office is a good enough office suite. So does it make sense to look at alternatives? Probably not much, unless you have some special needs (for example, suitability for legal documents, or you simply can’t afford Office). Both Windows and Office have tremendous staying power (the power of a de facto standard), so it is next to impossible to dislodge them, especially if you have a fraction of the financial resources. Even if you have a lot of financial resources, this is an uphill battle, as Google now knows really well from its Office and Exchange replacement efforts.

    Software looks like a field with pronounced national monopoly tendencies (the winner takes it all). So a new leader can emerge only when either a new stage of technology arrives or the old leader self-destructs.

    So, in my opinion, Microsoft can only self-destruct, like any aging company, when the initial set of visionary leaders (Gates, Ballmer) is replaced by “me too” executives.

    Google actually faces the same danger, despite being a younger company. It has close to a monopoly on search, but as soon as its leaders leave, the company might stagnate. In many areas Google is not the best engine even now. For example, Windows software is naturally better covered by Bing (I would say much better).

    Actually even now Google is not best for many types of computer science searches and Google actually is a biggest polluter of Internet stimulating creation of all kinds of fake sites directed of leaching of searches to extract advertising revenue.

    Many people now replace Google with something else just out of protest.

    Apple is a different story. It is very closed company that positioned itself as a seller of quality goods for top prices much like BMW or Mercedes in auto industry. But the problem with Apple is that they cannot become bigger in market share then certain small percentage of the market because at this point they will lose their “upscale” status. Even now price of iPad which is functionally inferior to netbooks is twice higher. Reasonable businesses will never pay such a premium and that’s why penetration to corporate market of Apple is far from a sure thing. Applications are also more expensive for Apple, as they should be as this is a smaller market.

    All this makes further Apple growth rather problematic, as the luxury segment of computer gadgets is limited and already dominated by Apple. Taking into account the forthcoming changing of the guard at Apple, betting on growth is very problematic.

  4. It’s not so hard to understand. Google wants you to spend ALL your time on the Web (where the ads are), not in Microsoft’s applications. Google’s main business, search-based ads, requires a scale of computing that makes the cost of providing GMail, Google Docs, etc., small by comparison. It’s an investment in capturing mindshare.

  5. The main popular use for computers – in whatever form – is now entertainment. Years ago it was for business and improved productivity, but once Microsoft got a monopoly, the development of things like word processors and spreadsheets came to a halt. Those things are the same now as they were 15 years ago.

  6. For another humorous insight into the PC industry, in which I’ve had decades of experience, you might enjoy my essay from 1984: The Computer as God, which suggests we were then replaying the Protestant Reformation with Steve Jobs as Martin Luther and IBM as the Roman Catholic Church.

    Click to access cmpgod.pdf

  7. I couldn’t agree less.
    For one thing, Bing is a really awful search engine — I’ve tried it thoroughly, and I’ve almost always been disappointed. (Fun experiment: try searching “” on Bing — the official site with the obvious URL doesn’t even appear on the first page. On Google, it’s the first result.) I think Google still holds the top place — that’s why people use it and stick to it. Even with Windows software it seems to be better (the “” experiment proves my point).

    “Google actually is a biggest polluter of Internet stimulating creation of all kinds of fake sites directed of leaching of searches to extract advertising revenue.”
    Whaaatt??? Any evidence, or is this pure speculation?
    (Besides, take a look at Microsoft, which even spams the e-mails of its own Hotmail users.)

    “But the problem with Apple is that they cannot become bigger in market share then certain small percentage of the market because at this point they will lose their “upscale” status.”
    Not really. Apple gets its reputation based on the quality of its products, not the price. Microsoft’s Zunes have always copied Apple’s pricing. Have those gained an “upscale” status that prevents them from spreading?
    Besides, the last time I checked, the iPod has been the world’s music player of choice, the iPhone has quickly become one of the most popular phones worldwide, and the iPad has been selling at dizzying numbers since its release. In terms of market share, they’re increasing exponentially.

    “Even now price of iPad which is functionally inferior to netbooks is twice higher. Reasonable businesses will never pay such a premium and that’s why penetration to corporate market of Apple is far from a sure thing. Applications are also more expensive for Apple, as they should be as this is a smaller market.”

    The iPad right now is a home consumer-oriented device, rather than business-oriented. Apple isn’t trying to break into the corporate market with the iPad, really.
    As for Application prices, it depends. Apple’s iWork suite costs thirty dollars on the iPad, ten per application. Compare that to Microsoft Office rates.
    But comparing desktop applications and touch-based applications is like comparing apples to oranges.

  8. One thing you seem to assume is that, regardless of your operating system, everyone has the ability to interact with internet objects.

    I expect you to touch upon the Apple v. Flash battle currently being played up by the incompetent mainstream media and allow me to preemptively tell you why Apple happens to be right in this instance.

    I am a diehard Linux user. Linux is based on the same foundation as Mac OS – Unix, or more specifically, Berkeley Unix. Sure, Flash ‘works’ on Linux (and most other *nix operating systems), but if you watch how much of the CPU/GPU it uses, it is disgusting. Window$ and Mac OS both, to a degree, pander to Flash so that it runs better than on Linux, but Flash is really an extremely bloated and sloppy tool.

    Sure it is nice, and it is convenient, and it has the luxury of having been on the market longer than all of its competitors and “everyone has it”, but none of that qualifies it to be the pseudo-standard it has become. In short, not everyone has Flash as a standard, especially Linux users. I personally feel that Jobs is calling this one the right way. Adobe in general makes sloppy programs, Acrobat being another such example. If it cannot learn the lessons of making fast, reliable, and non-CPU-intensive programs, it does not deserve to survive as a company.

  9. I completely agree. Same goes for Microsoft’s Silverlight.

    One issue that’s been given a lot of attention recently is web video. Flash is still the de facto standard, although that seems to be rapidly changing. WebM/VP8 has big potential over the proprietary H.264, but then there are patent disputes looming.

    Before all the problems get sorted out, I expect these proprietary, closed “pseudo-standards” to remain.

  10. That is a very ambitious title. Would you send me a copy of Part II before the markets open on Tuesday?

  11. I don’t get this idea of moving everything to the net. What if for some reason you lose your internet connection? (Some optic fiber cable is damaged, simple incompetence from Verizon or whatever company you happen to be with, …) You wouldn’t be able to do anything.

    I’ve never had any problem with an application damaging my computer. I’ve had several viruses coming from the web.

  12. I agree with bunalowbill. And it doesn’t even require a breakdown of some kind to lose your internet connection. Just go on an airplane and your laptop would become useless.

    Also, why would I trust the security of my data living out there somewhere on the net. Particularly in the custody of a corporation that makes its living selling information about people to advertisers.

    I can’t imagine I will ever want to move my applications and data off my own computer.

  13. Tim Berners-Lee wrote the first browser, in Objective-C, on a NeXT at CERN. The HTTP server was also his idea.
    Tim never wanted HTML to be seen by users or hand edited.
    It is the foolishness of Marc Andreessen that did that.

    Apps make it possible to purchase software like mp3 singles.
    Customers can purchase without buyer’s remorse.
    The iPad brings a safe environment for even technophobes.

    The PC industry is like the fashion industry. The cloud is nothing more than the mainframe. All the browser/JavaScript/Ajax nonsense is just trying to duplicate what desktop apps do well.
    That is what iTunes and now the App Store do: they use the browser technology wrapped inside a desktop API. That is why the iPhone and iPad are successful.

  14. Yeah but I invented Al Gore, so everybody owes me for the internet. You’re welcome!

  15. I would think you mean the browser idea wrapped up in a desktop API. The browser technology wrapped in a desktop API would be… *shudder* awful. More what apps have done is what unix has had as a philosophy for years: do one thing, do it well.

    It’s why they’re quick to develop, why things are not as crash prone, and a centralized and constantly updated place to find them makes it much easier to get what you want. Plus, their cheap price compared to traditional programs (mostly due to the aforementioned single use and ease of development) makes them far more appealing.

  16. Got a new computer a few years ago — a Microsoft unit. When I was trying to learn how to use it in my stumbling dinosaur way, it kept pushing Netscape at me aggressively, and it was not easy for me to circumvent their pushiness. They were rudely forcing their browser down my throat. So I returned the thing to the store and got myself an Apple. Been a macker ever since and I love them.

  17. Tim Berners-Lee invented the World Wide Web, which is not the same thing as the Internet. The Internet was invented by many people, but the principal inventors are widely acknowledged to be Bob Kahn and Vint Cerf. Others (including several distinguished researchers at the Lab where both I and Tim B-L work) were responsible for much of the technology that makes the Internet work today; a former boss of mine was Chief Protocol Architect of the Internet for a while in the 1980s. Marc Andreessen deserves credit (or perhaps blame) for inventing inline images — previously, even GUI-based browsers were text-only — in NCSA Mosaic. (Somewhere I still have a copy of the netnews post where he announced the release of Mosaic that had this feature.) Other people at Netscape, in cooperation with RSA Data Security, were responsible for SSL, which helped to make the Web actually commercially relevant. (More disclosure: the “R” in “RSA” also works at my place of employment.)

  18. The browser is nothing more than the client-server concept, simplified with the standard HTML/HTTP protocols.
    What iTunes does is use web application servers to vend XML data, which iTunes displays in a standard UI.
    With the iPhone, WebView (the Safari API) is packaged inside one class, which can be embedded inside any NSView.
    Almost no code is needed to write a browser, which is what all these RSS feeds and simplified web apps are doing.

  19. rd above is right about “cloud” computing being nothing more than the mainframe and dumb terminal of yore. “Cloud” computing won’t be popular unless people are stupid enough to upload their porn, pirated videos, and company secrets to the “cloud” servers.

  20. The thing is, what HTML5 brings and what most web applications are working on is some kind of offline functionality. Google has used its Gears technology to make Google Docs function offline (so the user gets to his files and is able to edit them within a browser offline) and now is moving to HTML5 for the same functionality. So moving to the cloud does not destroy working offline.

    As for the security of storing your data online, it’s not that much of an issue. Your personal data is probably already somewhere on some network anyway (hospital, work, school, bank, …).
    Most personal files one creates on his computer end up shared, whether via e-mail, file sharing services, P2P, or otherwise. We’ve all put trust in those for quite a while, especially e-mail.

  21. It’s O.K. by me. Since every time I buy a product I am paying to be besieged by ads for the product, I consider free Google software like a tax rebate. Beats paying outlandish sums for MS software that is far from user friendly.

  22. I keep the stuff I want personal on an encrypted flash drive – a small part of my stuff.
    When I am collaborating with others I find Google Docs great, because we can have online or offline sessions, each edit and everyone sees the changes being made. Super collaboration tool.
    Also like web albums, because I can send links instead of huge emails.

  23. I choose to ignore Silverlight, as I find so few websites use it (or rather I avoid all sites that do), and therefore find it to be like a spider on my windshield, easily crushed by the wipers.

    I am baffled by the hype over H.264. Some people claim it is the savior of linux… but linux is all open-source and given that H.264 (as you so astutely pointed out) is proprietary, it will likely not do much for linux users since no distribution developer/development team would pay for the licenses for their users.

    If people weren’t brainwashed by flash games and YouTube, and similar web content, there might be a chance to break the de facto rule by flash, but people don’t seem to care what happens to them, just so long as they can get their YouTube fix.

  24. Agree that Apple gains market share due to the **quality** of its products. I would add that Apple has two other key innovations that put it far ahead of MS:

    1. Its Genius Bar model for one-on-one quality care for problems or glitches with any of its products.

    2. Its tutorial systems, in which, for around $100 per year, a person can attend any Apple store for one-on-one tutorials.

    It also does a superb job of documenting its information, and always includes a feedback form.

    However, I’d also like to point out to kievite that, in my experience, the software industry is about as ‘international’ as a business could get.

    In one time zone, people may start on a product; if bugs show up, they are reported, and when people eight time zones away arrive at work, they begin working on the bug list.

    Software forums, design, and tech docs are among the most ‘international’ forms of communication that I’ve encountered.

    So the idea that MS is somehow just a ‘national’ brand doesn’t strike me as accurate. Certainly not in my observation. The same is true for any large software company; if you’re only ‘national’ you really can’t compete. (Unless you are writing some small-scale, specific app for a certain purpose or client. But even then… ‘national’ doesn’t fit what I’ve seen. There may be developers in 4 time zones all working on a project.)

    The Web is international.

    Fortunately, our banking systems don’t quite seem to be keeping up. (The criminals are certainly keeping up, but law enforcement and legislation is several versions late.)

  25. Hooboy… the hours that I’ve spent with Flash should make me upset with Jobs.

    I’m not.

    Jobs made the right call on refusing to let Flash onto iPad.
    I think that he’ll be vindicated, no matter what anyone may be shouting about at the present moment.

    There are some great people in the Flash community who are having a hard time with Jobs’ decision, but looking down the road it sure looks like a very courageous, smart call.

    Because, as James Kwak points out, it’s the Internet that matters for many people — much more than desktop apps. So for Jobs to refuse to let Flash onto the iPad was a bold move. For exactly the reasons that you explain.

  26. I’ll boil it down. There’s been a conspiracy in this country since the civil war.. at least. It’s infantile not to believe this; the proof’s all over the place.
    what do you think “think tanks” are for?
    We have a totally corrupt and police state government.
    We live in a state of total propaganda which we believe to our detriment and division while we are looted in front of our eyes, while our jaws dropped and drool ran out of our mouths. While we are told that our financial system would have collapsed if the government hadn’t robbed US. But, It HAS collapsed.
    If you’re not desperately poor now, just wait a few months.
    The American people are the frog in the proverbial pot and it’s getting hotter, little froggie.
    Why would there be laws against conspiracy if there weren’t any? You think all that power and money wouldn’t inspire a conspiracy? I laugh at you!

  27. Yet Another Reason that Apple has surged: while Microsoft and others are tied to “compatibility” with their legacy products, Apple is committed to “disruptive technology.” (See the fine Wikipedia article if your time is more valuable than a CEO’s and you don’t want to read The Innovator’s Dilemma, which puts Apple in a long list of firms that shook up competitors with cheaper, more accessible, often much less capable devices.) Or, if you’re an economist, go back to “creative destruction,” a more generic concept.

    So Apple brings out a phone with unheard-of capabilities but a tiny screen, almost no real programs and a chip that made the slowest netbook smug. Microsoft, which must’ve been caught utterly flat-footed by how fast Apple (and now, Google) could upset their comfy applecart, panicked and now has three different, incompatible phone product lines, two incompatible tablet line-ups, one of which was first copied from Apple and then embarrassingly aborted.

    And maybe the ultimate reason that Microsoft has collapsed in the face of this rapid change: today I saw a news report that Steve Ballmer was (now!) personally attending to the hardware and game division. (He’s been CEO officially since 2000 — if he couldn’t afford to divert his attention from Windows Vista and MS Office, he should’ve fired himself.)

  28. good article…but incomplete…the computer world will always be divided between the consumer universe and the corporate/business universe…and these two universes will never blend harmoniously together, at least not in the foreseeable future anyways…maybe 20 years from now…in the meantime we’re stuck with microsoft/windows technology…whether you hate or love them, microsoft has performed rather well over the past 30+ years by catering to both consumer and corporate needs…now google wants to break microsoft’s stranglehold (is it a monopoly?), especially on the consumer side, and google is succeeding spectacularly in that regard, which should worry microsoft somewhat…but the corporate/business universe is another matter altogether…strictly relying on the internet browser to run all your business IT operations is not going to cut it for most security-obsessed corporations in today’s increasingly high-risk environment (there are more vicious hackers now than 10 years ago, some state sponsored)…most people’s experience with facebook and google this past couple of years is a rude awakening in terms of how vulnerable today’s cloud technology is (both facebook and google are cloud computing phase 1)…no fortune 1000 company ceo, cfo or cio today will put his neck on the line and recommend to their board to start migrating valuable business operations and extremely sensitive databases/assets from secure server-based IT to cloud-based (unsecure) IT infrastructure…not just yet

  29. apple’s appeal has always been its “cool” image…if and when apple (ipod, iphone or ipad) becomes the norm (used by bored housewives, AARP members, busdrivers), then apple loses that “cool” factor…the hard-core apple users (more like cult followers) who have supported and sustained apple all these years will find some other “cool” gadget/company to acquire and support…steve jobs understands this, and that’s why he needs microsoft to be the “uncool villain” to maintain his company’s “cool” image…it’s worked beautifully for apple this past 20 years, judging from their stock valuations…

  30. I’m a little clueless, I’m afraid. Why is this on baselinescenario? Having trouble seeing an important connection between this bit of thoughts about technology and the overall economic stuff this normally covers. :-(

  31. I’m an AARP housewife and my son is a busdriver. Our family has had Apples since before he was born.

  32. In the corporate world the cloud would still be inside the corporate firewall. The concept – which is far from new – makes a great deal of sense when you consider the operating cost of personal computers vs. dumb terminals. It is also far easier to secure a mainframe than 1,000 PCs.
