The thin(ish) client revolution

David Berlind: “[P]icture a world where, instead of carrying a notebook computer with you everywhere you go, and instead of having power-drinking desktops in every corner of your house, all you have is a USB key that you take from one dirt cheap thin client to another. On that key is not just all of your personal data (that is stored in the cloud but replicated to your USB key for offline usage), but perhaps a small Web server and some applications, both of which are thin-client friendly.”

I’m in complete agreement that the thin client/web application computing model is the wave of the future, and that the days of the fat client as the primary mechanism for monetizing software are numbered. However, I don’t buy into the notion that software-as-a-service displaces the traditional fat client entirely, for one very simple reason: It’s going to be a while yet before we have truly ubiquitous network connectivity.

Sure, perhaps if you live in the San Francisco Bay Area or in a similarly technology-centric place, it’s easy to buy into the notion that it won’t be long before we’ll all be connected all the time. However, we can’t lose sight of the fact that the vast majority of the population doesn’t live in a world where technology billboards dominate the morning commute. Heck, my cell phone doesn’t even work when I visit my grandfather in rural Indiana—the notion of an omnipresent wifi fabric built from community hotspots is a long way off there, to put it mildly.

And even when we do reach the day of ubiquitous connectivity, we’ll still want caching for performance and replication to eliminate single points of failure, not to mention the ability to retain control over our own digital lives. So I view the thin vs. fat client juxtaposition as being not so much about user experience (AJAX has shown us that remarkably good user interfaces can be built on the web) as about caching and data replication, two very important considerations in any distributed system. I think most people are missing that. David, as usual, appears to get it.
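To make the caching and replication point concrete, here’s a minimal sketch (in Python) of the pattern I have in mind: a client-side store that answers reads from a local replica and queues writes while disconnected, then replays them when the network comes back. The remote store here, with its get/put/is_online methods, is a hypothetical stand-in for a cloud service, not any particular product’s API.

    # Minimal sketch: a local replica that keeps working offline and
    # synchronizes with a (hypothetical) cloud store when it can.

    class FakeRemote:
        # Hypothetical cloud store; a real service would go here.
        def __init__(self):
            self.data = {}
            self.online = True

        def is_online(self):
            return self.online

        def get(self, key):
            return self.data.get(key)

        def put(self, key, value):
            self.data[key] = value

    class OfflineReplica:
        def __init__(self, remote):
            self.remote = remote   # the cloud copy of your data
            self.cache = {}        # the local replica (your USB key, say)
            self.pending = []      # writes made while disconnected

        def read(self, key):
            # Refresh the local copy when we can; otherwise serve what we have.
            if self.remote.is_online():
                self.cache[key] = self.remote.get(key)
            return self.cache.get(key)

        def write(self, key, value):
            # The local replica is always updated; the cloud catches up later.
            self.cache[key] = value
            if self.remote.is_online():
                self.remote.put(key, value)
            else:
                self.pending.append((key, value))

        def sync(self):
            # Replay queued writes once connectivity returns.
            while self.pending and self.remote.is_online():
                key, value = self.pending.pop(0)
                self.remote.put(key, value)

    remote = FakeRemote()
    replica = OfflineReplica(remote)
    remote.online = False               # driving through rural Indiana
    replica.write("draft", "my email")  # still works, queued locally
    remote.online = True                # back in range
    replica.sync()                      # the cloud copy catches up

This glosses over conflict resolution, which is where real replication systems earn their keep; the point is simply that the local copy, not the network, is what the user interacts with.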

My prediction: Applications will increasingly move to the web, so you’ll be able to get to your data anytime, anywhere, and from any connected device. No surprise there. But the majority of the time, when you’re using “your” client device (whether it’s a PC in your home or office or a handheld device or something that lives on a USB key and securely attaches to the network via some device that’s outside of your administrative control), you’ll still primarily interact with the web through a fat client of some sort.

To be sure, the fat clients of the future will look a lot different from the fat clients of today. Hint: “Fat client” doesn’t have to mean “self-administered”. Perhaps the fat clients of the future are delivered on demand or via independently developed components that integrate into some larger framework. Perhaps it’s as simple as running a web server locally. But the fat client in whatever form will still be the primary interface the vast majority of the time. That’s why I don’t buy for a minute that Web 2.0 is yet another death knell for Microsoft. They’ve obviously got a great fat client story, and in case you haven’t noticed, they’ve just woken up to what’s going on around them.
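As for the “running a web server locally” possibility, even something as simple as the snippet below, which serves a directory of locally cached application files and data on the loopback interface, gives a browser-based front end something to talk to when the network isn’t there. The port and the idea of pointing it at a cache directory are illustrative assumptions, not a prescription.

    # Minimal sketch: serve locally cached pages and data on localhost so the
    # browser front end keeps working even when the network does not.
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    server = HTTPServer(("127.0.0.1", 8000), SimpleHTTPRequestHandler)
    print("Local application server at http://127.0.0.1:8000/")
    server.serve_forever()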

By the way, this is the area where I think Linux on the desktop could have the biggest impact—it won’t be about who has the better desktop or who can replicate the popular fat client applications of today in open source. It’ll be about who can mold Linux into the fat client framework of the future, seamlessly integrated with applications and services delivered via the web, yet open. If I were one of Microsoft’s competitors in this space, namely Google or Yahoo, I’d be spending a lot of time thinking about this. In today’s fat client world, Microsoft is between them and their customers more often than not, and if the past is any indication, they’ll be able to use this position to tremendous advantage.

9 comments on “The thin(ish) client revolution”

  1. Rektide

    I’m so god damned tired of people who think the way to Linux saturation is an improved desktop environment. You mention it at the end; you cannot out-Microsoft Microsoft. Linux got where it is by becoming the backbone for the network. Now it needs to grow by becoming the threads of connectivity woven across that network.

    I’m pretty sure it was O’Reilly who dubbed it thus: dataware. My only reservation is how vital it is that dataware be built on a bed of open data, in order to build a true ecosystem of software.

    Distributed has never been so much fun.
    -Rektide

  2. Jay Carlson

    Vinge’s Zones of Thought may seem like a transparent plot device, but 5 years ago I drove from Boston to my home in the rural Midwest. When I walked out the door of the Transcend, I mean, my apartment, I lost broadband. No worries, I had a nice wireless ethernet card—oops, no, I had a CDPD modem. And then it gave out. Not to worry, I had a Sprint phone with the unlimited Vision plan, giving me a reasonably cheap way to get online. Somewhere around here I dropped back to WAP.

    And then I drove away from the interstate. Now my phone claimed to be in “Digital Roam”. No more WAP access. Strictly voice from now on. And then finally, I’m back to “Analog Roam”.

    Analog Roam. The Slowness. This is where I grew up: where every link beyond a local calling area was precious, whether it be Usenet, Citadel, or Fidonet. Life moves more slowly here.

    I bet you can draw bandwidth gradients anywhere. But it was very vivid to me how much the street’s technology fell off as I drove towards my home. Plot your own maps of the US from this.

  3. Rektide

    I rather neglected to provide explanation and justification [and most importantly clarity], but, begging your regard, while my comment may have lain askew, I don’t believe that I myself have missed your point one bit. Yes, quite frankly, it was a terribly poor-quality comment on my part, and I apologize for such low-quality tripe.

    Just briefly here: you spoke of applications moving to the web, both explicitly in the sense of web apps, but also implicitly in the form of applications acting as agents on the web, the duality of thin & thick client systems, whatever forms they take. When you go to Indiana, you rely on a thick client to store your email and synchronize later. Feed readers work the same way, building a buffer of news we can digest at our leisure. These disconnected systems (perhaps a more apt word than distributed) are potentiated by the open data that drives them, and here I swing the point to my own: the ability to have your growing platform of clients is possible only with open data… or our nefarious trusted computing platforms and end-to-end vendor solutions.

    I fear my strong opening remarks have already tainted your opinion, and of this I am sorry. I simply feel, as you do, that open source is at a juncture where we can begin to form an ecology of applications, of thin, thick, and medium-sized clients playfully working together. We speak of the same thing: a new collaboration.

    Open source isn’t going to tip the scales with more better apps, more better desktop. The game right now is who can disrupt the entrenched computing paradigm with something better. Bollocks, I just said paradigm. -Rektide

  4. Ian Murdock Post author

    Rektide,

    After reading your second comment, I think we see the world pretty much the same way. The point of my comment was that my post had nothing to do with an improved desktop environment and, if anything, said that I see the focus of many Linux-on-the-desktop advocates on an improved desktop environment as misguided.

    -ian

  5. michael

    I think the Shangri-La for most people pushing the thin-client model is a computer that is a content-delivery device identical to the cable/satellite boxes most people own. This model threatens every aspect of the Linux method/model in the U.S.

    Implicit in this model is a monopoly that owns everything and licenses it in a very limited way to the customer. The result will be domination of the latest method of entertainment distribution, preferably owned by the media conglomerates.

    In traditional brick-and-mortar markets for entertainment retailing, this battle was long ago dominated by the entertainment companies, which remain profitable because they control the *distribution and price* of entertainment. It is important to note that the actual production of the entertainment content serves the distribution monopoly and not the other way around.

    So, based on these two observations, I draw the following controversial conclusions.
    1. Linux may temporarily prosper because it beats Microsoft’s commoditization game. However, at some point in our lives developing in Linux may be made a crime. It has the ability to threaten the revenue stream of the content delivery monopoly owner. Today’s potential win for Linux turns into tomorrow’s chilling legislation.

    2. The thin client model eliminates all personal control over content, thereby eliminating the ability to innovate, because the individual will not be allowed to use tools to develop anything new for that platform. See TiVo/iTunes for how it will likely deny/allow content and, more importantly, allow a corporation to control and monetize something that is currently owned by the consumer.

    3. More monopolies will be created and current ones maintained, thereby further chilling the ability to create new wealth in certain markets. More importantly, it reinforces the precedent of extending monopolies into developing markets headed for maturity.

    By extending the Debian/DCC/whatever server to the desktop, you set up a precedent to lock out future OSS innovation by advocating a thin client monopoly. It takes innovation away from the edges of the network and centralizes it. With centralization comes control. With control comes monetization and a monopoly market.

    I’m not saying you personally are on course to rule the world, but the people who potentially license your software already own the monopoly of traditional media distribution and are quite happy ruling the world and chilling innovation.

    If any of this is comprehensible, please consider the near-term implications of thin clients carefully.

  6. Rektide

    Thanks, Ian. So many desktopians are working hard on building a monolithic UI that will itself outdo OS X and Windows. That’s what I was really talking about when I started spamming about “out-MS’ing MS,” having somehow neglected your MS Live linkage (although, I have to say, I’m not worried). I think we’re together in saying that what’s needed is a more systems-oriented experience. I happen to believe it will eventually play out naturally into the heart of the desktop itself.

    Michael, it’s unreal to think that any one provider could ever make a box that will do everything. The giants still worship convergence and want to build their golden box to ensnare a market. And here we are, chatting about how the future is clearly more distributed, how people are demanding to use the faculties of the network, not just the system. That’s where the network externality comes into play, where evolution happens. The web apps we use are all thin client systems, and witness which ones are flourishing: the ones that are open, extensible, and give users back their data and control. The bubble burst when people realized that no one could provide the central nexus of control. Survival is possible only through network externalities, through connection. The mere implication that big-business content holders can fight evolution, deciding to stand alone and force the status quo simply by relying on some hardware tricks and legalese, is actually pretty laughable. Monopoly has lost its competitive advantage.

    You cannot fight Metcalfe’s law. Given enough time, an ecosystem will eventually eat anything that stands too tall with too much hubris. OSS is simply here to make sure that the ecosystem is thriving, no matter who’s on top of the rock pile today. Any one of us can fell the giant.

  7. Ian Murdock Post author

    Michael,

    I agree that the threat of intellectual property restrictions stifling innovation, if that’s what you mean by “developing in Linux may be made a crime”, is real (that’s one of the facets of “decommoditization of the lower levels of the software stack” I mentioned in my “Small pieces…” comment). However, in a world where there’s an open fat client platform, not to mention a small-pieces-loosely-joined world of services that fat client (and the thin clients it augments) consumes, I don’t think “thin client” equals “centrally managed and controlled distribution”. I think it’s merely a convenient aggregation and delivery vehicle for network services; and, moreover, it’s one the end user ultimately controls.

    -ian

  8. Knut Jarl Saelanf

    DSL, Feather, and Puppy Linux work fine off a USB key, but then they need a computer to work on –
