More on smart vs. dumb, thick vs. thin

November 22, 2009 · 2 min read

After yesterday's post, I found some other folks who have written about the cycle between smart/dumb terminals, thick/thin clients, and centralized/decentralized computing.

Dharmesh Shah wrote "The Thin Client, Thick Client Cycle" at OnStartups in 2006: "One of the repeated cycles I have seen in my 15+ years in the software industry is that we constantly go through this 'thin client / thick client' cycle." He seems to disagree with what I suggested yesterday, that Web application technology (including JavaScript) might end the cycle, saying:

My thesis now is that we are due for another cycle. Why? For the same reason we had the prior cycles: Because there are still problems with the current model. User interfaces for true “thin client” applications basically suck. ... The problem goes back to the platform ... If I were looking to reproduce the user experience of even relatively trivial desktop applications today on the web, it’s hard.

It's unclear whether Dharmesh really disagrees with my hypothesis, or whether he's just saying that developer tools and platforms will evolve to make writing good Web apps much easier. (He does claim, though, that "What drives these technology cycles as much as user experience is the developer experience." Paul Graham implied the same thing in his recent essay: "If programmers used some other device for mobile web access, they'd start to develop apps for that instead." I think I disagree, but that's a post for another time.)

More recently, Hal Pomeranz wrote that "the whole Cloud Computing story felt just like another turn in the epic cycle between centralized and decentralized computing" ("Future Cloudy, Ask Again Later" at Righteous IT, Feb 2009). He seems to agree with a point I made about the past cycles being driven by economics:

The centralized vs. decentralized cycle keeps turning because in any given computing epoch the costs of all of the above factors rise and fall. This leads IT folks to optimize one factor over another, which promotes shifts in computing strategy, and the wheel turns again.

Finally, I found a scholarly paper on the subject from 1999 titled "Centralization/decentralization cycles in computing: market evidence" (by D. A. Peak and M. H. Azadmanesh). Abstract:

Strategies concerning centralized and decentralized commercial computing have been major issues for more than two decades. Using longitudinal sales data consolidated into three major computer categories (mainframes, minicomputers, and microcomputers), we investigate whether historical market data show evidence of centralization and decentralization. Our finding of cyclic behavior leads us to conclude that computing sales data exhibits broadly cyclic characteristics. We suggest that computing strategies oscillate unevenly between domination of centralization and decentralization, and that commercial computing has already experienced two centralization/decentralization cycles. Currently, computing is nearing the end of the second cycle's decentralization period and is at the threshold of centralization in a third cycle.

Maybe the most important question to ask is: Why are so many of us in the industry so ignorant of its history? At least, I feel woefully ignorant of computing history, and I bet I'm not alone. As with political history in the US today, it seems that most of us only know the history that we lived through.

Anyone know a good source for an overview of the history of the industry, that would cover major trends such as this one?

Copyright © Jason Crawford. Some rights reserved: CC BY-ND 4.0