This question arose from comments about different kinds of progress in computing over the last 50 years or so.
I was asked by some of the other participants to raise it as a question to the whole forum.
The basic idea here is not to bash the current state of things but to try to understand something about the progress of coming up with fundamental new ideas and principles.
I claim that we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't really find them, then we should ask "Why?" and "What should we be doing?"
To address the two questions, "Why the death of new ideas?" and "What to do about it?":
I suspect a lot of the lack of progress is due to the massive influx of capital and entrenched wealth in the industry. Sounds counterintuitive, but I think it's become conventional wisdom that any new idea gets one shot; if it doesn't make it on the first try, it can't come back. It gets bought by someone with entrenched interests, or just FAILs, and the energy is gone. A couple of examples are tablet computers and integrated office software. The Newton and several others had real potential, but ended up (through competitive attrition and bad judgment) squandering their birthrights, killing whole categories. (I was especially fond of Ashton Tate's Framework, but I'm still stuck with Word and Excel.)
What to do? The first thing that comes to mind is Wm. Shakespeare's advice: "Let's kill all the lawyers." But now they're too well armed, I'm afraid. I actually think the best alternative is to find an Open Source initiative of some kind. They seem to maintain accessibility and incremental improvement better than the alternatives. But the industry has gotten big enough so that some kind of organic collaborative mechanism is necessary to get traction.
I also think that there's a dynamic that says that the entrenched interests (especially platforms) require a substantial amount of change - churn - to justify continuing revenue streams; and this absorbs a lot of creative energy that could have been spent in better ways. Look how much time we spend treading water with the newest iteration from Microsoft or Sun or Linux or Firefox, making changes to systems that for the most part work fine already. It's not because they are evil, it's just built into the industry. There's no such thing as Stable Equilibrium; all the feedback mechanisms are positive, favoring change over stability. (Did you ever see a feature withdrawn, or a change retracted?)
The other clue that has been discussed on SO is the Skunkworks Syndrome (ref: Geoffrey Moore): real innovation in large organizations almost always (90%+) shows up in unauthorized projects that emerge spontaneously, fueled exclusively by individual or small group initiative (and more often than not opposed by formal management hierarchies). So: Question Authority, Buck the System.
Computer worms were researched in the early 1980s at the Xerox Palo Alto Research Center.
From John Shoch and Jon Hupp's "The 'Worm' Programs - Early Experience with a Distributed Computation" (Communications of the ACM, March 1982, Volume 25, Number 3, pp. 172-180):
Quoting Alan Kay: "The best way to predict the future is to invent it."
Google's PageRank algorithm. While it could be seen as just a refinement of web-crawling search engines, I would point out that those, too, were developed post-1980.
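For anyone who wants to see why it counts as a real algorithmic idea and not just crawling, here is a minimal sketch of the power-iteration computation at the heart of PageRank; the function name and toy graph below are my own illustration, not Google's code:

    # Minimal power-iteration sketch of the PageRank idea (illustrative only).
    def pagerank(links, damping=0.85, iterations=50):
        """links: dict mapping each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / n for p in pages}
            for page, outgoing in links.items():
                if not outgoing:                      # dangling page: spread its rank evenly
                    for p in pages:
                        new_rank[p] += damping * rank[page] / n
                else:
                    share = damping * rank[page] / len(outgoing)
                    for target in outgoing:
                        new_rank[target] += share
            rank = new_rank
        return rank

    # Toy example: page "a" is linked to most often, so it ends up with the highest rank.
    print(pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]}))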
Software:
Virtualization and emulation
P2P data transfers
community-driven projects like Wikipedia, SETI@home ...
web crawling and web search engines, i.e. indexing information that is spread out all over the world
Hardware:
the modular PC
E-paper
Better user interfaces.
Today’s user interfaces still suck. And I don't mean in small ways but in large, fundamental ways. I can't help but notice that even the best programs still have interfaces that are either extremely complex or that require a lot of abstract thinking in other ways, and that just don't approach the ease of use of conventional, non-software tools.
Granted, this is because software lets you do so much more than conventional tools. That's no reason to accept the status quo, though. Additionally, most software is simply not well done.
In general, applications still lack a certain “just works” feeling and are oriented too much toward what can be done rather than what should be done. One point that has been raised time and again, and that is still not solved, is saving. Applications crash, destroying hours of work. I have the habit of pressing Ctrl+S every few seconds (of course, this no longer works in web applications). Why do I have to do this? It's mind-numbingly stupid. This is clearly a task for automation. Of course, the application also has to save a diff for every modification I make (basically an infinite undo list) in case I make an error.
Solving this problem isn't even actually hard. It would just be hard to implement in every application, since there is no good API for it. Programming tools and libraries have to improve significantly before such functionality can be implemented effortlessly across all platforms and programs, for all file formats, with arbitrary backup storage and no required user interaction. But it is a necessary step before we finally start writing “good” applications instead of merely adequate ones.
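To make this more concrete, here is a rough sketch of what such an autosave layer could look like. This is a hypothetical API of my own (class and method names invented for illustration), not an existing library, and a real implementation would persist to durable backup storage rather than keep everything in memory:

    # Hypothetical autosave layer: records every modification and logs it as a diff,
    # so a crash loses nothing and undo can reach arbitrarily far back.
    import difflib

    class AutoSaver:
        def __init__(self, initial_text=""):
            self.snapshots = [initial_text]   # full states; a real system might store only diffs
            self.diff_log = []                # the "infinite undo" change log

        def record(self, new_text):
            """Called by the application after every modification -- no Ctrl+S needed."""
            old_text = self.snapshots[-1]
            if new_text == old_text:
                return
            diff = "\n".join(difflib.unified_diff(
                old_text.splitlines(), new_text.splitlines(), lineterm=""))
            self.diff_log.append(diff)        # in practice: write to backup storage on disk
            self.snapshots.append(new_text)

        def undo(self):
            """Step back one recorded change; the log is unbounded, so undo is too."""
            if len(self.snapshots) > 1:
                self.snapshots.pop()
                self.diff_log.pop()
            return self.snapshots[-1]

    # Usage: the framework, not the user, calls record() after each edit.
    saver = AutoSaver("hello world")
    saver.record("hello brave new world")
    print(saver.undo())                       # -> "hello world"

The point is not the particular data structure but the division of labor: the platform would provide this as a standard service, and applications would only declare what "a modification" means for their file format.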
I believe that Apple currently comes closest to the “just works” feeling in some regards. Take, for example, the newest version of iPhoto, which features face recognition that automatically groups photos by the people appearing in them. That is a classic task that the user does not want to do manually and doesn't understand why the computer doesn't do automatically. And even iPhoto is still a very long way from a good UI, because said feature still requires ultimate confirmation by the user (for each photo!), since the face-recognition engine isn't perfect.
Open Source community development.