This question arose from comments about different kinds of progress in computing over the last 50 years or so.
I was asked by some of the other participants to raise it as a question to the whole forum.
The basic idea here is not to bash the current state of things but to try to understand something about the progress of coming up with fundamental new ideas and principles.
I claim that we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't really find them, then we should ask "Why?" and "What should we be doing?"
I believe unit testing, test-driven development (TDD), and continuous integration are significant inventions from after 1980.
JIT compilation is usually dated to the late 1980s, though its roots go back further: dynamic compilation appears in Lisp and Smalltalk systems well before then.
Outside of hardware innovations, I tend to find that there is little or nothing new under the sun. Most of the really big ideas date back to people like von Neumann and Alan Turing.
A lot of things that are labelled 'technology' these days are really just a program or library somebody wrote, or a retread of an old idea with a new metaphor, acronym, or brand name.
Tagging, the way information is categorized. Yes, the little boxes of text under each question.
It is amazing that it took about 30 years to invent tagging. Before that we used lists and tables of contents, schemes optimized for printed books.
Still, 30 years is much shorter than the time it took people to realize that printed books could be made in a smaller format, small enough to hold in one hand.
I think the tagging concept is underestimated among core CS people. Most research focuses on natural language processing, a top-down approach. But tagging is the first "language" that both computers and people understand well; it is a bottom-up approach to getting computers to work with natural language.
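To make the bottom-up idea concrete, here is a minimal sketch of tagging as an inverted index: each item carries free-form tags, and the index maps every tag to the items carrying it. All names here are illustrative, not taken from any real tagging system.

```python
# Sketch: tagging as a bottom-up inverted index.
from collections import defaultdict

def build_tag_index(items):
    """Map each tag to the set of item ids that carry it."""
    index = defaultdict(set)
    for item_id, tags in items.items():
        for tag in tags:
            index[tag].add(item_id)
    return index

def find(index, *tags):
    """Items carrying all of the given tags (set intersection)."""
    sets = [index.get(t, set()) for t in tags]
    return set.intersection(*sets) if sets else set()

# Hypothetical questions with their tags:
questions = {
    1: {"python", "parsing"},
    2: {"haskell", "monads"},
    3: {"python", "monads"},
}
index = build_tag_index(questions)
print(find(index, "python"))            # {1, 3}
print(find(index, "python", "monads"))  # {3}
```

No vocabulary is designed up front; the categorization emerges from whatever tags people attach.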
The rediscovery of the monad by functional programming researchers. The monad was instrumental in allowing a pure, lazy language (Haskell) to become a practical tool; it has also influenced the design of combinator libraries (monadic parser combinators have even found their way into Python).
Moggi's "Computational lambda-calculus and monads" (1989) is generally credited with bringing monads into view for effectful computation; Wadler's work (for example, "Imperative functional programming" (1993)) presented monads as a practical tool.
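Since the answer mentions that monadic parser combinators have found their way into Python, here is a minimal sketch of the idea in Python. A parser is a function from an input string to either a (value, remaining-input) pair or None on failure; `unit` and `bind` (illustrative names) give these parsers their monadic structure.

```python
def unit(value):
    """Succeed without consuming input (the monad's return)."""
    return lambda s: (value, s)

def bind(p, f):
    """Run p; on success, feed its result to f to obtain the next parser."""
    def parse(s):
        r = p(s)
        if r is None:
            return None
        value, rest = r
        return f(value)(rest)
    return parse

def char(c):
    """Match a single literal character."""
    return lambda s: (c, s[1:]) if s.startswith(c) else None

# Sequencing via bind: parse 'a', then 'b', and return both.
ab = bind(char('a'), lambda a:
     bind(char('b'), lambda b:
     unit(a + b)))

print(ab("abc"))  # ('ab', 'c')
print(ab("xyz"))  # None
```

The payoff is that sequencing, failure propagation, and result threading are handled once, in `bind`, instead of in every parser.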
BitTorrent. It completely turns what previously seemed like an obviously immutable rule on its head - the time it takes for a single person to download a file over the Internet grows in proportion to the number of people downloading it. It also addresses the flaws of previous peer-to-peer solutions, particularly around 'leeching', in a way that is organic to the solution itself.
BitTorrent elegantly turns what is normally a disadvantage - many users trying to download a single file simultaneously - into an advantage, distributing the file geographically as a natural part of the download process. Its strategy for optimizing the use of bandwidth between two peers discourages leeching as a side-effect - it is in the best interest of all participants to enforce throttling.
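The two mechanisms described above can be sketched in a few lines. This is an illustrative toy, not the real protocol: rarest-first piece selection spreads scarce pieces through the swarm, and tit-for-tat unchoking grants upload slots to the peers that have uploaded to us fastest, which is what discourages leeching. All names and numbers are made up for the example.

```python
# Toy sketch of two BitTorrent ideas (not the actual wire protocol).

def rarest_first(needed, availability):
    """Pick the needed piece held by the fewest peers (rarest first)."""
    return min(needed, key=lambda piece: availability[piece])

def unchoke(peers, upload_rate, slots=4):
    """Unchoke the peers that upload to us fastest (tit for tat)."""
    ranked = sorted(peers, key=lambda p: upload_rate[p], reverse=True)
    return set(ranked[:slots])

# piece id -> number of peers holding that piece (hypothetical swarm):
availability = {0: 5, 1: 1, 2: 3}
print(rarest_first({0, 1, 2}, availability))   # 1

# peer -> bytes/s received from that peer (hypothetical rates):
rates = {"p1": 80, "p2": 10, "p3": 55, "p4": 0, "p5": 90}
print(unchoke(rates.keys(), rates, slots=3))   # {'p1', 'p3', 'p5'}
```

Real clients add refinements (optimistic unchoking, endgame mode), but the incentive structure is already visible here: bandwidth you give determines bandwidth you get.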
It is one of those ideas which, once someone else invents it, seems simple, if not obvious.