As Knuth said,
We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.
This is something that often comes up in Stack Overflow answers to questions like "which is the most efficient loop mechanism", "SQL optimisation techniques?" (and so on). The standard answer to these optimisation-tips questions is to profile your code first and see whether it's actually a problem; if it's not, your new technique is unneeded.
My question is: if a particular technique is different but not particularly obscure or obfuscated, can that really be considered a premature optimisation?
Here's a related article by Randall Hyde called The Fallacy of Premature Optimization.
Premature optimization to me means trying to improve the efficiency of your code before you have a working system, and before you have actually profiled it and know where the bottleneck is. Even after that, readability and maintainability should come before optimization in many cases.
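As a concrete (and purely illustrative) sketch of that "profile first" step, here is a minimal Python example using the standard cProfile and pstats modules; process_records is just a hypothetical stand-in for whatever code you suspect is slow:

```python
import cProfile
import pstats


def process_records(records):
    # Deliberately straightforward version: easy to read, maybe not the fastest.
    total = 0
    for r in records:
        total += r * r
    return total


def main():
    records = list(range(1_000_000))
    process_records(records)


if __name__ == "__main__":
    profiler = cProfile.Profile()
    profiler.enable()
    main()
    profiler.disable()

    # Only if process_records shows up as a real hotspot here is it worth
    # replacing the plain loop with something less obvious.
    stats = pstats.Stats(profiler).sort_stats("cumulative")
    stats.print_stats(10)
```

If the profiler shows the simple loop is nowhere near the top of the report, rewriting it buys you nothing and costs readability; if it is the hotspot, you now have a measurement to justify (and later verify) the change.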
The way I see it, if you optimize something without knowing how much performance you stand to gain in different scenarios, that IS a premature optimization. The goal of code should really be to make it as easy as possible for humans to read.