Jesper Hald and others recently benchmarked a certain algorithm to figure out which implementation was fastest. It evolved into a kind of competition to write the fastest solution to this problem:
Fill a data structure with 1,000,000 random, unsorted values from 0 to 100
Run through this data structure 200 times, counting:
a) the number of values equal to 42
b) the average of all 1,000,000 * 200 values
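The benchmark procedure above can be sketched in Java. This is my own reconstruction of the task description, not any of the actual contest entries, and the timing code is only illustrative:

```java
import java.util.Random;

public class CountBenchmark {
    public static void main(String[] args) {
        final int N = 1_000_000;
        final int PASSES = 200;

        // Step 1: fill a static array with random values from 0 to 100 (inclusive).
        int[] values = new int[N];
        Random rng = new Random();
        for (int i = 0; i < N; i++) {
            values[i] = rng.nextInt(101);
        }

        // Step 2: run through the array 200 times, counting 42s and summing for the average.
        long start = System.nanoTime();
        long count42 = 0;
        long sum = 0;
        for (int pass = 0; pass < PASSES; pass++) {
            for (int v : values) {
                if (v == 42) count42++;
                sum += v;
            }
        }
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;

        double average = (double) sum / ((long) N * PASSES);
        System.out.println("count of 42s: " + count42);
        System.out.println("average:      " + average);
        System.out.println("elapsed:      " + elapsedMs + " ms");
    }
}
```

Note that the inner loop touches a primitive array directly, which is the closest Java analogue of the Delphi static-array implementations mentioned below.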
The benchmark was run on a new Toshiba laptop with a T5800 CPU (2 GHz, 800 MHz FSB) running 32-bit Windows Vista. Nothing had been done to Vista to make it faster or behave in a special way.
The results were interesting, and our conclusions were:
* The fastest algorithm was written in Delphi (2009 version), was reasonably easy to read, and managed to count the 42-values amongst 200,000,000 values in 55 ms. That's pretty fast: 0.275 nanoseconds per value, or about 0.5 clock cycles per value on a 2 GHz CPU.
* Some of the first attempts in C# were 30-50 times slower than the winner.
* C# and Java were generally about 1.5 times slower than Delphi. Using ordinary generic lists in C# made it 13 times slower than a simple Delphi implementation with static arrays. Is this comparison fair? Well, that's how the first results were produced by the various programmers.
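The generic-list penalty is easy to reproduce in Java as well, and the mechanism is similar: a boxed `List<Integer>` stores each value as a heap object and unboxes it on every read, while a primitive `int[]` is a flat block of memory. This sketch is illustrative only; the ratio you measure will depend on the JVM and warm-up:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class ListVsArray {
    static long countArray(int[] a) {
        long n = 0;
        for (int v : a) if (v == 42) n++;   // direct primitive reads
        return n;
    }

    static long countList(List<Integer> list) {
        long n = 0;
        for (int v : list) if (v == 42) n++; // each element is unboxed here
        return n;
    }

    public static void main(String[] args) {
        Random rng = new Random();
        int[] a = new int[1_000_000];
        List<Integer> list = new ArrayList<>(a.length);
        for (int i = 0; i < a.length; i++) {
            a[i] = rng.nextInt(101);
            list.add(a[i]);                  // autoboxing: each int becomes an Integer
        }

        long t0 = System.nanoTime();
        long c1 = countArray(a);
        long t1 = System.nanoTime();
        long c2 = countList(list);
        long t2 = System.nanoTime();

        System.out.println("array: " + c1 + " in " + (t1 - t0) / 1_000_000 + " ms");
        System.out.println("list:  " + c2 + " in " + (t2 - t1) / 1_000_000 + " ms");
    }
}
```

Both loops count the same values, so any timing difference comes purely from the data representation, which is roughly the trade-off the C# generic-list entries ran into.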
* Using unsafe code in C# seemed an obvious optimization, but it turned out not to be necessary: it was possible to get approximately the same speed in C# without going unsafe.
* Delphi ran at approximately the same speed inside and outside the IDE, whereas C# was 4-5 times slower when running the code inside the IDE.
* PHP was 1000-2500 times slower than Delphi.
* We gave up on BASH scripting because it took too long just to fill the array with values (!)
Please do not generalize from this example; there are many other things in this world than counting and averaging integer values. I'm not saying that Delphi is faster than C# or Java, and always remember that a performance ratio of less than 2 in a specific algorithm is too small to make a difference in a large application.