Zarathustra[H]
Extremely [H]
- Joined: Oct 29, 2000
- Messages: 38,907
A compiler could one day be smart enough to notice patterns we're unable to see, or simply brute-force its way to the best possible binary for a given combination of source code and platform hardware.
They already are. Back in my demo scene days in the '90s, when the kids had to squeeze the absolute most they could out of an architecture in order to win the competition, they'd often hyper-optimize their code in assembler and use higher-level languages (mostly C or Pascal) just to loop that assembler code and provide basic logic.
These days very few people do this, partly because for most software there's no point in optimizing anymore. Your MP3 player or word processor is nowhere near challenging for a modern CPU, so it doesn't matter if you can squeeze another percent or two out of it. For most tasks CPU cycles are cheap, because if you don't use them they just sit there doing nothing.
For areas where you actually need to get the most out of your hardware (computation-heavy stuff like encoding, rendering, and code breaking, and to a lesser extent games), limited hand optimization in assembler still happens in some cases, but most of the time a good compiler can get better performance out of C code on a modern system than a skilled programmer can by writing machine code by hand.
The part I find particularly amazing is that Java, the .NET languages, Scala, and other VM-hosted languages can in some situations outperform statically compiled code, because the VM compiles the code as it runs and can take the overall state of the system into account when deciding how to translate it (so-called runtime optimizations).
On average VM-hosted code still performs significantly worse than compiled C-type languages, but there are plenty of cases where the opposite is true, and the VM languages are gaining on compiled languages every year.