Performance of interpreted languages
I'm reading a tutorial on Ruby that starts with an introduction to the basics of programming (although I already know some). It includes the obligatory mention of the two kinds of programming languages, compiled and interpreted. Compiled programs don't have to be interpreted on the fly, which seems to make their performance inherently better.
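For concreteness, here's the kind of thing I have in mind — a toy benchmark I wrote myself (not from the tutorial), where the interpreter has to dispatch every iteration of the loop at run time, while a compiler would have emitted native machine code for the whole loop up front:

```ruby
require 'benchmark'

# Sum the integers 1..n in pure Ruby. Each iteration goes through the
# interpreter's dispatch loop; compiled code would run the same loop
# as native instructions with no per-iteration interpretation cost.
n = 10_000_000
sum = 0
time = Benchmark.realtime do
  (1..n).each { |i| sum += i }
end
puts "sum=#{sum} in #{format('%.3f', time)}s"
```

(The absolute numbers obviously depend on the machine and Ruby version; the point is just where the overhead comes from.)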
However, advances in interpreted languages and the use of caching have apparently narrowed that gap considerably.
I'm curious, though: it can't be that caching and tricks like it are the only ways interpreted languages gain performance. There must be something more, something closer to the nature of these programs. When I think about caching, I usually think of something that isn't ideal but has to be used, or else things will be slow. Caching always implies an up-front delay we'd rather not have, right? It can't be an ideal solution.
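To make concrete the trade-off I mean, here's a toy memoization sketch (my own example, not something Ruby does internally): the first call pays the full cost of computing each value, and only later calls get to hit the cache.

```ruby
# Naive recursive Fibonacci made fast by caching results in a hash.
# The speedup exists only because the "slow" computation has already
# run once and filled the cache -- the trade-off I'm asking about.
FIB_CACHE = {}

def fib(n)
  return n if n < 2
  FIB_CACHE[n] ||= fib(n - 1) + fib(n - 2)
end

puts fib(40) # => 102334155, computed once, then served from FIB_CACHE
```

Without the cache this same recursion would take exponentially many calls, so the cache is clearly useful — but it costs memory and that first slow pass, which is why it doesn't feel like an *ideal* answer to me.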
So what else is there?