Yeah, but sometimes programmers ignore that. For instance, C is defined in terms of chunks of memory that are all alike, but the way people actually write C is by thinking about cache line sizes and disk-read chunks, which means that in practice not all chunks of memory are the same.
The alternative is to write your programs using the model provided and trust that they will be fast enough. I think that's how Haskell is written now, but I'm not sure.
In either case, though, this is a property of how the language is used rather than the language itself, at least as I understand what he means.
I have no idea about the Haskell/CL thing - I don't know much about how either language is used.