Lately I've been pondering the programming of quantum computers. The fact that we don't have the hardware yet seems like an awfully bad excuse for not having languages and emulators/simulators. I really don't know much about quantum computing (book recommendations, anyone?), but it seems to be based on a somewhat nondeterministic model. Is it going to be like programming in Prolog, but with backtracking that doesn't cost anything in terms of efficiency? Curious.
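The Prolog analogy can actually be played with in Haskell today: the list monad gives you nondeterministic choice with backtracking, just without the "free" part, since a classical machine still explores the branches one by one. A little sketch of my own (not a claim about how quantum languages will look):

```haskell
import Control.Monad (guard)

-- Nondeterministically pick x and y from {1,2,3}; `guard` prunes
-- (backtracks over) the branches where the sum is odd.
evenPairs :: [(Int, Int)]
evenPairs = do
  x <- [1, 2, 3]
  y <- [1, 2, 3]
  guard (even (x + y))
  return (x, y)

main :: IO ()
main = print evenPairs
```

On a classical machine this enumerates all nine branches; the speculative question above is whether a quantum model could give the same programming style without paying for the branches you discard.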
I am wondering whether there is any interest in doing some research into the low-level performance of GHC-compiled Haskell (duh) applications. It seems to me that getting an idea of cache-miss rates, branch prediction, etc. could be of interest to the Haskell community.
The main problem at the moment lies in the benchmarks that would be needed (of course!). People on #haskell have suggested using the GHC regression suite, but I am not sure those qualify as real-world apps, i.e. the real stuff people use Haskell for.
All input is appreciated.
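As a concrete starting point (my own sketch, not something from the discussion above), one could compile a small, deliberately allocation-heavy program and run the resulting binary under a hardware-counter tool such as Linux's `perf stat -e cache-misses,branch-misses` to get exactly the numbers mentioned above:

```haskell
-- A tiny micro-benchmark candidate: a strict left fold over a large
-- lazily-generated list, which exercises GHC's allocator and produces
-- plenty of pointer chasing for the cache counters to pick up.
import Data.List (foldl')

sumTo :: Int -> Int
sumTo n = foldl' (+) 0 [1 .. n]

main :: IO ()
main = print (sumTo 10000000)
```

Whether such toy programs are representative is, of course, exactly the open question; real applications would be the better subjects.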
In a recent IRC discussion with one of the developers of the Glorious Glasgow Haskell Compiler, I heard that the development version of GHC can now use multiple processors for the same program.
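To make that a bit more concrete, here is a minimal sketch (mine, not from the discussion) of the kind of program that stands to benefit: two independent computations forked as separate Haskell threads, which a multiprocessor-capable runtime can schedule on separate cores. The build and run flags (`ghc -threaded`, `+RTS -N2`) are the modern spelling and an assumption on my part, not something the developer quoted:

```haskell
-- Compile with `ghc -threaded` and run with `+RTS -N2` (assumed flags)
-- so the runtime may use two OS threads.
import Control.Concurrent (forkIO)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)

main :: IO ()
main = do
  done1 <- newEmptyMVar
  done2 <- newEmptyMVar
  -- Two independent workloads; each thread reports its result via an MVar.
  _ <- forkIO (putMVar done1 (sum [1 .. 1000000 :: Int]))
  _ <- forkIO (putMVar done2 (product [1 .. 10 :: Int]))
  s <- takeMVar done1
  p <- takeMVar done2
  print (s, p)
```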