I'm learning Haskell, and I find this pretty confusing because -> seems to have two different meanings: (1) separating argument types, and (2) designating the return type:

f :: Int -> Int -> Int
Coming at this de novo, it looks to be saying "a function which takes an Int and returns a function which takes an Int and returns an Int." I.e., as if it were shorthand for this:

f :: Int -> (f :: Int -> Int)
But that's not the case, of course. Is there some reason or logic for this that can help me understand these signatures? Because, for clarity's sake, in my opinion, the two different meanings should have two different representations. E.g.,

f :: Int, Int -> Int
Currently, my way of parsing these is, "Look for the last type in the list ... that's the return type. All the others are argument types."
EDIT: So! It looks like the Haskell syntax really does mean what it says. I probably got confused by some tutorials that tried to simplify things.
submitted by caninestrychnine
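The EDIT's conclusion can be shown directly in code: -> associates to the right, so the two signatures below denote exactly the same type, and partial application falls out for free. (The names f, g, and addTen are hypothetical illustrations, not from the post.)

```haskell
-- -> is right-associative: these two signatures are the same type.
f :: Int -> Int -> Int
f x y = x + y

g :: Int -> (Int -> Int)   -- identical type, parentheses made explicit
g x = \y -> x + y

-- Because f "really" returns a function, partial application just works:
addTen :: Int -> Int
addTen = f 10

main :: IO ()
main = do
  print (f 2 3)       -- 5
  print ((g 2) 3)     -- 5
  print (addTen 32)   -- 42
```

This is why the "last type is the return type, the rest are arguments" parsing trick works: it is just the curried reading applied all the way down.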
[link] [20 comments]
Learning Haskell is pretty mind-blowing at times. I'm trying to really understand this example, found on rosettacode.org:

stripChars :: String -> String -> String
stripChars = filter . flip notElem
Testing in GHCI:
stripChars "aei" "She was a soul stripper. She took my heart!"
"Sh ws soul strppr. Sh took my hrt!"
notElem returns a function that will return True or False depending on whether the element is in the list? This is then passed to flip, which returns another function (with its arguments flipped) to filter?
submitted by joeblessyou
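Roughly, yes. One way to check the intuition is to expand the point-free definition into a pointful one; the expanded version stripChars' below is a hypothetical equivalent, not from the Rosetta Code entry:

```haskell
-- The point-free definition from the post:
stripChars :: String -> String -> String
stripChars = filter . flip notElem

-- flip notElem :: Eq a => [a] -> a -> Bool
-- so (flip notElem cs) is the predicate "is not an element of cs",
-- and composing with filter gives:
stripChars' :: String -> String -> String
stripChars' cs = filter (\c -> c `notElem` cs)

main :: IO ()
main = do
  print (stripChars  "aei" "banana")  -- "bnn"
  print (stripChars' "aei" "banana")  -- "bnn"
```

So flip doesn't receive the result of notElem at runtime; it rearranges notElem's two arguments up front so that the character list comes first, which is exactly the shape filter's predicate needs.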
[link] [6 comments]
What I still don't understand is what reflection is for. The only real-world example given is this:

reify 6 (\p -> reflect p + reflect p)
I do not understand what this is for; I would have just written

(\p -> p + p) 6
How does reflection provide anything useful above just standard argument passing?
The original paper has a blurb about the motivation, describing the "configuration problem", but it just makes it sound like reflection is a complex replacement for ReaderT.
Can someone help me out in understanding this package?
submitted by NiftyIon
[link] [24 comments]
Got something burning inside you? Waiting for the right time to give it voice?
Well, this is it.
HWN solicits Haskell enthusiasts from all walks of life for reviews, reports, criticism pieces, and personal essays about Haskell.
Not only will you get paid, you'll also be supported by a professional team of editors who'll help you say what you mean, help you mean what you say, and make sure you get read from start to finish.
And if you're looking for an advantage in a tough job market, nothing makes you stand out like "I can communicate better than everyone else, here's proof."
- Flexible word count: anywhere between 250 and 750 words is fine.
- Published pieces pay at 68 bitcents, under 200 euros / dollars right now. If you're in the US, I can cut you a check.
- Send me an outline first.
Interested? Email me: email@example.com with a subject line that starts with "[HWN pitch]" followed by the title of your piece. In the body, outline the main points you'll hit and explain how they engage the community.
Best, Kim-Ee Yeoh, Editor of HWN
p.s. Veteran Haskeller Heinrich Apfelmus will feature his article soon. Look out for it.
p.p.s. Missed the latest HWN? It's below. Back issues here.
- Gabriel Gonzalez evaluates Haskell in the style of a State-of-the-Union address. He rates Haskell from Immature to Mature to Best-in-class under 28 headings, the first four being Compilers, Server-side web programming, Scripting / Command-line applications, and Numerical programming. He also recommends libraries and tutorials under each heading. Reverberations on Hacker News and /r/haskell.
- Challenged over claims of FP productivity improvement, Douglas M. Auclair rattles off success stories from his previous work at various subsidiaries of the US Federal Gov fending for the taxpayer to the tune of billions of dollars. Nibbles of interest on Hacker News.
- Aaron Wolf goes from zero programming directly to Haskell and writes of his experience. His favorite learning resource is the Haskell Wikibook, which he can improve as he reads. He is co-founder of Snowdrift.coop, a crowdfunding platform for freely-licensed works. The Haskell Reddit finds Aaron's testimony a change from the "Haskell is too hard for me" meme.
- The season of introspection continues. On the heels of Hu, Hughes, and Wang on "How Functional Programming Mattered" (see previous HWN); Michael Green, Kathleen Fisher, and David Walker track the ebb and flow of research topics in the conference proceedings of the Big Four: Principles of PL (POPL), PL Design and Implementation (PLDI), International Conference on FP (ICFP); and OOP, Systems, Languages, Apps (OOPSLA). No mention of Haskell but if you're looking for a brief history of PL research -- the slides are even more succinct -- this is the only data-driven survey you'll find.
- Doug Beardsley reminds us that date-based version inference cannot replace the role of explicit version upper bounds. The plain reason is that package developers might not be using the latest versions of their dependencies on the day they publish their work. Moreover, among the 72 comments of the /r/haskell convo, Doug observes that Stackage over-conservatively locks to a single version, whereas community-wide adherence to the Package Versioning Policy (PVP) on original Hackage yields seamless delivery of bugfixes and improvements.
- In less than a week, Xmonad will lose its issue tracking system. On Aug 24, Google Code goes read-only. Community heroes Brandon Allbery and Daniel Wagner work at grabbing a backup of the issues. Still no consensus over what and where to migrate to.
- Mark Dominus delves into the bits and bytes of the 1999 Cosmic Call attempt by astrophysicists to contact aliens. He shows the visual bitmaps transmitted into space. Brent Yorgey writes to say he enjoys the 23-part series interspersed with little puzzles.
Quotes of the Week:
- Doug McIlroy: Conditional compilation is admitting defeat.
- /u/kamatsu: I feel like the reason people find Haskell an eye-opening experience is because their CS education was deficient.
- @wfaler: Is there a club to join when you silently sob at having to give your Monad Transformers Monad Transformers? Sounds a lot like #EnterpriseFP
[link] [2 comments]
This is page 6 of the Cosmic Call message. An explanation follows.
The 10 digits again (glyph image):
Page 6 discusses fundamental particles of matter, the structure of the hydrogen and helium atoms, and defines glyphs for the most important chemical elements.
Depicted at top left is the hydrogen atom, with a proton in the center and an electron circulating around the outside. This diagram is equated to the glyph for hydrogen.
The diagram for helium is similar but has two electrons, and its nucleus has two protons and also two neutrons.
The illustrations may puzzle the aliens, depending on how they think of atoms. (Feynman once said that this idea of atoms as little solar systems, with the electrons traveling around the nucleus like planets, was a hundred years old and out of date.) But the accompanying mass and charge data should help clear things up. The first formula says
that the mass of the proton is 1836 times the mass of the electron. That 1836, independent of the units used and believed to be a universal and fundamental constant, ought to be a dead giveaway about what is being discussed here.
If you want to communicate fundamental constants, you have a bit of a problem. You can't tell the aliens that the speed of light is so many furlongs per fortnight without first explaining furlongs and fortnights (as is actually done on a later page). But the proton-electron mass ratio is dimensionless; it's 1836 in every system of units. (Although the value is actually known to be 1836.15267; I don't know why a more accurate value wasn't given.)
This is the first use of subscripts in the document. It also takes care of introducing the symbol for mass. The following formula does the same for charge.
The next two formulas, accompanying the illustration of the helium atom, describe the mass (1.00138 protons) and charge (zero) of the neutron. I wonder why the authors went for the number 1.00138 here instead of writing the neutron-electron mass ratio of 1838, for consistency with the previous ratio. I also worry that this won't be enough for the aliens to be sure about the meaning of the charge symbol. The 1836 is as clear as anything can be, but the 0 and -1 of the corresponding charge ratios could in principle be a lot of other things. Will the context be enough to make clear what is being discussed? I suppose it has to; charge, unlike mass, comes in discrete units and there is nothing like the 1836.
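The two ratios mentioned here are easy to check against each other; multiplying the neutron-in-proton-units figure by the proton-electron ratio recovers the ~1838 neutron-electron ratio the author says would have been more consistent (values taken from the text):

```haskell
-- Consistency check of the two mass ratios discussed above.
main :: IO ()
main = do
  let protonElectron = 1836.15267   -- proton/electron mass ratio
      neutronProton  = 1.00138      -- neutron mass in proton units
  -- neutron/electron ratio = (neutron/proton) * (proton/electron):
  print (neutronProton * protonElectron)   -- roughly 1838.7
```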
The second half of the page reiterates the symbols for hydrogen and helium and defines symbols for eight other chemical elements. Some of these appear in organic compounds that will be discussed later; others are important constituents of the Earth. It also introduces a symbol for “union” or “and”. For example, sodium is described as having 11 protons and 12 neutrons.
Most of these new glyphs are not especially mnemonic, except for hydrogen—and aluminium, which is spectacular. The next article will discuss page 7, shown at right. It has three errors. Can you find them?
I enjoyed the discussions about "What are Haskellers' critiques of Clojure?" (https://www.reddit.com/r/haskell/comments/3gtbzx/what_are_haskellers_critiques_of_clojure/), "What are Haskellers' critiques of Scala?" (https://www.reddit.com/r/haskell/comments/3h7fqr/what_are_haskellers_critiques_of_scala/), and similar threads so much that I'd also like to hear opinions about ML-family languages.
submitted by gsscoder
[link] [111 comments]
I would normally post this in /r/haskellquestions but I feel that this is probably relevant to a broader audience.
I just noticed that the compiled version of a test program runs about twice as slowly (2.23x on average) as the interpreted version of the same program.
I have no clue why this is possible, so I want to ask you to speculate on probable causes and/or ways to track this down.
I am using GHC 7.10.1 and 7.10.2 on a current Arch Linux and Ubuntu 14.04.2. Compiling with and without -fllvm and with several -O levels.
Code to reproduce this can be found here: https://github.com/fhaust/aer-utils (you'll need https://github.com/fhaust/aer too). Just do the usual cabal sandbox setup, then cabal run dance. This will display the time per iteration.
submitted by goliatskipson
[link] [31 comments]
I need to implement a SQL optimizer for a research project. I know how to do that in C++, but I am not really looking forward to it (writing an optimizer is really tedious). I am now wondering whether Haskell would be a good match for the task. But since my Haskell experience is limited, I need to answer a few questions before I can design the system, and I am wondering whether some people in this community would be kind enough to give me some pointers. First question: does anyone see any big problems with using Haskell for this?
The reason I think Haskell is a good match is the following: an optimizer generates a relational algebra expression out of a SQL query. It then converts this expression into other, equivalent expressions and tests them against a cost model. At some point it will decide that the ordering of the operators is good enough (using some heuristics, since optimizing a SQL query is an NP-complete problem) and it will generate an execution plan out of the resulting expression. While you can do this in C++ (of course), Haskell has some benefits:
- Expression conversions need to be correct (otherwise the user will get a wrong result). Since we are talking about an algebra, the type system should help me write correct code.
- It handles a lot of tree structures. Conceptually, for every step the optimizer generates a new tree and throws the old away - this sounds very functional to me.
- Haskell seems to be fast enough (since we have strong storage and execution separation scaling out the processing is trivial and I am willing to sacrifice a few CPU cycles there - my hope is of course that I will be, in return, able to get a better optimizer).
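The "generate a new tree, throw the old one away" style in the second point maps very directly onto an algebraic data type plus pure rewrite functions. A minimal sketch (all constructor and rule names hypothetical, and the predicate kept as an opaque string):

```haskell
-- A toy relational-algebra tree.
data Expr
  = Scan String            -- base table
  | Select String Expr     -- selection with an opaque predicate
  | Join Expr Expr
  deriving (Show, Eq)

-- One classic equivalence: push a selection below a join (pretending,
-- for the sketch, that the predicate only touches the left input).
-- The rewrite returns a new tree; the old one is simply discarded.
pushSelect :: Expr -> Expr
pushSelect (Select p (Join l r)) = Join (Select p l) r
pushSelect e                     = e

main :: IO ()
main = print (pushSelect (Select "x>1" (Join (Scan "t") (Scan "u"))))
-- Join (Select "x>1" (Scan "t")) (Scan "u")
```

Because Expr values are immutable and shared, "throwing away" the old tree costs nothing: unchanged subtrees (Scan "u" above) are reused by the new tree, and the GC reclaims the rest.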
One problem is that the lower level is implemented in C++, so I will have to implement a C interface to communicate with Haskell (but this is feasible). However, I am unsure whether I will be able to handle the following potential problems:
- We use our own threading model (since we have better knowledge about the workload than the OS or another general runtime, like Haskell's). To execute a query you pass a function (which would be Haskell's execution tree) to a client handle that will run the function in another thread. This should be fine unless the Haskell runtime makes assumptions about the threading or tries to do its own threading. Also, are there hidden locks? Since we use fibers, these might result in deadlocks and poor performance (the GC is one problem for sure, but maybe there are others).
- The operators within the execution model will be special iterators. My plan is to translate them to a Haskell list and lazy evaluation should make sure that the iterator does not get forwarded further than needed. Is this assumption correct? I don't want to use a state monad here (or at least I don't want to propagate it all the way up)...
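On the last point, the lazy-list assumption is sound for demand-driven consumption: forcing a prefix of a lazy list forces only that prefix of the underlying computation. A sketch (step and results are hypothetical names standing in for one operator step and its output stream):

```haskell
-- Stand-in for one costly iterator/operator step.
step :: Int -> Int
step n = n * n

-- Conceptually infinite stream of operator results; never fully built.
results :: [Int]
results = map step [1..]

main :: IO ()
main = print (take 3 results)   -- only three steps are ever evaluated
-- [1,4,9]
```

The caveat is that this holds for pure steps; if an operator performs IO (e.g. reading pages via the C interface), the list would need to be produced with something like unsafeInterleaveIO or a streaming library, which is where the state-threading the post wants to avoid tends to creep back in.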
I am grateful for answers, and sorry for the long post and the fuzzy questions.
submitted by cppd
[link] [6 comments]
I am stream-processing lots of data and accumulating some stuff from the stream along the road. The data I accumulate is simply a map (or, sometimes, a map of maps) where a ByteString is the key and an Int is the value. Consider a word-counting example; it is somewhat close to what I am doing. The data itself isn't that huge, it only takes ~200MB when written to disk, but it takes ~15-18 gigabytes in memory when I use Data.Map. I have tried things like IntMap and HashMap, which don't help much in terms of memory. Judy arrays are super fine, but since they can only accept Word as keys they are not very useful in my situation.
So here is the question: is there any "better" implementation of a Map that is more optimised for memory consumption? The current ratio of 200MB:18GB doesn't seem to be very usable...
submitted by alexeyraga
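Before swapping map implementations, one common culprit worth ruling out for a blowup of this shape is laziness rather than the container itself: with the lazy Data.Map API, insertWith accumulates chains of (+1) thunks per key. A sketch of the word-count case using the strict API from containers (wordCount and bump are hypothetical names):

```haskell
import qualified Data.Map.Strict as M
import Data.List (foldl')

-- Data.Map.Strict evaluates stored values to WHNF, and foldl' keeps
-- the accumulator map evaluated, so each count stays a plain Int
-- instead of a growing chain of unapplied (+1) thunks.
wordCount :: String -> M.Map String Int
wordCount = foldl' bump M.empty . words
  where bump m w = M.insertWith (+) w 1 m

main :: IO ()
main = print (wordCount "a b a c b a")
-- fromList [("a",3),("b",2),("c",1)]
```

This won't close a 90x gap on its own (per-node pointer overhead in tree maps is real), but it is cheap to try and often accounts for the bulk of such numbers.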
[link] [13 comments]
Most embedded systems development is done in C. It's rare to see a functional programming language target any kind of microcontroller, let alone an 8-bit microcontroller with only a few kB of RAM. But the team behind the OcaPic project has somehow managed to get OCaml running on a PIC18 microcontroller. To do so, they created an efficient OCaml virtual machine in PIC assembler (~4kB of program memory), and utilized some clever techniques to postprocess the compiled bytecode to reduce heap usage, eliminate unused closures, reduce indirections, and compress the bytecode representation. Even if you're not interested in embedded systems, you may find some interesting ideas there for reducing overheads or dealing with constrained resource budgets.