News aggregator

I have tried googling the solution to this for hours, but the ubiquitous nature of the terms makes finding the result impossible: how the hell do I actually get HUGS to not throw a shitfit with my lambdas

Haskell on Reddit - Tue, 09/09/2014 - 3:04am

I'm working through Programming in Haskell by Graham Hutton. Everything was going fine when I'd work along with him in the book, writing out my little Haskell code in my test.hs file as he defined new functions. Until the lambdas came. The lambdas ruined everything. It's not the lambdas themselves, though. I get currying, lambda abstraction, etc. It's the code. The actual Haskell syntax that lets me use the lambdas. HOW?!

How do I actually use " \ " in my code? Whenever I try and type something like this:

\x -> x+x

HUGS tells me: "functions.hs":17 - Syntax error in input (unexpected backslash (lambda))

HUGS likes all of the other functions I've defined just fine. Can I just not type anonymous functions like that all alone in my source code? Can they only be used in defining more complex functions?
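For what it's worth, a sketch of the distinction (my own illustration, not from the thread): a lambda is an expression, and a source file may only contain declarations, so a bare lambda in a file must be bound to a name or embedded in another expression:

```haskell
-- A lambda on its own is an expression, not a declaration,
-- so in a source file it must be bound to a name...
double :: Int -> Int
double = \x -> x + x

-- ...or used inline inside another expression:
doubledAll :: [Int]
doubledAll = map (\x -> x + x) [1, 2, 3]
```

At the interpreter prompt, by contrast, bare expressions are fine: `Hugs> (\x -> x + x) 21` evaluates to 42 with no complaint.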

submitted by socratesthefoolish
[link] [5 comments]
Categories: Incoming News

Dominic Steinitz: Fun with (Extended Kalman) Filters

Planet Haskell - Tue, 09/09/2014 - 2:28am

An extended Kalman filter in Haskell using type level literals and automatic differentiation to provide some guarantees of correctness.

Population Growth

Suppose we wish to model population growth of bees via the logistic equation

  dp/dt = r p (1 - p/k)

where p is the population, r the growth rate and k the carrying capacity.

We assume the growth rate r is unknown and drawn from a normal distribution, but the carrying capacity k is known, and we wish to estimate the growth rate by observing noisy values of the population at discrete times. Note that p is entirely deterministic and its stochasticity is only a result of the fact that the unknown parameter of the logistic equation is sampled from a normal distribution (we could, for example, be observing different colonies of bees: we know from the literature that bee populations obey the logistic equation, and each colony will have a different growth rate).

Haskell Preamble

> {-# OPTIONS_GHC -Wall #-}
> {-# OPTIONS_GHC -fno-warn-name-shadowing #-}
> {-# OPTIONS_GHC -fno-warn-type-defaults #-}
> {-# OPTIONS_GHC -fno-warn-unused-do-bind #-}
> {-# OPTIONS_GHC -fno-warn-missing-methods #-}
> {-# OPTIONS_GHC -fno-warn-orphans #-}

> {-# LANGUAGE DataKinds #-}
> {-# LANGUAGE ScopedTypeVariables #-}
> {-# LANGUAGE RankNTypes #-}
> {-# LANGUAGE BangPatterns #-}
> {-# LANGUAGE TypeOperators #-}
> {-# LANGUAGE TypeFamilies #-}

> module FunWithKalman3 where

> import GHC.TypeLits
> import Numeric.LinearAlgebra.Static
> import Data.Maybe ( fromJust )
> import Numeric.AD

> import Data.Random.Source.PureMT
> import Data.Random
> import Control.Monad.State
> import qualified Control.Monad.Writer as W
> import Control.Monad.Loops

Logistic Equation

The logistic equation is a well known example of a dynamical system which has an analytic solution

  p(t) = k p_0 e^{rt} / (k + p_0 (e^{rt} - 1))

where p_0 is the population at time 0.

Here it is in Haskell

> logit :: Floating a => a -> a -> a -> a
> logit p0 k x = k * p0 * (exp x) / (k + p0 * (exp x - 1))

We observe a noisy value of the population at regular time intervals

  y_i = p(i Δt) + ε_i

(where Δt is the time interval and ε_i is the observation noise).

Using the semi-group property of our dynamical system, we can re-write this as a step-by-step update

  p((n+1) Δt) = k p(n Δt) e^{r Δt} / (k + p(n Δt) (e^{r Δt} - 1))

To convince yourself that this re-formulation is correct, think of the population as starting at p_0; after 1 time step it has reached logit p_0 k (r Δt), and after two time steps it has reached logit p_0 k (2 r Δt), and this ought to be the same as the point reached after 1 time step starting at logit p_0 k (r Δt), for example

> oneStepFrom0, twoStepsFrom0, oneStepFrom1 :: Double
> oneStepFrom0  = logit 0.1 1.0 (1 * 0.1)
> twoStepsFrom0 = logit 0.1 1.0 (1 * 0.2)
> oneStepFrom1  = logit oneStepFrom0 1.0 (1 * 0.1)

ghci> twoStepsFrom0
0.11949463171139338

ghci> oneStepFrom1
0.1194946317113934

We would like to infer the growth rate, not just be able to predict the population, so we need to add another variable to our model.

Extended Kalman

This is almost in the form suitable for estimation using a Kalman filter but the dependency of the state on the previous state is non-linear. We can modify the Kalman filter to create the extended Kalman filter (EKF) by making a linear approximation.

Since the measurement update is trivially linear (even in this more general form), the measurement update step remains unchanged.

By Taylor we have

  a(x) ≈ a(x̄) + A(x̄) (x - x̄) + ψ

where A(x̄) is the Jacobian of a evaluated at x̄ (for the exposition of the extended filter we take the state to be vector valued, hence the use of a bold font). We take the remainder ψ to be normally distributed with a mean of 0 and ignore any difficulties there may be with using Taylor with stochastic variables.

Applying this at the current estimate x̂ we have

  a(x) ≈ a(x̂) + A(x̂) (x - x̂) + ψ

Using the same reasoning as we did for Kalman filters, and writing A for A(x̂), we obtain the modified prediction step

  x̂' = a(x̂)
  Σ̂' = A Σ̂ Aᵀ + Σ_x

Haskell Implementation

Note that, in the case of the extended Kalman filter, we pass in the Jacobian of the update function as a function itself, rather than the matrix representing the linear function as we do in the case of the classical Kalman filter.

> k, p0 :: Floating a => a
> k = 1.0
> p0 = 0.1 * k

> r, deltaT :: Floating a => a
> r = 10.0
> deltaT = 0.0005

Relating ad and hmatrix is somewhat unpleasant but this can probably be ameliorated by defining a suitable datatype.

> a :: R 2 -> R 2
> a rpPrev = rNew # pNew
>   where
>     (r, pPrev) = headTail rpPrev
>     rNew :: R 1
>     rNew = konst r
>
>     (p, _) = headTail pPrev
>     pNew :: R 1
>     pNew = fromList $ [logit p k (r * deltaT)]

> bigA :: R 2 -> Sq 2
> bigA rp = fromList $ concat $ j [r, p]
>   where
>     (r, ps) = headTail rp
>     (p, _)  = headTail ps
>     j = jacobian (\[r, p] -> [r, logit p k (r * deltaT)])

For some reason, hmatrix with static guarantees does not yet provide an inverse function for matrices.

> inv :: (KnownNat n, (1 <=? n) ~ 'True) => Sq n -> Sq n
> inv m = fromJust $ linSolve m eye

Here is the extended Kalman filter itself. The type signatures on the expressions inside the function are not necessary but did help the implementor discover a bug in the mathematical derivation and will hopefully help the reader.

> outer :: forall m n . (KnownNat m, KnownNat n,
>                        (1 <=? n) ~ 'True, (1 <=? m) ~ 'True) =>
>          R n -> Sq n ->
>          L m n -> Sq m ->
>          (R n -> R n) -> (R n -> Sq n) -> Sq n ->
>          [R m] ->
>          [(R n, Sq n)]
> outer muPrior sigmaPrior bigH bigSigmaY
>       littleA bigABuilder bigSigmaX ys = result
>   where
>     result = scanl update (muPrior, sigmaPrior) ys
>
>     update :: (R n, Sq n) -> R m -> (R n, Sq n)
>     update (xHatFlat, bigSigmaHatFlat) y =
>       (xHatFlatNew, bigSigmaHatFlatNew)
>       where
>
>         v :: R m
>         v = y - (bigH #> xHatFlat)
>
>         bigS :: Sq m
>         bigS = bigH <> bigSigmaHatFlat <> (tr bigH) + bigSigmaY
>
>         bigK :: L n m
>         bigK = bigSigmaHatFlat <> (tr bigH) <> (inv bigS)
>
>         xHat :: R n
>         xHat = xHatFlat + bigK #> v
>
>         bigSigmaHat :: Sq n
>         bigSigmaHat = bigSigmaHatFlat - bigK <> bigS <> (tr bigK)
>
>         bigA :: Sq n
>         bigA = bigABuilder xHat
>
>         xHatFlatNew :: R n
>         xHatFlatNew = littleA xHat
>
>         bigSigmaHatFlatNew :: Sq n
>         bigSigmaHatFlatNew = bigA <> bigSigmaHat <> (tr bigA) + bigSigmaX

Now let us create some sample data.

> obsVariance :: Double
> obsVariance = 1e-2

> bigSigmaY :: Sq 1
> bigSigmaY = fromList [obsVariance]

> nObs :: Int
> nObs = 300

> singleSample :: Double -> RVarT (W.Writer [Double]) Double
> singleSample p0 = do
>   epsilon <- rvarT (Normal 0.0 obsVariance)
>   let p1 = logit p0 k (r * deltaT)
>   lift $ W.tell [p1 + epsilon]
>   return p1

> streamSample :: RVarT (W.Writer [Double]) Double
> streamSample = iterateM_ singleSample p0

> samples :: [Double]
> samples = take nObs $ snd $
>           W.runWriter (evalStateT (sample streamSample) (pureMT 3))

We created our data with a growth rate of

ghci> r
10.0

but let us pretend that we have read the literature on growth rates of bee colonies: we have some big doubts about the growth rate but are almost certain about the size of the colony at t = 0.

> muPrior :: R 2
> muPrior = fromList [5.0, 0.1]
>
> sigmaPrior :: Sq 2
> sigmaPrior = fromList [ 1e2, 0.0
>                       , 0.0, 1e-10
>                       ]

We only observe the population and not the rate itself.

> bigH :: L 1 2
> bigH = fromList [0.0, 1.0]

Strictly speaking this should be 0 but this is close enough.

> bigSigmaX :: Sq 2
> bigSigmaX = fromList [ 1e-10, 0.0
>                      , 0.0, 1e-10
>                      ]

Now we can run our filter and watch it switch away from our prior belief as it accumulates more and more evidence.

> test :: [(R 2, Sq 2)]
> test = outer muPrior sigmaPrior bigH bigSigmaY
>              a bigA bigSigmaX (map (fromList . return) samples)

Categories: Offsite Blogs

End of support for test-framework-smallcheck and test-framework-golden

haskell-cafe - Mon, 09/08/2014 - 10:59pm
I'm going to stop maintaining the test-framework-smallcheck and test-framework-golden packages. I recommend switching to tasty's tasty-smallcheck and tasty-golden, respectively. If you are interested in taking over these test-framework packages, let me know. Otherwise, I will continue supporting them for the next 3 months and abandon/deprecate them afterwards.

Roman
Categories: Offsite Discussion

Yesod Web Framework: Misassigned credit for conduit

Planet Haskell - Mon, 09/08/2014 - 6:00pm

When I was at ICFP last week, it became clear that I had made a huge mistake in the past three years. A few of us were talking, including Erik de Castro Lopo, and when I mentioned that he was the original inspiration for creating the conduit package, everyone else was surprised. So firstly: Erik, I apologize for not making it clear that you initially kicked off development by finding some fun corner cases in enumerator that were difficult to debug.

So to rectify that, I think it's only fair that I write the following:

  • conduit is entirely Erik's fault.
  • If you love conduit, write Erik a thank you email.
  • More importantly, if you hate conduit, there's no need to complain to me anymore. Erik presumably will be quite happy to receive all such further communications.
  • In other words, it's not my company, I just work here.

Thanks Erik :)

UPDATE Please also read my follow-up blog post clarifying this one, just in case you're confused.

Categories: Offsite Blogs

[code-review] Compute the most depended on packages in Hackage.

Haskell on Reddit - Mon, 09/08/2014 - 5:44pm

If you're interested in helping someone become a better Haskell programmer, I would appreciate a code review along with feedback & criticism. Thanks in advance!

Compute the most depended on packages in Hackage by requesting a list of all packages along with their cabal files. For each cabal file, get a list of unique dependencies. Count these up to determine how many packages depend on a particular package. Sort.
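The counting-and-sorting step described above can be sketched in a few lines (my own sketch, not the poster's code; `countDeps` is a hypothetical name, and it assumes the dependencies have already been parsed out of the cabal files into one list of names per package):

```haskell
import Data.List (nub, sortBy)
import Data.Ord (Down (..), comparing)
import qualified Data.Map.Strict as Map

-- For each package, count how many packages depend on it.
countDeps :: [[String]] -> [(String, Int)]
countDeps = sortBy (comparing (Down . snd))  -- most depended-on first
          . Map.toList
          . Map.fromListWith (+)             -- tally occurrences
          . map (\d -> (d, 1))
          . concatMap nub                    -- unique deps per package
```

`Map.fromListWith (+)` does the tallying in one pass, which avoids the quadratic cost of counting with list functions.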


I am comfortable with Haskell, and really enjoy it, but to be shamefully honest it took me 4 hours to do this. I think it is because I am not familiar with the libraries -- a large portion of the time was spent reading documentation and library code & tests (to see how to use the library!). Any advice on how to be more efficient? Two things I have started doing are trying to use command-line hoogle and haskell-mode more often, and growing a "cheat sheet", which is just a super dense collection of functions and their types. I write this out by hand.

Another thing I am wary of is that Haskell is beautiful, but when I try to write something 'real' or 'productive' in it, I generally hack my way through it and end up with.. not so beautiful code. Any advice on how to grow or build great Haskell code from the beginning?


I have the Control.Concurrent.Async code commented out because when I have ulimit -n set to high enough, I eventually hit this problem:

file descriptor 1024 out of range for select (0--1024). Recompile with -threaded to work around this.

But, when I compile with -threaded, the program quickly hits this notorious issue:

getAddrInfo: does not exist (Name or service not known)

Google and friends will tell you that you need 'withSocketsDo' on Windows. But I am on a Linux machine, and I cannot find out how else to debug this. I have seen this error for a bad / ill-formed URL, but I don't think that is the case here because the serial map runs fine.

Streaming library?

Is there any advantage to using a streaming library (pipes or conduit) to construct this code?

As far as performance goes, it appears to be okay (besides the parallelism trouble described above):

time cabal run +RTS -s
Preprocessing executable 'hackage-mining' for hackage-mining-
[("base",6533),("bytestring",2249),("containers",2223),("mtl",1817),("text",1270),("transformers",1155),("directory",1032),("filepath",969),("time",818),("array",687)]

   1,941,217,280 bytes allocated in the heap
     157,663,360 bytes copied during GC
       9,957,384 bytes maximum residency (13 sample(s))
         182,664 bytes maximum slop
              27 MB total memory in use (0 MB lost due to fragmentation)

                                    Tot time (elapsed)  Avg pause  Max pause
  Gen  0      3728 colls,     0 par    0.18s    0.18s     0.0000s    0.0008s
  Gen  1        13 colls,     0 par    0.10s    0.10s     0.0079s    0.0127s

  TASKS: 4 (1 bound, 3 peak workers (3 total), using -N1)

  SPARKS: 0 (0 converted, 0 overflowed, 0 dud, 0 GC'd, 0 fizzled)

  INIT    time    0.00s  (  0.00s elapsed)
  MUT     time    0.81s  (2205.36s elapsed)
  GC      time    0.28s  (  0.28s elapsed)
  EXIT    time    0.00s  (  0.00s elapsed)
  Total   time    1.09s  (2205.64s elapsed)

  Alloc rate    2,401,538,990 bytes per MUT second

  Productivity  74.2% of total user, 0.0% of total elapsed

gc_alloc_block_sync: 0
whitehole_spin: 0
gen[0].sync: 0
gen[1].sync: 0

real	36m45.647s
user	0m47.507s
sys	0m10.817s

submitted by brooksbp
[link] [2 comments]
Categories: Incoming News

Use Haskell shared library with foreign exports with dlopen/dlsym

haskell-cafe - Mon, 09/08/2014 - 3:05pm
Hello Cafe,

I am trying to use a Haskell shared library with foreign exports from Haskell again via dlopen/dlsym. Sadly it segfaults, and the segfaults happen on dlclose during garbage collection points (as figured out by monochrom in #haskell). So right now I can only open a library once and may not dlclose it. Can someone point me to a mistake I made, or is this rather a ghc (7.8.3) bug? Please see attached minimal example.

Regards,
Tobias

test.hs:

module Main where

import qualified System.Posix.DynamicLinker as DL
import Foreign

foreign import ccall "dynamic" mkTest :: FunPtr Int -> Int

main = do
  DL.withDL ("./") [DL.RTLD_NOW] $ \dl -> do
    dimPtr <- DL.dlsym dl "test"
    let test = mkTest dimPtr
    print test

libtest.hs:

module Test() where

import Foreign

foreign export ccall test :: Int

test :: Int
test = 124

build with:

ghc --make -shared -dynamic -fPIC libtest.hs -o
ghc --make -dynamic test.hs
Categories: Offsite Discussion

HTML5-compliant parsers and CSS selectors

Haskell on Reddit - Mon, 09/08/2014 - 3:02pm

I need to do two tasks, and I'm searching for Haskell libraries to

  • Convert HTML documents in the wild to a DOM structure using the HTML5 parsing algorithm. This rules out xmlhtml but apparently - from a 2010 discussion - TagSoup qualifies as an HTML5 parser. Is this accurate?

  • Run CSS selectors on the resulting DOM. selectors from Hackage seem to qualify - but it seems to be based on xml-conduit, not sure how it compares to TagSoup (or whether they can be used together), but it seems to be about XML parsing, not HTML5. There's also dom-selector, and, what seems more promising, HXT combined with HandsomeSoup.

TagSoup can be interfaced with HXT using hxt-tagsoup. Is this the way to go? Should I use some other combination of libraries?

The code at the HandsomeSoup Github looks good and does what I want,

import Text.XML.HXT.Core
import Text.HandsomeSoup

main = do
  let doc = fromUrl ""
  links <- runX $ doc >>> css "h3.r a" ! "href"
  mapM_ putStrLn links

Is it easy to switch it to use hxt-tagsoup? (or rather, would it make sense?)

submitted by protestor
[link] [2 comments]
Categories: Incoming News

ANNOUNCE: buildable-

haskell-cafe - Mon, 09/08/2014 - 2:32pm
Have you ever wanted to deal with the builders for various data types in a polymorphic/overloaded fashion? I'm needing to do so and couldn't find any existing code that did so, so I decided to rectify this:

As a (very contrived) example:

λ> build ((365 :: Dec Int) <| fromValue (Char7 ' ') |> (365 :: BigEndian Int16) |> (" omega=
Categories: Offsite Discussion

Working with haskell in school

Haskell on Reddit - Mon, 09/08/2014 - 10:29am

So I'm a beginner with Haskell, and I'm currently working on a school assignment with Haskell and have gotten stuck at number 5.

The assignments are here

I haven't found anything on how to do it, so I would love any help I can get.

I don't want you to do it for me; I just want some help so I can solve it on my own.

And something on assignment 5.1 would be great too.

*edit: Cleaned up a bit and added a link to the assignment

submitted by Fassticman
[link] [4 comments]
Categories: Incoming News

The GHC Team: Haskell Implementors Workshop 2014 videos available!

Planet Haskell - Mon, 09/08/2014 - 9:55am

Without further ado, here's the HIW 2014 YouTube playlist (kindly provided by Malcolm Wallace)

Categories: Offsite Blogs

introspection at runtime

Haskell on Reddit - Mon, 09/08/2014 - 9:41am

Can someone explain to me why it is impossible to introspect into a function at runtime in Haskell?

submitted by felipeZ
[link] [6 comments]
Categories: Incoming News

FFI question

haskell-cafe - Mon, 09/08/2014 - 8:11am
Hello, I need to refresh my memory about the FFI because I need to fix a bug (plus I went down a rat hole because of the Ukraine war :-(). Following is a snippet of something I wrote:
Categories: Offsite Discussion

Jan Stolarek: Promoting functions to type families in Haskell

Planet Haskell - Mon, 09/08/2014 - 5:02am

It’s been very quiet on the blog these past few months, not because I’m spending less time on functional programming but precisely for the opposite reason. Since January I’ve been working together with Richard Eisenberg to extend his singletons library. This work was finished in June, and last Friday I gave a talk about our research at the 2014 Haskell Symposium. This was the first time I’ve been to ICFP and the Haskell Symposium. It was pretty cool to finally meet all these people I know only from IRC. I also admit that the atmosphere of the conference quite surprised me, as it often felt like some sort of fan convention rather than the biggest event in the field of functional programming.

The paper Richard and I published is titled “Promoting Functions to Type Families in Haskell”. This work is based on Richard’s earlier paper “Dependently typed programming with singletons”, presented two years ago at the Haskell Symposium. Back then Richard presented the singletons library, which uses Template Haskell to generate singleton types and functions that operate on them. Singleton types are types that have only one value (aside from bottom), which allows one to reason about runtime values during compilation (some introduction to singletons can be found in this post on Richard’s blog). This smart encoding allows us to simulate some of the features of dependent types in Haskell. In our current work we extended the promotion capabilities of the library. Promotion is only concerned with generating type-level definitions from term-level ones. The type-level language in GHC has become quite expressive during the last couple of years, but it is still missing many features available in the term-level language. Richard and I have found ways to encode almost all of these missing features using the already existing type-level language features. What this means is that you can write a normal term-level definition and our library will automatically generate an equivalent type family. You’re only forbidden from using infinite terms, the do-notation, and decomposing String literals to Chars. Numeric literals are also very problematic and the support is very limited, but some of the issues can be worked around. What is really cool is that our library allows you to have partial application at the type level, which GHC normally prohibits.
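To make concrete what promotion means (a hand-written sketch, not the library's actual Template Haskell output), here is a term-level `append` next to the equivalent closed type family, roughly what the singletons machinery would generate from the same definition:

```haskell
{-# LANGUAGE DataKinds, PolyKinds, TypeFamilies, TypeOperators #-}

-- The ordinary term-level function...
append :: [a] -> [a] -> [a]
append []       ys = ys
append (x : xs) ys = x : append xs ys

-- ...and its promotion to the type level: the same equations,
-- restated as a closed type family over type-level lists.
type family Append (xs :: [a]) (ys :: [a]) :: [a] where
  Append '[]       ys = ys
  Append (x ': xs) ys = x ': Append xs ys
```

With this, `Append '[Int, Bool] '[Char]` reduces to `'[Int, Bool, Char]` during type checking, mirroring what `append` does at runtime.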

You can learn more by watching my talk on YouTube, reading the paper, or reading the singletons documentation. Here I’d like to add some information that is not present in the paper. First of all, the paper was concerned only with promotion and didn’t say anything about singletonization. But as we enabled more and more language constructs to be promoted, we also made them singletonizable. So almost everything that can be promoted can also be singletonized. The most notable exception to this rule is type classes, which are not implemented at the moment.

An interesting issue was raised by Adam Gundry in a question after the talk: what about the difference between lazy term-level semantics and strict type-level semantics? You can listen to my answer in the video, but I’ll elaborate some more on it here. At one point during our work we were wondering about this issue and decided to find an example of an algorithm that crucially relies on laziness to work, i.e. fails to work with strict semantics. I think it’s not straightforward to come up with such an algorithm, but luckily I recalled the backwards state monad from Philip Wadler’s paper “The essence of functional programming”1. The bind operator of that monad looks like this (definition copied from the paper):

m `bindS` k = \s2 -> let (a,s0) = m s1
                         (b,s1) = k a s2
                     in  (b,s0)

The tricky part here is that the output of the call to m becomes the input to the call to k, while the output of the call to k becomes the input of m. Implementing this in a strict language does not look at all straightforward. So I promoted that definition expecting it to fail spectacularly, but to my surprise it worked perfectly fine. After some investigation I understood what’s going on. Type-level computations performed by GHC are about constraint solving. It turns out that GHC is able to figure out in which order to solve these constraints and get the result. It’s exactly analogous to what happens with the term-level version at runtime: we have an order of dependencies between the closures, and there is a way in which we can run these closures to get the final result.
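To see the backwards flow concretely, here is a minimal sketch (my own, not from the paper or the singletons code) wrapping `bindS` in a newtype. Laziness is essential: `get` observes a value that `put` writes later in the computation, because the state threads from the end of the program back to the beginning:

```haskell
-- Wadler's backwards state monad: the state a computation *receives*
-- is the state produced by the computations that come *after* it.
newtype BackState s a = BackState { runBackState :: s -> (a, s) }

instance Functor (BackState s) where
  fmap f m = BackState $ \s -> let (a, s') = runBackState m s
                               in  (f a, s')

instance Applicative (BackState s) where
  pure a = BackState $ \s -> (a, s)
  mf <*> mx = BackState $ \s2 ->
    let (f, s0) = runBackState mf s1   -- state flows right to left
        (x, s1) = runBackState mx s2
    in  (f x, s0)

instance Monad (BackState s) where
  m >>= k = BackState $ \s2 ->
    let (a, s0) = runBackState m s1    -- m sees the state k produces
        (b, s1) = runBackState (k a) s2
    in  (b, s0)

get :: BackState s s
get = BackState $ \s -> (s, s)

put :: s -> BackState s ()
put s' = BackState $ \_ -> ((), s')

-- `get` observes the value that `put` writes one line later:
example :: (Int, Int)
example = runBackState prog 0
  where
    prog = do
      x <- get     -- x is 10, the value put below
      put 10
      return x
-- example evaluates to (10, 10)
```

In a strict language the `let` bindings in `>>=` would deadlock (s1 is needed before it is defined textually), but lazy evaluation resolves the dependencies in the order they can actually be computed, which is precisely what GHC's constraint solver does at the type level.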

All of this work is a small part of a larger endeavour to push Haskell’s type system towards dependent types. With singletons you can write type-level functions easily by writing their definitions using the term-level language and then promoting these definitions. And then you can singletonize your functions to work on singleton types. There were two other talks about dependent types during the conference: Stephanie Weirich’s “Depending on Types” keynote lecture during ICFP and Richard’s “Dependent Haskell” talk during the Haskell Implementors Workshop. I encourage everyone interested in Haskell’s type system to watch both of these talks.

  1. The awful truth is that this monad does not really work with the released version of singletons. I only realized that when I was writing this post. See issue #94 on singletons bug tracker.
Categories: Offsite Blogs