News aggregator

Chalmers is advertising positions, deadline April 6

General haskell list - Thu, 04/02/2015 - 1:16pm
Dear all, Chalmers University of Technology has 2-3 open positions in the division of Software Technology (which consists of functional programming, language-based security and formal methods). Please consider applying, and/or forward to people you know who would be interested. Application deadline: April 6, 2015 (soon!). The positions: Associate Professor in Software Technology; Lecturer in Software Technology. /Koen
Categories: Incoming News

Announcing: LTS (Long Term Support) Haskell 2

haskell-cafe - Thu, 04/02/2015 - 11:16am
The LTS Haskell 2.0 release is officially out the door. More details are available at: For the tl;dr crowd, run the following inside your project to get a cabal.config with appropriate constraints: wget There are some questions about policy at the end of the post that I'd appreciate feedback on.
Categories: Offsite Discussion

Hindley Milner and Scott Encodings

Haskell on Reddit - Thu, 04/02/2015 - 11:04am

I've been playing around with Hindley-Milner and Scott encodings. I'd love to know what you all think, especially those of you who know more about (sub)types than I do.


submitted by stelleg
Categories: Incoming News

FARM 2015: 2nd Call For Papers

General haskell list - Thu, 04/02/2015 - 10:28am
Dear all, The workshop on Functional Art, Music, Modelling and Design, co-located with ICFP 2015 in Vancouver, may be of interest to some of you. The 2nd call for papers is enclosed; deadline 17 May. Any help spreading the CFP to colleagues and friends who might be interested would be much appreciated. Thanks! /Henrik
Categories: Incoming News

Is `race (readChan a) (readChan b)` safe?

Haskell on Reddit - Thu, 04/02/2015 - 7:50am

Or what about race (threadDelay n) (readChan a)?

Golang has that select thing for selecting from multiple chans, but I'd guess that it's been checked for correctness.

race, however, looks like it could leave one of the actions in a half-assed state. What do y'all make of this?

submitted by hxr
Categories: Incoming News

Haskell Library for

Haskell on Reddit - Thu, 04/02/2015 - 5:20am
Categories: Incoming News

Ian Ross: Non-diffusive atmospheric flow #14: Markov matrix calculations

Planet Haskell - Thu, 04/02/2015 - 4:48am
Non-diffusive atmospheric flow #14: Markov matrix calculations April 2, 2015

This is going to be the last substantive post of this series (which is probably as much of a relief to you as it is to me…). In this article, we’re going to look at phase space partitioning for our dimension-reduced $Z_{500}$ PCA data, and we’re going to calculate Markov transition matrices for our partitions to try to pick out consistent non-diffusive transitions in atmospheric flow regimes.

Phase space partitioning

We need to divide the phase space we’re working in (the unit sphere parameterised by $\theta$ and $\phi$) into a partition of equal-sized components, to which we’ll assign each data point. We’ll produce partitions by dividing the unit sphere into bands in the $\theta$ direction, then splitting those bands in the $\phi$ direction as required. The following figures show the four partitions we’re going to use here:

In each case, the “compartments” of the partition all have the same area on the unit sphere. For partitions 1 and 2, we find the angle $\alpha$ of the boundary of the “polar” components by solving the equation

$$ \int_0^{\alpha} \sin \theta \, d\theta \int_0^{2\pi} d\phi = \frac{4\pi}{C}, $$

where $C$ is the number of components in the partition. For partition 1, with $C=4$, this gives $\alpha_1 = \pi/3$, and for partition 2, with $C=6$, $\alpha_2 = \cos^{-1}(2/3)$.
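Evaluating the $\theta$ integral reduces the equation to $(1 - \cos \alpha) \, 2\pi = 4\pi/C$, i.e. $\alpha = \cos^{-1}(1 - 2/C)$, so the boundary angle can be computed directly. A small self-contained sketch (the function name is ours, not from the post):

```haskell
-- Boundary angle of the polar-cap component for a partition with c
-- equal-area components: solve (1 - cos a) * 2*pi = 4*pi/c for a.
capAngle :: Int -> Double
capAngle c = acos (1 - 2 / fromIntegral c)
```

`capAngle 4` recovers the $\pi/3$ quoted for partition 1, and `capAngle 6` recovers $\cos^{-1}(2/3)$ for partition 2.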

Assigning points in our time series on the unit sphere to partitions is then done by this code (as usual, the code is in a Gist):

-- Partition component: theta range, phi range.
data Component = C { thmin :: Double, thmax :: Double
                   , phmin :: Double, phmax :: Double } deriving Show

-- A partition is a list of components that cover the unit sphere.
type Partition = [Component]

-- Angle for 1-4-1 partition.
th4 :: Double
th4 = acos $ 2.0 / 3.0

-- Partitions.
partitions :: [Partition]
partitions =
  [ [ C 0        (pi/3)   0        (2*pi)
    , C (pi/3)   (2*pi/3) 0        pi
    , C (pi/3)   (2*pi/3) pi       (2*pi)
    , C (2*pi/3) pi       0        (2*pi) ]
  , [ C 0        th4      0        (2*pi)
    , C th4      (pi-th4) 0        (pi/2)
    , C th4      (pi-th4) (pi/2)   pi
    , C th4      (pi-th4) pi       (3*pi/2)
    , C th4      (pi-th4) (3*pi/2) (2*pi)
    , C (pi-th4) pi       0        (2*pi) ]
  , [ C 0      (pi/2) 0  pi
    , C 0      (pi/2) pi (2*pi)
    , C (pi/2) pi     0  pi
    , C (pi/2) pi     pi (2*pi) ]
  , [ C 0      (pi/2) (pi/4)   (5*pi/4)
    , C 0      (pi/2) (5*pi/4) (pi/4)
    , C (pi/2) pi     (pi/4)   (5*pi/4)
    , C (pi/2) pi     (5*pi/4) (pi/4) ] ]

npartitions :: Int
npartitions = length partitions

-- Convert list of (theta, phi) coordinates to partition component
-- numbers for a given partition.
convert :: Partition -> [(Double, Double)] -> [Int]
convert part pts = map (convOne part) pts
  where convOne comps (th, ph) = 1 + length (takeWhile not $ map isin comps)
          where isin (C thmin thmax ph1 ph2) =
                  if ph1 < ph2
                  then th >= thmin && th < thmax && ph >= ph1 && ph < ph2
                  else th >= thmin && th < thmax && (ph >= ph1 || ph < ph2)

The only thing we need to be careful about is dealing with partitions that extend across the zero of $\phi$.
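That wrap-around case is easy to get wrong, so it may help to see it in isolation. A minimal, self-contained version of the membership test (the function name and argument order are ours, mirroring the post's `isin`):

```haskell
-- Is the point (th, ph) inside the component [thmin, thmax) x [ph1, ph2)?
-- When ph1 >= ph2 the phi interval wraps through zero, so the phi test
-- becomes a disjunction instead of a conjunction.
inComp :: Double -> Double -> Double -> Double -> (Double, Double) -> Bool
inComp thmin thmax ph1 ph2 (th, ph)
  | ph1 < ph2 = inTheta && ph >= ph1 && ph < ph2
  | otherwise = inTheta && (ph >= ph1 || ph < ph2)
  where inTheta = th >= thmin && th < thmax
```

For the fourth partition's component `C (pi/2) pi (5*pi/4) (pi/4)`, a point with $\phi$ slightly above zero is correctly accepted, while $\phi = \pi$ is rejected.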

What we’re doing here is really another kind of dimensionality reduction. We’ve gone from our original spatial maps of $Z_{500}$ to a continuous reduced-dimensionality representation via PCA, truncation of the PCA basis and projection to the unit sphere, and we’re now reducing further to a discrete representation: each $Z_{500}$ map in our original time series data is represented by a single integer label giving the partition component in which it lies.

We can now use this discrete data to construct empirical Markov transition matrices.

Markov matrix calculations

Once we’ve generated the partition time series described in the previous section, calculating the empirical Markov transition matrices is fairly straightforward. We need to be careful to avoid counting transitions from the end of one winter to the beginning of the next, but apart from that little wrinkle, it’s just a matter of counting how many times there’s a transition from partition component $j$ to partition component $i$, which we call $T_{ij}$. We also need to make sure that we consider the same number, $N_k$, of points from each of the partition components. The listing below shows the function we use to do this: the function takes as arguments the size of the partition and the time series of partition components as a vector, and returns the transition count matrix $T$.
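The post's listing is cut off in this excerpt. As a rough, self-contained sketch of the counting step (names and representation are ours: one list of component labels per winter, so cross-winter pairs never arise by construction):

```haskell
import qualified Data.Map.Strict as Map

-- Count transitions j -> i over a collection of per-winter label series.
-- Keeping each winter as a separate list means no transition is counted
-- from the end of one winter to the start of the next.
countTransitions :: [[Int]] -> Map.Map (Int, Int) Int
countTransitions winters =
  Map.fromListWith (+)
    [ ((j, i), 1) | w <- winters, (j, i) <- zip w (drop 1 w) ]
```

Normalising each row of the resulting counts by its row sum would then give the empirical transition probabilities.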

Categories: Offsite Blogs

FP Complete: Announcing: LTS (Long Term Support) Haskell 2

Planet Haskell - Thu, 04/02/2015 - 3:00am

The Stackage team is proud to announce the release of LTS Haskell 2. To quote the package page:

LTS Haskell: Version your Ecosystem

LTS (Long Term Support) Haskell is a curated set of packages which includes non-breaking point releases. It is a companion to Stackage Nightly: whereas Stackage Nightly releases include potentially breaking changes with each new release, LTS Haskell maintains major version stability for a longer period of time.

As usual, to start using LTS Haskell, you typically need to run the command wget in your package directory. More detailed instructions are available on the LTS Haskell 2 page itself.
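For reference, the fetched cabal.config is simply a `constraints:` block pinning each package to its LTS 2 version; a hypothetical excerpt (package versions here are illustrative only, not the actual LTS 2 contents) looks like:

```
constraints: base installed,
             aeson ==0.8.0.2,
             text ==1.2.0.4
```

cabal-install then respects these constraints when resolving your project's dependencies.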

This release is significant in that it is the first major version bump we've performed on LTS Haskell. I'm also happy to note that, despite some earlier concerns, both primitive 0.6 and blaze-builder 0.4 made it in, thanks to last minute patches by Emanuel Borsboom, Simon Meier, Edward Kmett, and Gregory Collins.

I'm also happy to announce that, in the three months since LTS 1 was released, there has been a significant surge in involvement from the community. For comparison:

Measurement          LTS 1.0   LTS 2.0
Core packages             29        29
Non-core packages        833      1030
Total packages           862      1059
Unique maintainers        67        96

I'm excited to see the community embrace this project so fully, and look forward to the trend continuing.

The road to 3.0

The current plan is to target the LTS 3.0 release some time around August, depending on when the Hackage ecosystem fully updates to GHC 7.10. The goal is for 3.0 to be the release that switches over to GHC 7.10.

In addition, Daniel Bergey sent an email which resulted in some questions from me about how we should plan and communicate around LTS major bumps. To summarize my goals and ideas:

  • We need to make sure package authors understand when a release is coming out, and the importance of making their packages compatible with upstream dependencies. I believed previously that the issue tracker on the Stackage repo was sufficient to indicate this to authors, but Daniel's questions and other responses I received from package authors tell me that we need some more explicit communication. Perhaps there should be an email 1-2 weeks in advance of the release warning about restrictive upper bounds.
  • How strictly should we adhere to a release schedule? I want to make sure that LTS Haskell is a reliable release, but perhaps providing a release window of 1-2 weeks instead of a hard release date will give package authors some necessary flexibility.
  • Since Stackage Nightly is essentially the testing ground for new LTS major bumps, how aggressive should we be on removing packages with restrictive upper bounds? I've been pretty lenient until now. However, this is a two-edged sword. Allowing upper bounds to creep in makes the lives of some authors easier, but makes the lives of other authors (the ones updating their packages regularly) much more difficult.

I don't want to make any of these decisions myself, as they're pretty central to how the LTS ecosystem is going to operate. If you have thoughts on any of these points, or on points I haven't raised, please bring them up on the Stackage mailing list and/or Reddit.

Categories: Offsite Blogs

What would I need to know to work with haskell?

Haskell on Reddit - Thu, 04/02/2015 - 2:35am

Ok, so I'm learning Haskell. I know a fair bit of it - I can do the stuff on the 99 Haskell problems page, etc. I took a university course in Haskell, I know how recursion and folds work and stuff, and I understand why monads are useful.

But I feel like I'm so, so far from actually making an application with Haskell. When I look at code for applications that actually do exist, I can't understand any of it. And I feel like if I wanted to work with Haskell, I'd have to actually be able to program applications myself.

So what's the next step? What can I do to actually make something in Haskell, and make myself interesting to the job market?

submitted by Krexington_III
Categories: Incoming News

cabal haddock --executables

haskell-cafe - Thu, 04/02/2015 - 2:17am
I'm getting: cabal: internal error when calculating transitive package dependencies. Debug info: [] It seems this has been a problem for quite a while (it has to do with the executable depending on a local library). Is there any known workaround? Thanks, Maurizio
Categories: Offsite Discussion

How to implement a source-sink pattern

haskell-cafe - Wed, 04/01/2015 - 8:31pm
Hi! I'm trying to implement a source-sink type of pattern where there are a number of sinks, each connected to exactly one source, and a number of sources, each connected to zero or more sinks. The program consists of some modules, each defining their own sources and sinks. To illustrate this, here's what this would look like in C:

/* global.h */
struct source {
  char *state;
  /* some more fields */
};

struct sink {
  struct source *source;
  char *state;
  /* some more fields */
};

struct sink **get_sinks_for_source(struct source *source);

/* module_a.c */
struct source a_source, another_source;
struct sink foo, bar, baz;
...
foo.source = &a_source;
...

Since getting the list of sinks for a source is a common operation, I'd probably define some kind of reverse map which is updated when a sink is remapped to a new source. I tried to rebuild this in Haskell, but the result is ridiculously complicated. Since I can't use pointers, I had to define an ordinal type class to enumerate the so
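The excerpt is truncated here, but for comparison, one lightweight way to express this in Haskell (all names ours; a sketch under the assumption that sources and sinks can be keyed by name, not a full answer to the thread) is to have sinks refer to their source by key and derive the reverse mapping instead of maintaining it by hand:

```haskell
import qualified Data.Map.Strict as Map

-- Sinks refer to their source by key rather than by pointer.
data Source = Source { srcState :: String } deriving Show
data Sink   = Sink   { sinkSource :: String, sinkState :: String } deriving Show

-- Derive source -> sinks on demand; for heavy use this could instead be
-- cached in a Map String [String] updated whenever a sink is remapped.
sinksForSource :: String -> Map.Map String Sink -> [Sink]
sinksForSource name = filter ((== name) . sinkSource) . Map.elems

-- Mirroring the C example's module_a.c:
sinkTable :: Map.Map String Sink
sinkTable = Map.fromList
  [ ("foo", Sink "a_source" "ready")
  , ("bar", Sink "a_source" "idle")
  , ("baz", Sink "another_source" "idle") ]
```

Because sinks are plain values in a map, "remapping" a sink to a new source is just inserting an updated record; no mutable reverse pointers are needed.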
Categories: Offsite Discussion