# News aggregator

### Will the performance of lazy code get better with improvements in AI?

I often see experienced Haskellers adding strictness annotations to a lot of performance critical code. I, however, can't see why laziness would ever be a bad thing. The laziest program would be one that calculated the bare minimum needed to get an answer, essentially deferring all work that was unnecessary. This leads me to believe that it is how the compiler knows when to be lazy that is the problem, not laziness itself. Would machine learning or artificial intelligence help in this regard?
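(Editor's note: the strictness annotations the question mentions usually target thunk build-up, not laziness per se. A minimal sketch of the classic case, not from the post:)

```haskell
-- foldl builds a long chain of unevaluated thunks before anything is
-- forced; the strict foldl' evaluates the accumulator at each step.
import Data.List (foldl')

lazySum :: [Int] -> Int
lazySum = foldl (+) 0    -- thunks: (((0+1)+2)+3)...; O(n) extra space

strictSum :: [Int] -> Int
strictSum = foldl' (+) 0 -- accumulator forced each step; O(1) extra space
```

Both compute the same answer; the strictness only changes *when* the work happens, which is exactly what the annotations are tuning.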

submitted by briticus557[link] [18 comments]

### Given that we aren’t going to put any effort into testing, what’s the best way to test?

### http://www.cs.nott.ac.uk/~gmh/monads

### Philip Wadler: CITIZENFOUR showing in Edinburgh

Kickstarter-like, OurScreen will arrange to show a film if enough people sign up to see it. I've scheduled a showing:

Cameo Edinburgh, 12 noon, Tuesday 2 December 2014. Cost: £6.80. Book tickets from OurScreen.

If 34 people sign up by Sunday 23 November, then the showing will go ahead.

### I’m debating between Haskell and Clojure... (xPost r/Clojure)

I'm an experienced OO programmer (Java, some C#, less Ruby) considering jumping into the FP world. Some problem spaces I'm dealing with seem better suited to that approach. I'm also a big fan of the GOOS book, and want to push some of those concepts further.

I'm debating between Haskell and Clojure as my jumping-off point. My main criteria are good community, tool support, and a language with an opinion (I'm looking at you, Scala and JavaScript).

Other than serendipity, what made you choose Haskell over others, especially Clojure?

Why should I choose Haskell?

submitted by spatchcoq[link] [81 comments]

### Haskell and Mapping Real Numbers

I've been reading "What Every Computer Scientist Should Know About Floating-Point Arithmetic" https://docs.oracle.com/cd/E19957-01/806-3568/ncg_goldberg.html#680 and I noticed that Haskell uses the IEEE standard, like most languages:

```haskell
> 0.1 + 0.2
0.30000000000000004
```

What I'm wondering now, though, is how feasible it is to model real-number arithmetic precisely in a programming language. I always thought Haskell would be really good at this, but I guess not. Are there any languages that do real arithmetic accurately/precisely? If it's impossible, how close are computer scientists currently, and is there anything out there that I can read to get caught up on the subject?
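(Editor's note: for the rational subset of the reals, Haskell is already exact: arithmetic on `Rational` never rounds. A small sketch, with names of my own choosing:)

```haskell
import Data.Ratio ((%))

-- Exact arithmetic on ratios of integers: no IEEE rounding anywhere.
tenth :: Rational
tenth = 1 % 10

exactSum :: Rational
exactSum = tenth + tenth + tenth  -- exactly 3 % 10, not 0.30000000000000004
```

`Rational` only covers ratios of integers, though; representing genuinely irrational values needs a computable-real representation, an area usually called "exact real arithmetic".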

submitted by alphonse23[link] [16 comments]

### Is this a compiler bug?

GIST here:

https://gist.github.com/philipnilsson/7b0974e4a8094c04efb3

Basically, I have a program that implements a small DSL w/ assignments. It's parametrised on its type (typ) via GADTs.

```haskell
type RHS typ = String

data Exp typ where
  ...

data Statement = forall typ. Assign (RHS typ) (Exp typ)
```

The idea is that assignment should be type checked by matching the typ parameter, but once it is, it should be forgotten so that I can combine statements w/ different types.

Strangely this type checks

```haskell
exp = ... :: Exp String
rhs = "foo" :: RHS ()
st  = Assign rhs exp
```

However, changing RHS into a data definition instead of a type synonym makes it fail to compile.

```haskell
data RHS typ = RHS String

rhs = RHS "foo"
st  = Assign rhs exp
```

Is this correct? If so, why? Only extension is GADTs.
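(Editor's note: this looks like expected behaviour rather than a bug. A type synonym is expanded before type checking, so its parameter is discarded entirely; a data declaration's parameter is a real index. A self-contained sketch of the difference, with names abridged from the gist:)

```haskell
{-# LANGUAGE GADTs #-}

-- With a synonym, SynRHS () and SynRHS Bool are both literally String,
-- so the phantom typ can unify with whatever the Exp demands.
type SynRHS typ = String

synOk :: SynRHS ()
synOk = "foo"  -- fine: SynRHS () *is* String

-- With a data declaration, the parameter must genuinely match.
data DataRHS typ = DataRHS String

data Exp typ where
  BoolE :: Bool -> Exp Bool

data Statement where
  Assign :: DataRHS typ -> Exp typ -> Statement

ok :: Statement
ok = Assign (DataRHS "foo" :: DataRHS Bool) (BoolE True)
-- Assign (DataRHS "foo" :: DataRHS ()) (BoolE True)  -- would be rejected

rhsName :: Statement -> String
rhsName (Assign (DataRHS s) _) = s
```

Here GADT syntax is used for Statement so that GADTs really is the only extension needed.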

Just tried to report a bug on the GHC trac, but can't get my verification email.

submitted by ueberbobo[link] [3 comments]

### I made a reverse polish notation calculator, give it a look if you want and tell me how to make it better

### Danny Gratzer: Functors and Recursion

One of the common pieces of folklore in the functional programming community is how one can cleanly formulate recursive types with category theory. Indeed, using a few simple notions we can build a coherent enough explanation to derive some concrete benefits.

In this post I’ll outline how one thinks of recursive types and then we’ll discuss some of the practical ramifications of such thoughts.

Precursor

I'm assuming the reader is familiar with some basic notions from category theory, specifically the definitions of categories and functors.

Let’s talk about endofunctors, which are functors whose domain and codomain are the same. *spoiler: These are the ones we care about in Haskell*. An interesting notion that comes from endofunctors is that of algebras. An algebra in this sense is a pair of an object C, and a map F C → C. Here F is called the “signature” and C is called the carrier.
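As a concrete sketch in Haskell (my example, not the post's): take the signature functor F below with carrier Int; the algebra is just a map F Int → Int interpreting F's symbols over Int.

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- Signature functor: one nullary symbol and one unary symbol.
data F a = Zero | Succ a deriving Functor

-- An F-algebra with carrier Int: interpret Zero as 0, Succ as (+ 1).
alg :: F Int -> Int
alg Zero     = 0
alg (Succ n) = n + 1
```

A different carrier with a different map, say `F String -> String`, would be another algebra for the same signature.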

If you're curious about these funny terms: in abstract algebra we deal with algebras comprised of a set of distinguished elements, functions, and axioms, called the signature. From there we look at sets (called carriers) which satisfy the specification. We can actually cleverly rearrange the specification for something like a group into an endofunctor! It's out of scope for this post, but interesting if algebras are your thing.

Now we can in fact define a category of F-algebras. In such a category an object is an algebra α : F A → A and each arrow is a triplet:

- A normal arrow f : A → B
- An F-algebra α : F A → A
- Another F-algebra β : F B → B

So that f ∘ α = β ∘ F f. In picture form

```
        F f
  F A ————————→ F B
   |             |
   α             β
   ↓             ↓
   A  ————————→  B
         f
```

commutes. I generally elide the fact that we're dealing with triplets and instead focus on the arrow, since that's the interesting bit.

Now that we've established F-algebras, there's one more concept we need: the notion of initial objects. An initial object is an object I in a category such that for any object C there is an arrow

```
        f
  I - - - - - → C
```

and this f is unique.

Now what we’re interested in investigating is the initial object in the category of F-algebras. That’d mean that

```
        α
  F I ————————→ I
   |            |
  F λ           λ
   ↓            ↓
  F C ————————→ C
```

commutes only for a unique λ, where the bottom arrow is any other F-algebra F C → C.

A List is just an Initial Object in the Category of F-Algebras. *What's the problem?*

Now, remembering that we’re actually trying to understand recursive types, how can we fit the two together? We can think of recursive types as solutions to certain equations. In fact, our types are what are called the *least fixed point* solutions. Let’s say we’re looking at IntList. We can imagine it defined as

```haskell
data IntList = Cons Int IntList | Nil
```

We can in fact factor out the recursive call in Cons and get

```haskell
data IntList a = Cons Int a | Nil deriving Functor
```

Now we can represent a list of length 3 as something like

```haskell
type ThreeList = IntList (IntList (IntList Void))
```

Which is all well and good, but we really want arbitrary-length lists. We want a solution to the equation

```
X = IntList X
```

We can view such a type as the set {EmptyList, OneList, TwoList, ThreeList, ...}. Now how can we actually go about saying this? Well, we need to take a fixed point of the equation! This is easy enough in Haskell since Haskell's type system is unsound.

```haskell
-- Somewhere, somehow, a domain theorist is crying.
data FixedPoint f = Fix { unfix :: f (FixedPoint f) }
```

Now we can regain our normal representation of lists with

```haskell
type List = FixedPoint IntList
```

To see how this works:

```haskell
out :: FixedPoint IntList -> [Int]
out (Fix f) = case fmap out f of
  Nil      -> []
  Cons a b -> a : b

-- "in" is a keyword in Haskell, so the other direction is called "into" here
into :: [Int] -> FixedPoint IntList
into []       = Fix Nil
into (x : xs) = Fix (Cons x (into xs))
```

Now this transformation is interesting for one reason in particular: IntList is a functor. Because of this, we can formulate an F-algebra for IntList.

```haskell
type ListAlg a = IntList a -> a
```

Now we consider what the initial object in this category would be. It'd be something I so that we have a function

```haskell
cata :: ListAlg a -> (I -> a)  -- remember that I -> a is an arrow in F-Alg
cata :: (IntList a -> a) -> I -> a
cata :: (Either () (a, Int) -> a) -> I -> a
cata :: (() -> a) -> ((a, Int) -> a) -> I -> a
cata :: a -> (Int -> a -> a) -> I -> a
cata :: (Int -> a -> a) -> a -> I -> a
```

Now that looks sort of familiar. What's the type of foldr again?

```haskell
foldr :: (a -> b -> b) -> b -> [a] -> b
foldr :: (Int -> a -> a) -> a -> [Int] -> a
```

So the arrow we get from the initiality of I is precisely the same as foldr! This leads us to believe that maybe the initial object for F-algebras in Haskell is just the least fixed point, just as [Int] is the least fixed point for IntList.

To confirm this, let’s generalize a few of our definitions from before

```haskell
type Alg f a = f a -> a

data Fix f = Fix { unfix :: f (Fix f) }

type Init f = Alg f (Fix f)

cata :: Functor f => Alg f a -> Fix f -> a
cata f = f . fmap (cata f) . unfix
```

*Exercise: draw out the reduction tree for cata on lists.*

Our suspicion is confirmed: the fixed point of a functor is indeed the initial object. Furthermore, we can easily show that initial objects are unique up to isomorphism (exercise!), so anything that can implement cata is isomorphic to the original, recursive definition we were interested in.
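To make that concrete, here is the generalized cata collapsing a `Fix IntList` with a summing algebra (a sketch restating the post's definitions so it stands alone):

```haskell
{-# LANGUAGE DeriveFunctor #-}

data IntList a = Cons Int a | Nil deriving Functor
data Fix f = Fix { unfix :: f (Fix f) }

cata :: Functor f => (f a -> a) -> Fix f -> a
cata f = f . fmap (cata f) . unfix

-- An algebra summing one layer, given the already-summed tail.
sumAlg :: IntList Int -> Int
sumAlg Nil        = 0
sumAlg (Cons x s) = x + s

example :: Int
example = cata sumAlg (Fix (Cons 1 (Fix (Cons 2 (Fix Nil)))))  -- 3
```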

When The Dust Settles

Now that we've gone and determined a potentially interesting fact about recursive types, how can we use this knowledge? Well, let's start with a few things. First is that we can define a truly generic fold function:

```haskell
fold :: Functor f => (f a -> a) -> Fix f -> a
```

This delegates all the messy details of how one actually handles the "shape" of the container we're folding across to the collapsing function f a -> a.

While this may seem like a small accomplishment, it does mean that we can build off it to create data type generic programs that can be fitted into our existing world.

For example, what about mutual recursion? Fold captures the notion of recurring across one list in a rather slick way; however, computing two mutually dependent results in lockstep involves a call to zip and other fun and games. How can we capture this with cata?

We’d imagine that the folding functions for such a scenario would have the type

```haskell
f (a, b) -> a
f (a, b) -> b
```

From here we can build

```haskell
muto :: Functor f => (f (a, b) -> a) -> (f (a, b) -> b) -> Fix f -> (a, b)
muto f g = cata ((,) <$> f <*> g)
```

Similarly we can build up oodles of combinators for dealing with folding, all built on top of cata!
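A quick sketch of muto in action, restated self-contained: computing the sum and length of a `Fix IntList` in a single pass, where each algebra can see the other's partial result.

```haskell
{-# LANGUAGE DeriveFunctor #-}

data IntList a = Cons Int a | Nil deriving Functor
data Fix f = Fix { unfix :: f (Fix f) }

cata :: Functor f => (f a -> a) -> Fix f -> a
cata f = f . fmap (cata f) . unfix

-- (,) <$> f <*> g is the reader applicative: \x -> (f x, g x)
muto :: Functor f => (f (a, b) -> a) -> (f (a, b) -> b) -> Fix f -> (a, b)
muto f g = cata ((,) <$> f <*> g)

sumAlg :: IntList (Int, Int) -> Int
sumAlg Nil             = 0
sumAlg (Cons x (s, _)) = x + s

lenAlg :: IntList (Int, Int) -> Int
lenAlg Nil             = 0
lenAlg (Cons _ (_, n)) = n + 1

sumAndLength :: Fix IntList -> (Int, Int)
sumAndLength = muto sumAlg lenAlg
```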

That unfortunately sounds like a lot of work! Thanks to Hackage, though, we can shamelessly freeload off the hard work of others. In particular, the package recursion-schemes has built up a nice little library for dealing with initial algebras. There's only one big twist between what we've laid out and what it does.

One of the bigger stumbling blocks for our library was changing the nice recursive definition of a type into the functorfied version. Really it’s not realistic to write all your types this way. To help simplify the process recursion-schemes provides a type family called Base which takes a type and returns its functorfied version. We can imagine something like

```haskell
data instance Base [a] b = Cons a b | Nil
```

This simplifies the process of actually using all these combinators we're building. To use recursion-schemes, all you need to do is define such an instance and write project :: t -> Base t t. After that it's all kittens and recursion.
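The whole shape can be sketched self-contained, mirroring (rather than importing) the library; the class below is my own reconstruction and may not match the package's exact definitions:

```haskell
{-# LANGUAGE TypeFamilies, DeriveFunctor, FlexibleContexts #-}

-- Base maps a recursive type to its functorfied version.
data family Base t :: * -> *

-- project peels off exactly one layer of recursion.
class Functor (Base t) => Recursive t where
  project :: t -> Base t t

data instance Base [a] b = ConsF a b | NilF deriving Functor

instance Recursive [a] where
  project []       = NilF
  project (x : xs) = ConsF x xs

-- cata now works on the *ordinary* list type directly.
cata :: Recursive t => (Base t a -> a) -> t -> a
cata f = f . fmap (cata f) . project

sumList :: [Int] -> Int
sumList = cata alg
  where alg NilF        = 0
        alg (ConsF x s) = x + s
```

Note how no Fix wrapper appears in `sumList`'s type: project supplies the one-layer unwrapping that unfix did before.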

Wrap Up

So dear reader, where are we left? We've got a new interesting formulation of recursive types that yields some interesting results and power. There's one interesting chunk we've neglected though: what does unfolding look like?

It turns out there's a good story for this as well: unfolding is the operation (an anamorphism) defined by a terminal object, the precise dual of an initial one, in the category of F-coalgebras. You can see this all in recursion-schemes, which features ana as well as cata.
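A sketch of that dual story (my example): ana unfolds a seed into a Fix f using a coalgebra a -> f a, exactly cata with every arrow reversed.

```haskell
{-# LANGUAGE DeriveFunctor #-}

data IntList a = Cons Int a | Nil deriving Functor
data Fix f = Fix { unfix :: f (Fix f) }

-- Dual of cata: grow one layer from the seed, then recurse into it.
ana :: Functor f => (a -> f a) -> a -> Fix f
ana g = Fix . fmap (ana g) . g

-- Unfold n into the list [n, n-1, ..., 1].
countdown :: Int -> Fix IntList
countdown = ana step
  where step 0 = Nil
        step n = Cons n (n - 1)

toList :: Fix IntList -> [Int]
toList (Fix Nil)        = []
toList (Fix (Cons x r)) = x : toList r
```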

### Installing application dependencies using Stackage, sandboxes, and freezing

### Why does the presence/absence of the HsColour binary force recompilation of the QuickCheck library?

### Learning Haskell as a Nonprogrammer

### NOTE: the primary www.haskell.org server is going down for immediate maintenance.

Hello *,

The primary haskell.org domain, www.haskell.org, is hosted on a system which seems to have lost one of its RAID disks completely.

We were planning on moving this machine in the next few weeks to new infrastructure, but we are now expediting this plan and will be doing it ASAP.

As we move this server, both the webserver and the mailing system will be going down. Please don't be alarmed if your emails aren't delivered or things go quiet. Many services will continue to work, but we do realize this will be upsetting for many.

You can follow the progress on #haskell-infrastructure on Freenode, and see updates on https://status.haskell.org

If you need to download something like a GHC binary or Haskell Platform package, you can use https://downloads.haskell.org in the meantime. It's a new service we were hoping to announce more officially soon, but it's already working today.

Unfortunately we cannot give an expected time of completion for the move, but we'll try to keep people well informed through IRC or something like Reddit.

Thanks

submitted by aseipp[link] [3 comments]

### Facebook releases "Flow", a statically typed JavaScript variant

The goal of Flow is to find errors in JavaScript code with little programmer effort. Flow relies heavily on type inference to find type errors even when the program has not been annotated - it precisely tracks the types of variables as they flow through the program.

At the same time, Flow is a gradual type system. Any parts of your program that are dynamic in nature can easily bypass the type checker, so you can mix statically typed code with dynamic code.

Flow also supports a highly expressive type language. Flow types can express much more fine-grained distinctions than traditional type systems. For example, Flow helps you catch errors involving null, unlike most type systems.