# News aggregator

### Loop School

### Doctoral Teaching Assistantships in CS at Oxford

### Why isn't anyone talking about optimal lambda calculus implementations?

In 1990, John Lamping proposed an algorithm for lambda calculus reduction that is optimal in Lévy's sense. As far as my understanding goes, it works by translating lambda calculus terms to interaction nets, reducing them, and translating back. Interaction nets are a computational model based on graph rewriting with interesting properties, such as computation being local and asynchronous (unlike, say, cellular automata, which are local but synchronous). They are very interesting in themselves, having universal combinators akin to SKI, connections to linear logic, biology, etc. That's not the point, though: what most of this means for functional programming is that Lamping's system is not only optimal in the sense that the minimal amount of duplication is guaranteed (making, for instance, lazy evaluation irrelevant) but is also very promising for exploiting parallelism.
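To make the duplication point concrete, here is a deliberately naive, substitution-based normal-order reducer (my own sketch, not Lamping's algorithm, using de Bruijn indices): every beta step copies the argument into each occurrence of the bound variable, which is exactly the work that sharing in an interaction net avoids.

```haskell
data Term = Var Int | Lam Term | App Term Term
  deriving (Eq, Show)

-- shift free variables >= c by d
shift :: Int -> Int -> Term -> Term
shift d c (Var k)   = Var (if k >= c then k + d else k)
shift d c (Lam t)   = Lam (shift d (c + 1) t)
shift d c (App a b) = App (shift d c a) (shift d c b)

-- substitute s for variable j; note the argument is copied into
-- every occurrence -- this is the duplication optimal reduction avoids
subst :: Int -> Term -> Term -> Term
subst j s (Var k)   = if k == j then s else Var k
subst j s (Lam t)   = Lam (subst (j + 1) (shift 1 0 s) t)
subst j s (App a b) = App (subst j s a) (subst j s b)

-- one leftmost-outermost (normal-order) beta step, if any
step :: Term -> Maybe Term
step (App (Lam t) s) = Just (shift (-1) 0 (subst 0 (shift 1 0 s) t))
step (App a b)       = case step a of
                         Just a' -> Just (App a' b)
                         Nothing -> App a <$> step b
step (Lam t)         = Lam <$> step t
step (Var _)         = Nothing

-- normalize, counting beta steps
normalize :: Term -> (Term, Int)
normalize t = go t 0
  where go u n = maybe (u, n) (\u' -> go u' (n + 1)) (step u)
```

For example, applying the Church numeral two, `Lam (Lam (App (Var 1) (App (Var 1) (Var 0))))`, to the identity takes three beta steps here because the argument is textually copied; Lamping's graphs share it instead.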

Yet, 25 years have passed and there is only one implementation of those ideas on Hackage, which is undocumented and, as far as my source-code-stalking goes, possibly incomplete, since I couldn't find a function to map from nets back to lambdas (it might be there, though).

Anyway, the question holds: why is there almost no interest in implementing those ideas and seeing how well they go in practice?

submitted by SrPeixinho

### Strongly typed data / API specification libraries

### hasql: A minimalistic general high level API for relational databases | Hackage


### Philip Wadler: Status Report 3

I have a date for keyhole surgery to remove my kidney: Tuesday 17 March. This fits what I was told previously, so all is on schedule. Recovery time is likely to be four weeks. I expect to be in hospital until Monday 23 March: visitors most welcome! Please call the Western General; the Level 4 office has my contact details.

My liver biopsy took place on Thursday 19 February; the hardest part was to lie still for six hours after. I had meetings with my infectious disease doctor on Wednesday 25 February and with my cardiologist on Monday 2 March. My endocarditis is clear, but will need to be monitored once a year. The biopsy shows a deposit of amyloid protein; experts are being consulted, but it has no effect on my surgery. My thanks to all my doctors and the staff at the Western General, who have been excellent.

**Related**: Status report, Status report 2, A paean to the Western General, Status report 4.

### apt-get install haskell-platform fails on ubuntu14.04

### Existentials for multi-parameter type classes?

### Proposal: parametrize tree stem container type in containers

### First order pattern matching.

Title edit: not first order, first class*

Is there any language extension that enables first-class pattern matching? For example:

```
((a,b) => a+b) (1,2) == 3
(x:xs => xs) [1,2,3] == [2,3]
```

In general,

```
(pattern => match)
```

equates to

```haskell
(\x -> let pattern = x in match)
```

submitted by Hadkell
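For comparison, GHC's `LambdaCase` extension already gets close to the proposed `pattern => match` form; `\case pattern -> match` desugars to `\x -> case x of pattern -> match`. A small sketch (the function names are mine):

```haskell
{-# LANGUAGE LambdaCase #-}

-- (a,b) => a+b, written with LambdaCase
addPair :: (Int, Int) -> Int
addPair = \case (a, b) -> a + b

-- x:xs => xs, with an extra clause so the function is total
dropHead :: [a] -> [a]
dropHead = \case
  (_:xs) -> xs
  []     -> []
```

Note one subtlety: the `case` desugaring fails immediately on a refutable pattern that doesn't match, whereas the `let`-based desugaring in the post binds the pattern lazily and only fails when a bound variable is forced.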

### Staggered zip, or fold with 2 elements at a time

If there is a better location to post this question, please let me know.

Suppose given

1. A finite list x = [x^1, x^2 .. x^n]
2. A pure binary function h, and pure unary functions f and g.

It is pretty easy to apply the functions to every element of x, with higher-order functions instead of recursion:

```haskell
aligned1 f g h x = (h <$> f <*> g) `map` x
aligned2 f g h x = (\p -> h (f p) (g p)) `map` x
```

Here, x is traversed one element at a time and the fused functions are applied to produce an element of the result. Sequential evaluation is not needed because each element of the result only depends on one element of x. Also, the higher-order function takes care of the raw recursion.
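Both spellings rely on the `Applicative` instance for functions `((->) r)`, under which `h <$> f <*> g` is `\p -> h (f p) (g p)`. A quick self-contained check, with throwaway example functions of my own:

```haskell
-- aligned1/aligned2 as in the post: each element p of the list
-- is mapped to h (f p) (g p).
aligned1, aligned2 :: (a -> b) -> (a -> c) -> (b -> c -> d) -> [a] -> [d]
aligned1 f g h x = (h <$> f <*> g) `map` x
aligned2 f g h x = (\p -> h (f p) (g p)) `map` x
```

With f = (*2), g = (+1), h = (+), both give [4,7,10] on [1,2,3]: each element p becomes 2*p + (p+1).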

Is there a way to achieve the same for the following?

```
y^1     = f x^1
y^(n+1) = g x^n
y^i     = h (f x^i) (g x^(i-1))
```

For example, if h = (+) and f, g :: a -> Int, the job would get done with:

```haskell
staggered1 f g x = zipWith (+) (map f x ++ [0]) (0 : map g x)
```

This code zips the following two lists with addition (+):

```
[ f x^1 , f x^2 .. f x^n     , 0     ]
[ 0     , g x^1 .. g x^(n-1) , g x^n ]
```

Alternatively, one could map f and g on x, and then use raw recursion to zip the two lists:

```haskell
staggered2 f g h x = y : inner ys zs
  where
    y : ys = map f x
    zs     = map g x
    inner [] (q:qs)     = q : inner [] qs
    inner (p:ps) (q:qs) = h p q : inner ps qs
    inner _ _           = []
```

Or, an accumulator variable could be used to store the previous x or (f x):

```haskell
staggered3a f g h (x:xs) = f x : inner x xs
  where
    inner acc []     = g acc : []
    inner acc (y:ys) = h (f y) (g acc) : inner y ys

staggered3b f g h (x:xs) = f x : inner (g x) xs
  where
    inner acc []     = acc : []
    inner acc (y:ys) = h (f y) acc : inner (g y) ys
```

More generally, this accumulator could even be a queue to store several consecutive elements at once...

Is there an elegant way to zip staggered maps on the same list, or fold consecutive elements at a time? The goal is to traverse x only once, avoid raw recursion with higher order functions, preserve the local dependence on consecutive elements of x, and avoid forcing sequential execution.

Are there existing libraries that do this? Maybe a generalization of this concept?
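For what it's worth, one way to get most of these goals (a single conceptual zip, no raw recursion, and no need for a unit element like the 0 in staggered1) is to pad the two mapped lists with `Nothing` instead of a zero. This is my own sketch, not a function from an existing library:

```haskell
-- Staggered zip via Maybe-padding: the first output is f x^1, the
-- last is g x^n, and interior elements are h (f x^i) (g x^(i-1)).
-- Laziness fuses the two maps and the zip into one streaming pass.
staggeredMaybe :: (b -> b -> b) -> (a -> b) -> (a -> b) -> [a] -> [b]
staggeredMaybe h f g xs =
  zipWith combine (map (Just . f) xs ++ [Nothing])
                  (Nothing : map (Just . g) xs)
  where
    combine (Just a) (Just b) = h a b
    combine (Just a) Nothing  = a
    combine Nothing  (Just b) = b
    combine Nothing  Nothing  = error "unreachable: both lists have length n+1"
```

For example, `staggeredMaybe (+) (*2) (+1) [1,2,3]` is `[2,6,9,4]`, matching the specification above without requiring h to have a unit.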

BONUS (pat on the back) for applicative style.

PS. In my context, (f x^i) is the remainder from applying (g x^i), and vice versa. So, in theory, they could be calculated at the same time. I haven't found a way to do this yet, but that's a whole other question :)

submitted by newestHaskeller

### Mark Jason Dominus: Rectangles with equal area and perimeter

Wednesday while my 10-year-old daughter Katara was doing her math homework, she observed with pleasure that a rectangle has a perimeter of 18 units and also an area of 18 square units. I mentioned that there was an infinite family of such rectangles, and, after a small amount of tinkering, that the only other such rectangle with integer sides is a square, so in a sense she had found the single interesting example. She was very interested in how I knew this, and I promised to show her how to figure it out once she finished her homework. She didn't finish before bedtime, so we came back to it the following evening.

This is just one of many examples of how she has way too much homework, and how it interferes with her education.

She had already remarked that she knew how to write an equation expressing the condition she wanted, so I asked her to do that; she wrote $$(L×W) = ([L+W]×2).$$ I remember being her age and using all different shapes of parentheses too. I suggested that she should solve the equation for $W$, getting $W$ on one side and a bunch of stuff involving $L$ on the other, but she wasn't sure how to do it, so I offered suggestions while she moved the symbols around, eventually obtaining $$W = 2L\div (L-2).$$ I would have written it as a fraction, but getting the right answer is important, and using the same notation I would use is much less so, so I didn't say anything.

I asked her to plug in $L=3$ and observe that $W=6$ popped right out, and then similarly that $L=6$ yields $W=3$, and then I asked her to try the other example she knew, $L=W=4$. Then I suggested that she see what $L=5$ did: it gives $W=\frac{10}{3}$. This was new, so she checked it by calculating the area and the perimeter, both $\frac{50}{3}$. She was very excited by this time. As I have mentioned earlier, algebra is magical in its ability to mechanically yield answers to all sorts of questions. Even after thirty years I find it astonishing and delightful. You set up the equations, push the symbols around, and all sorts of stuff pops out like magic. Calculus is somehow much less astonishing; the machinery is all explicit. But how does algebra work? I've been thinking about this on and off for a long time and I'm still not sure.

At that point I took over because I didn't think I would be able to guide her through the next part of the problem without a demonstration; I wanted to graph the function and she does not have much experience with that. She put in the five points we already knew, which already lie on a nice little curve, and then she asked an incisive question: does it level off, or does it keep going down, or what? We discussed what happens when $L$ gets close to 2; then $W$ shoots up to infinity. And when $L$ gets big, say a million, you can see from the algebra that $W$ is a hair more than 2. So I drew in the asymptotes on the hyperbola.

Katara is not yet familiar with hyperbolas. (She has known about parabolas since she was tiny. I have a very fond memory of visiting Portland with her when she was almost two, and we entered Holladay park, which has fountains that squirt out of the ground. Seeing the water arching up before her, she cried delightedly “parabolas!”)

Once you know how the graph behaves, it is a simple matter to see that
there are no integer solutions other than $\langle 3,6\rangle$ and
$\langle 4,4\rangle$. We know that $L=5$ does not work. For $L>6$ the
value of $W$ is always strictly between $2$ and $3$. For $L=2$ there is
no value of $W$ that works at all. For $L<2$ the formula says that $W$
is negative, on the other branch of the hyperbola, which is a perfectly
good *numerical* solution (for example, $\langle 1,-2\rangle$) but makes
no sense as the width of a rectangle. So it was a good lesson about how
mathematical modeling sometimes introduces solutions that are wrong, and
how you have to translate the solutions back to the original problem to
see if they make sense.

[ Addendum 20150330: Thanks to Steve Hastings for his plot of the hyperbola, which is in the public domain. ]
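The whole argument is easy to spot-check by brute force. A quick Haskell sketch of my own (the search bound of 100 is arbitrary; the hyperbola argument shows nothing beyond it can work, since $W$ stays strictly between 2 and 3 for large $L$):

```haskell
-- Integer-sided rectangles (with l <= w to avoid mirror duplicates)
-- whose area equals their perimeter: l*w == 2*(l+w).
equalRect :: [(Int, Int)]
equalRect = [ (l, w) | l <- [1 .. 100], w <- [l .. 100], l * w == 2 * (l + w) ]
```

Evaluating `equalRect` yields `[(3,6),(4,4)]`: the 3-by-6 rectangle (area and perimeter 18) and the 4-by-4 square (area and perimeter 16).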

### ANN: Magic Cookies! Android board game in Haskell

### Noam Lewis: A simple bash script for monitoring the output of a running process

I got tired of my screen getting flooded with text when all I wanted was to see the last output line **at any given time**. What I really want is to just see one output line and have it replaced with some newer text when it’s available. So I wrote a script (in, ahem, bash) that lets you do exactly this. Call it “tailor.sh” or whatever you want:

(If the code below is screwed up, see it here on gist)

```bash
#!/bin/bash -eu
LOG_FILE=$1
SB="stdbuf -i0 -oL"
shift
tput sc
$@ 2>&1 \
    | $SB tee $LOG_FILE \
    | $SB cut -c-$(tput cols) \
    | $SB sed -u 's/\(.\)/\\\1/g' \
    | $SB xargs -0 -d'\n' -iyosi -n1 bash -c 'tput rc;tput el; printf "\r%s" yosi'
EXIT_CODE=${PIPESTATUS[0]}
tput rc;tput el;printf "\r" # Delete the last printed line
exit $EXIT_CODE
```

I use it in a test env bringup script to run some builds. Instead of having to lose all the context of the wrapping script, I now see just a single output line of the invoked builds at any given time.