# News aggregator

### ETAPS 2016 call for papers

### State of the Haskell ecosystem - August 2015

Interesting survey.

Based on a brief look, I am not sure I agree with all the conclusions/rankings, but most seem to make sense, and the Notable Libraries and examples in each category are helpful.

### ANN: react-flux initial release

I am announcing the initial release of react-flux. It is a GHCJS package for React based on the Flux design.

I spent some effort writing good haddocks, so the haddock documentation is the best place to learn the library. There is also a TODO example application.

It differs significantly from the other two React bindings, react-haskell and ghcjs-react. The major difference is how events are handled. In the Flux design, the state is moved out of the view, and handlers produce actions which transform the state; thus there is a one-way flow of data from the store into the view. In contrast, react-haskell and ghcjs-react both have event signals propagating up the React component tree, transforming state at each node. In particular, react-haskell, with its InSig and OutSig, has the signals propagate up the tree, optionally transforming state at each node and changing the type of the signal.

I have had success in the past with the Flux design in JavaScript, and wanted to bring it to GHCJS. At first I tried to work with or slightly modify react-haskell, but the design difference is too fundamental. I then tried to at least share code with react-haskell, but there is unfortunately nothing that can be shared. The element creation, class definition, and event handlers are all significantly different because of the difference in how events are handled. Therefore, I made it a separate package.
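The one-way flow described above can be sketched in a few lines of plain Haskell. This is a generic illustration of the Flux pattern only, not react-flux's actual API (the store, action, and view names here are made up; see the haddocks for the real interface):

```haskell
-- A generic sketch of the Flux pattern, not react-flux's actual API:
-- state lives in a store, and the only way to change it is to
-- dispatch an action that the store's transform function interprets.
data TodoAction
  = AddTodo String
  | ClearTodos

newtype TodoStore = TodoStore { todos :: [String] }
  deriving Show

-- All state transitions go through one pure function.
transform :: TodoAction -> TodoStore -> TodoStore
transform (AddTodo t) (TodoStore ts) = TodoStore (t : ts)
transform ClearTodos  _              = TodoStore []

-- The view is a pure function of the store: it never mutates state;
-- in a real binding it only emits actions back to the dispatcher.
render :: TodoStore -> String
render (TodoStore ts) = unlines ts

main :: IO ()
main = putStr . render $
  foldl (flip transform) (TodoStore [])
        [AddTodo "write haddocks", AddTodo "ship it"]
```

The point of the design is that `render` can never mutate the store; every state change is forced through `transform`, which is what makes the data flow one-way.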

submitted by wuzzeb [link] [4 comments]

### ANN: ghc-mod-5.3.0.0

### Show Haskell: Python Dependency Graphing

Hello Haskellers,

I've been reading this sub and introductory Haskell materials for a while, and finally decided to try to *actually learn* the language by building something interesting in it.

I work as a Python/Django developer, and one of my frustrations when dealing with legacy code is circular dependencies, which, to my thinking, represent broader architectural problems. For this reason, I'd been thinking about building an application that graphs Python dependencies, and I decided to do it in Haskell in order to combine various experimental activities into one project:

- Graphs in general, and graphs in a functional language
- Parsing of Python source
- File-system stuff, including locating files and directories

It's still a work-in-progress and has some issues, and some of my goals have not yet been realized, but here's what I've got so far: https://github.com/pellagic-puffbomb/haskpy-dependency-graphs.
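As a taste of the graph side of such a project, cycle detection falls out of `Data.Graph` (from the containers package) almost for free. A minimal sketch, assuming the modules and their imports have already been extracted; the module names below are hypothetical and this is not code from the linked repository:

```haskell
import Data.Graph (SCC (..), stronglyConnComp)

-- Hypothetical extracted data: each module paired with the modules
-- it imports.  (Made-up names, for illustration only.)
type Module = String

modules :: [(Module, [Module])]
modules =
  [ ("app.views",  ["app.models"])
  , ("app.models", ["app.utils"])
  , ("app.utils",  ["app.views"])  -- closes a cycle
  , ("app.main",   ["app.views"])
  ]

-- Strongly connected components with more than one member are
-- exactly the circular-dependency groups.
cycles :: [[Module]]
cycles =
  [ ms | CyclicSCC ms <- stronglyConnComp
           [ (m, m, deps) | (m, deps) <- modules ] ]

main :: IO ()
main = print cycles  -- one cycle: app.views -> app.models -> app.utils
```

`stronglyConnComp` silently ignores edges to unknown keys, which is convenient here: imports of third-party modules you didn't scan just drop out.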

I found myself accumulating loads of questions, but maybe I'll just post the highlights here:

I tried to use if-then-else here and got the confusing message "ifThenElse not in scope". Isn't if-then-else built-in? (Errantly copying junk from cabal files without thinking about what it is...)
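For anyone else hitting this: the error is characteristic of the `RebindableSyntax` extension, which is easy to inherit by copying `default-extensions` from another project's cabal file. With it enabled, `if`/`then`/`else` desugars into a call to whatever `ifThenElse` is in scope, so you must define (or import) one yourself. A minimal sketch:

```haskell
{-# LANGUAGE RebindableSyntax #-}

-- RebindableSyntax implies NoImplicitPrelude, so import it explicitly.
import Prelude

-- With RebindableSyntax on, `if c then t else e` desugars to
-- `ifThenElse c t e`, using whatever `ifThenElse` is in scope:
ifThenElse :: Bool -> a -> a -> a
ifThenElse True  t _ = t
ifThenElse False _ e = e

classify :: Int -> String
classify n = if n > 0 then "positive" else "non-positive"

main :: IO ()
main = putStrLn (classify 3)  -- prints "positive"
```

The simpler fix, of course, is to delete `RebindableSyntax` from the cabal file if you never meant to enable it.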

I wrote this chunk and then afterward had the thought that there's an issue of context here that monads may solve, but I couldn't really piece it together. If I made my datatype an instance of Monad, is it possible that I could write this chunk in a simpler way?

I will happily take any other comments you have. Thanks for checking it out.

submitted by erewok [link] [13 comments]

### oddsFrom3 function

### Flycheck (emacs) now supports stack

### What are Haskellers' critiques of PHP?

Not really expecting a variety of opinions (if any) on this one, to be honest.

submitted by zarandysofia [link] [26 comments]

### Mark Jason Dominus: A message to the aliens, part 4/23 (algebra)

Earlier articles: Introduction, Common features, Page 1 (numerals), Page 2 (arithmetic), Page 3 (exponents).

This is page 4 of the *Cosmic Call*
message. An explanation follows.

Reminder: page 1 explained the glyphs for the ten digits, 0 through 9, and the equal sign. Page 2 explained the four basic arithmetic operations and some associated notions: addition, subtraction, multiplication, division, negation, ellipsis (…), decimal point, and indeterminate.

This page, headed with the glyph for “mathematics”, describes the solution of simple algebraic equations and defines glyphs for three variables, which we may as well call x, y, and z.

Each equation is introduced by the locution which means “solve for ”. This somewhat peculiar “solve” glyph will not appear again until page 23.

For example the second equation is :

**Solve for :**

The solution, 6, is given over on the right:

After the fourth line, the equations to be solved change from simple numerical equations in one variable to more abstract algebraic relations between three variables. For example, if

**Solve for :**

then

.

The next-to-last line uses a decimal fraction in the exponent, : . On the previous page, the rational fraction was used. Had the same style been followed, it would have looked like this: .

Finally, the last line defines and then, instead of an algebraic solution, gives a graph of the resulting relation, with axes labeled. The scale on the axes is not the same; the -coordinate increases from 0 to 20 pixels, but the -coordinate increases from 0 to 8000 pixels because . If the axes were drawn to the same scale, the curve would go up by 8,000 pixels. Notice that the curve does not peek above the -axis until around or so. The authors could have stated that this was the graph of , but chose not to.

I also wonder what the aliens will make of the arrows on the axes. I think the authors want to show that our coordinates increase going up and to the left, but this seems like a strange and opaque way to do that. A better choice would have been to use a function with an asymmetric graph, such as .

(After I wrote that I learned that similar concerns were voiced about the use of a directional arrow in the Pioneer plaque.)

(Wikipedia says: “An article in Scientific American criticized the use of an arrow because arrows are an artifact of hunter-gatherer societies like those on Earth; finders with a different cultural heritage may find the arrow symbol meaningless.”)

The next article will discuss page 5, shown at right. Try to figure it out before then.

### Using Commutative Assessments to Compare Conceptual Understanding in Blocks-based and Text-based Programs

Using Commutative Assessments to Compare Conceptual Understanding in Blocks-based and Text-based Programs, David Weintrop, Uri Wilensky. Proceedings of the eleventh annual International Conference on International Computing Education Research. Via Computing Education Blog.

Blocks-based programming environments are becoming increasingly common in introductory programming courses, but to date, little comparative work has been done to understand if and how this approach affects students' emerging understanding of fundamental programming concepts. In an effort to understand how tools like Scratch and Blockly differ from more conventional text-based introductory programming languages with respect to conceptual understanding, we developed a set of "commutative" assessments. Each multiple-choice question on the assessment includes a short program that can be displayed in either a blocks- based or text-based form. The set of potential answers for each question includes the correct answer along with choices informed by prior research on novice programming misconceptions. In this paper we introduce the Commutative Assessment, discuss the theoretical and practical motivations for the assessment, and present findings from a study that used the assessment. The study had 90 high school students take the assessment at three points over the course of the first ten weeks of an introduction to programming course, alternating the modality (blocks vs. text) for each question over the course of the three administrations of the assessment. Our analysis reveals differences on performance between blocks-based and text-based questions as well as differences in the frequency of misconceptions based on the modality. Future work, potential implications, and limitations of these findings are also discussed.

### Compiling Haskell with profiling enabled

I'm trying to do some Haskell profiling. My program has both dynamic and static parts and I need both.

When I try to compile, I get errors stating that my base libraries don't have profiling enabled.

So, I went ahead and compiled GHC from source, with

```
GhcLibWays += dyn v p_dyn
```

Now, when I install cabal-install via the bootstrap (which also required some modification), I get errors in the style of:

```
Distribution/Utils/NubList.hs:18:18:
    Could not find module ‘Text.Read’
    Perhaps you haven't installed the "p_dyn" libraries for package ‘base’?
    Locations searched:
      dist/build/Text/Read.hs
      dist/build/Text/Read.lhs
      Text/Read.hs
      Text/Read.lhs
      dist/build/autogen/Text/Read.hs
      dist/build/autogen/Text/Read.lhs
      /usr/local/prerequisites/ghc-7.8.4-gcc-4.9.1/lib/ghc-7.8.4/base-4.7.0.2/Text/Read.p_dyn_hi
```

Looking at the directory listing for this example, I see:

```
ParserCombinators  Printf.dyn_hi  Printf.p_hi  Read  Read.dyn_hi  Read.p_hi  Show  Show.dyn_hi  Show.p_hi
```

So profiling seems to have been enabled. Does anyone know how to go about solving this issue? Does anyone have experience with profiling a dynamic Haskell program/library?

**Update**: It seems like the files **DO exist**, but they weren't copied over during the `make install` stage.

Should I just copy them over manually? Is the install broken?

submitted by ash286 [link] [12 comments]

### Swedish spin-off has openings for advanced functional programmers (Haskell, OCaml, Scala, and more). Functor, the Swedish Martin-Löf type theory spin-off, is recruiting right now, with interviews already beginning. Is it for you?

### Brandon Simmons: Announcing: Hashabler 1.0. Now even more hashy with SipHash

I’ve just released version 1.0 of a Haskell library for principled, cross-platform & extensible hashing of types. It is available on Hackage, and can be installed with:

```
cabal install hashabler
```

(See my initial announcement post, which has some motivation and pretty pictures.)

You can see the CHANGELOG, but the main change is an implementation of SipHash. It’s about as fast as our implementation of FNV-1a for bytestrings of length fifty, and slightly faster once you get to length 1000 or so, so you should use it unless you want a hash with a simple implementation.
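For comparison, FNV-1a itself is about as simple as a hash gets, which is the trade-off mentioned above. Here is a generic 32-bit FNV-1a over a `String`, as an illustration of the algorithm only; it is not hashabler's implementation, which works over bytes and supports multiple widths:

```haskell
import Data.Bits (xor)
import Data.Char (ord)
import Data.Word (Word32)

-- 32-bit FNV-1a: start from the offset basis, then for each byte
-- XOR it in and multiply by the FNV prime.  Illustration only;
-- this treats each Char as a single byte, so it is ASCII-only.
fnv1a32 :: String -> Word32
fnv1a32 = foldl step 2166136261                           -- offset basis
  where
    step h c = (h `xor` fromIntegral (ord c)) * 16777619  -- FNV prime

main :: IO ()
main = print (fnv1a32 "a")  -- prints 3826002220 (0xe40c292c)
```

The entire algorithm is one fold with an XOR and a multiply, which is why FNV-1a remains attractive when auditability matters more than resistance to hash-flooding (the problem SipHash addresses).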

If you’re implementing a new hashing algorithm or hash-based data structure, please consider using hashabler instead of hashable.