News aggregator

Weird abuse of lexer and the type system (beginner)

Haskell on Reddit - Fri, 12/05/2014 - 1:12pm

I'm still just a novice, so what's weird to me may be trivial to another, but I was completely dumbfounded when this code ran as expected:

(+--) :: Num a => a -> a -> a -> a
(x +-- y) r = x - r * (x - y)

sample :: (Num a, Show a) => (a, a) -> a -> String
sample seg r = concat [show r, ":\t", show $ uncurry (+--) seg r]

main :: IO ()
main = mapM_ (putStrLn . sample (2, 8)) [-1.0, -0.75 .. 2.0]

-- output:
-- -1.0:  -4.0
-- -0.75: -2.5
-- -0.5:  -1.0
-- -0.25: 0.5
-- 0.0:   2.0
-- 0.25:  3.5
-- 0.5:   5.0
-- 0.75:  6.5
-- 1.0:   8.0
-- 1.25:  9.5
-- 1.5:   11.0
-- 1.75:  12.5
-- 2.0:   14.0

What's so weird? Well, -- appears outside of a string literal but does not start a comment. And I guess it's not surprising that (+--) can take three parameters, since it's really a function a -> b -> c, where c happens to be a -> a. But it still feels really weird to me.
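The reason is the Haskell report's maximal-munch lexing rule: a run of two or more dashes only starts a line comment when it is not followed by a further operator symbol, so +-- lexes as a single operator name. A tiny illustration (the --> operator here is my own, not from the post):

```haskell
-- "-->" is one operator: the dashes are followed by '>', so no comment starts.
-- "-- like this", with whitespace after the dashes, does start a comment.
(-->) :: Bool -> Bool -> Bool
p --> q = not p || q

main :: IO ()
main = print (True --> False)  -- prints False
```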

submitted by lynko
[link] [3 comments]
Categories: Incoming News

Functional Jobs: Senior Software Engineer at McGraw-Hill Education (Full-time)

Planet Haskell - Fri, 12/05/2014 - 9:43am

This Senior Software Engineer position is with the new LearnSmart team at McGraw-Hill Education's new and growing Research & Development center in Boston's Innovation District.

We make software that helps college students study smarter, earn better grades, and retain more knowledge.

The LearnSmart adaptive engine powers the products in our LearnSmart Advantage suite — LearnSmart, SmartBook, LearnSmart Achieve, LearnSmart Prep, and LearnSmart Labs. These products provide a personalized learning path that continuously adapts course content based on a student’s current knowledge and confidence level.

On our team, you'll get to:

  • Move textbooks and learning into the digital era
  • Create software used by millions of students
  • Advance the state of the art in adaptive learning technology
  • Make a real difference in education

Our team's products are built with Flow, a functional language in the ML family. Flow lets us write code once and deliver it to students on multiple platforms and device types. Other languages in our development ecosystem include JavaScript especially, but also C++, SWF (Flash), and Haxe.

If you're interested in functional languages like Scala, Swift, Erlang, Clojure, F#, Lisp, Haskell, and OCaml, then you'll enjoy learning Flow. We don't require that you have previous experience with functional programming, only enthusiasm for learning it. But if you do have some experience with functional languages, so much the better! (On-the-job experience is best, but coursework, personal projects, and open-source contributions count too.)

We require only that you:

  • Have a solid grasp of CS fundamentals (languages, algorithms, and data structures)
  • Be comfortable moving between multiple programming languages
  • Be comfortable with modern software practices: version control (Git), test-driven development, continuous integration, Agile

Get information on how to apply for this position.

Categories: Offsite Blogs

Philip Wadler: Functional Programming is the New Black

Planet Haskell - Fri, 12/05/2014 - 6:24am
Reading the conclusion of my friend Maurice Naftalin's new book on Java 8 brought home to me the extent to which Functional Programming is, indeed, the new black. You can read it here.

Maurice Naftalin, Mastering Lambdas (Conclusion), McGraw-Hill, 2014.
Other links: Java 8 Lambda FAQ, Mastering Lambdas.
Categories: Offsite Blogs

Well-Typed.Com: Compose conference and New York City Haskell courses

Planet Haskell - Fri, 12/05/2014 - 5:56am

Well-Typed is happy to announce that we are sponsoring

C◦mp◦se conference

Friday, January 30 – Sunday, February 1, 2015, New York City

This conference is focused on functional programming and features a keynote by Stephanie Weirich on dependent types as well as invited talks by Anthony Cowley, Maxime Ransan and Don Syme, plus a whole lot of additional contributed talks. There’s also an “unconference” with small workshops and tutorials as well as the opportunity to get your hands dirty and try things out yourself.

For several years now, we have been running successful Haskell courses in collaboration with Skills Matter. For the C◦mp◦se conference, we have decided to bring these courses to New York! You can participate in our Haskell courses directly before or directly after the conference (or both):

Fast Track to Haskell

Thursday, January 29 – Friday, January 30, 2015, New York City

(Don’t worry, there’s no overlap with Stephanie Weirich’s keynote on Friday evening that marks the start of C◦mp◦se.)

You can register here.

This course is for developers who want to learn about functional programming in general or Haskell in particular. It introduces important concepts such as algebraic datatypes, pattern matching, type inference, polymorphism, higher-order functions, explicit effects and, of course, monads and provides a compact tour with lots of hands-on exercises that provide a solid foundation for further adventures into Haskell or functional programming.

Advanced Haskell

Monday, February 2 – Tuesday, February 3, 2015, New York City

You can register here.

This course is for developers who have some experience in Haskell and want to know how to work on larger projects and how things scale. The course covers important topics such as selecting the right data structures for a task at hand, taking the functional perspective into account, and takes a thorough look at Haskell’s “lazy evaluation” and how to reason about time and space performance of Haskell programs. There’s also some focus on how to use Haskell’s powerful abstraction mechanisms and concepts such as Applicative Functors, Monads and Monad Transformers to help you organize larger code bases. Finally, depending on time and demand, there’s the opportunity to look at Parallelism and Concurrency, or at type-level programming. Once again, the course comes with several carefully designed hands-on exercises and provides room for specific questions that the participants might have.

Both courses will be taught by Duncan Coutts, co-founder and partner at Well-Typed. He's an experienced teacher and is involved in many commercial Haskell development projects at Well-Typed. He's seen a lot of Haskell code, and knows from experience which techniques and approaches work and which do not.

Well-Typed training courses

In general, our courses are very practical, but don’t shy away from theory where necessary. Our teachers are all active Haskell developers with not just training experience, but active development experience as well. In addition to these two courses in New York City, we regularly offer courses in London, and plan to offer courses in other European locations, too.

We also provide on-site training on request nearly anywhere in the world. If you want to know more about our training or have any feedback or questions, have a look at our dedicated training page or just drop us a mail.

Categories: Offsite Blogs

Autocompletion in Emacs

Haskell on Reddit - Fri, 12/05/2014 - 5:19am

Hello Haskellers,

I have finally got both haskell-mode and ghc-mod to work correctly in Emacs. This has been a big boon to my productivity and has generally made Emacs much more awesome.

The one thing that is missing is good auto completion. What I would like to have is reliable auto completion of modules that I import. (This is generally much more useful to me than autocompletion of my own modules b/c I generally remember pretty well what I have just written but not all the functions that are exported by module ABC).

Example: I have

import qualified Data.ByteString as B

now when I type

B.

and run the auto-completion command I would like to see all functions exported from that module (bonus points for type signatures and docstrings).

Is something like this currently possible?

submitted by paulkoer
[link] [13 comments]
Categories: Incoming News

LHC Team: Compiling to JavaScript.

Planet Haskell - Thu, 12/04/2014 - 8:21pm
Lots of very interesting things are possible when everything (including the runtime system) is translated to LLVM IR. For example, compiling to JavaScript becomes trivial. Consider this ugly version of Hello World:

{-# LANGUAGE MagicHash #-}
module Main (main) where

import LHC.Prim

putStrLn :: List Char -> IO Unit
putStrLn msg = putStr msg `thenIO` putStr (unpackString# "\n"#)

main :: IO Unit
main = putStrLn (unpackString# "Hello World!"#)

entrypoint :: Unit
entrypoint = unsafePerformIO main

Notice the 'List' and 'Unit' types, and the 'thenIO' and  'unpackString#' functions. There's no syntactic sugar in LHC yet. You can get everything sugar-free these days, even Haskell compilers.
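To give a feel for what the sugar-free style stands in for, here is a rough analogue of those types in ordinary Haskell — my own approximation, not LHC.Prim's actual definitions:

```haskell
-- Approximate stand-ins for LHC's List and Unit ([a] and () without sugar).
data List a = Nil | Cons a (List a)
data Unit = Unit

-- A plain-Haskell analogue of the role unpackString# plays above.
unpackString :: String -> List Char
unpackString = foldr Cons Nil

toString :: List Char -> String
toString Nil         = ""
toString (Cons c cs) = c : toString cs

main :: IO ()
main = putStrLn (toString (unpackString "Hello World!"))  -- prints Hello World!
```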

Running the code through the LLVM dynamic compiler gives us the expected output:

# lli Hello.ll
Hello World!

Neato, we have a complete Haskell application as a single LLVM file. Now we can compile it to JavaScript without having to worry about the garbage collector or the RTS; everything has been packed away in this self-contained file.

$ emcc -O2 Hello.ll -o Hello.js   # Compile to JavaScript using emscripten.
$ node Hello.js                   # Run our code with NodeJS.
Hello World!

$ ls -lh Hello.js                 # JavaScript isn't known to be terse, but
                                  # we're still smaller than HelloWorld
                                  # compiled with GHC.
-rw-r--r--  1 lemmih  staff  177K Dec  4 23:33 Hello.js
Categories: Offsite Blogs

Oliver Charles: 24 Days of GHC Extensions: Bang Patterns

Planet Haskell - Thu, 12/04/2014 - 6:00pm

Over the last few days, we’ve been looking at various GHC extensions that centre around forming bindings. Today I’d like to look at one more extension in this area - bang patterns. Much like with record wildcards yesterday, the extension is small, yet extremely useful.

> {-# LANGUAGE BangPatterns #-}
> import Data.Function (fix)
> import Data.List (foldl')

Generally speaking, bang patterns allow us to annotate pattern matches to indicate that they should be strict. To understand this, we should start by understanding the interaction between pattern matching and Haskell’s evaluation strategy. When we are writing functions, any inputs to the function will not be evaluated until we pattern match on them. For example, the following contrived function doesn’t pattern match on its argument, so it doesn’t force any evaluation on it:

> hello :: Bool -> String
> hello loud = "Hello."

If we apply hello to various arguments, the behaviour is the same - even for undefined values:

-> hello True
"Hello."
-> hello False
"Hello."
-> hello undefined
"Hello."
-> hello (fix id)
"Hello."

However, by pattern matching on the Bool, we force evaluation of loud:

> hello2 :: Bool -> String
> hello2 True = "Hello!"
> hello2 False = "hello"

-> hello2 True
"Hello!"
-> hello2 False
"hello"
-> hello2 undefined
*** Exception: Prelude.undefined
-> hello2 (fix id)
*** Exception: <<loop>>

Specifically, the pattern match will evaluate the input argument enough to perform the pattern match - to determine which pattern is appropriate. Usually this would be evaluation to weak head normal form, but that’s not strictly true with nested pattern matches. For more of a discussion on this, interested readers are pointed to Simon Marlow’s book Parallel and Concurrent Programming in Haskell, which has a fantastic discussion on this.
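To make the caveat concrete, here is a small example of my own (not from the post): an outer wildcard pattern forces only the outermost constructor, while a nested pattern forces the field as well.

```haskell
-- Just _ forces only the Maybe constructor (WHNF); the field stays
-- unevaluated, so passing Just undefined is harmless here.
outer :: Maybe Int -> String
outer (Just _) = "got one"
outer Nothing  = "none"

-- Just True / Just False additionally force the Bool inside, so
-- nested (Just undefined) would throw where outer (Just undefined) succeeds.
nested :: Maybe Bool -> String
nested (Just True)  = "yes"
nested (Just False) = "no"
nested Nothing      = "none"

main :: IO ()
main = putStrLn (outer (Just undefined))  -- prints "got one"
```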

But what does this all have to do with bang patterns? Bang patterns is an extension that will evaluate specific arguments to weak head normal form regardless of the pattern match performed. If we revisit our example hello function, rewriting it with bang patterns, we have

> hello3 :: Bool -> String
> hello3 !loud = "Hello."

This function now produces a value only if loud can be evaluated to True or False:

-> hello3 True
"Hello."
-> hello3 False
"Hello."
-> hello3 undefined
*** Exception: Prelude.undefined
-> hello3 (fix id)
*** Exception: <<loop>>

So much for theory, but why would you want to do such a thing? Bang patterns are a fantastic extension when you don't need Haskell's implicit laziness. A common case is when performing computations over large lists of data. If we're just summarising a list or collection, forcing the value at every step leads to considerably better memory usage, and that in turn leads to better performance. Johan Tibell - an expert in the realm of high-performance Haskell - has a lovely example of where bang patterns are useful, in this snippet for calculating the mean of a list of Doubles:

> mean :: [Double] -> Double
> mean xs = s / fromIntegral l
>   where
>     (s, l) = foldl' step (0, 0) xs
>     step (!s, !l) a = (s + a, l + 1)

Here we’re finding the mean of a list of numbers. If we kept this entirely lazy, we’ll build up a huge computation - a + b + c + d + e + ... and 0 + 1 + 1 + 1 + 1 + ..., for the entire length of the list! This is a horrible usage of memory, and we don’t need this laziness. It looks like using foldl' should be sufficient, but note that foldl' only evaluates to weak head normal form. In this case, that’s the pair of Doubles but not the Doubles themselves! Therefore we use bang patterns on s and l, forcing every step of the computation to evaluate the underlying Double.

It may be illuminating to consider the desugared version of the program:

mean :: [Double] -> Double
mean xs = s / fromIntegral l
  where
    (s, l) = foldl' step (0, 0) xs
    step (s, l) a =
      let s' = s + a
          l' = l + 1
      in s' `seq` l' `seq` (s', l')

This program is equivalent in strictness, but as you can see - syntactically we had to do a lot more work to get there.

In conclusion, bang patterns are a lovely extension for working with high performance code. I particularly like that we can indicate strictness syntactically, which I find makes scanning through code to understand its evaluation strategy clearer than looking for seqs. Also, bang patterns are so lightweight that when we are trying to optimise our program - often an inherently experimental process - it's easy to swap out different variations on strictness.

This post is part of 24 Days of GHC Extensions - for more posts like this, check out the calendar.

Categories: Offsite Blogs

Edward Z. Yang: Ubuntu Utopic upgrade (Xmonad)

Planet Haskell - Thu, 12/04/2014 - 5:48pm

I finally got around to upgrading to Utopic. A year ago I reported that gnome-settings-daemon no longer provided keygrabbing support. This was eventually reverted for Trusty, which kept everyone's media keys.

I'm sorry to report that in Ubuntu Utopic, the legacy keygrabber is no more:

------------------------------------------------------------
revno: 4015 [merge]
author: William Hua <>
committer: Tarmac
branch nick: trunk
timestamp: Tue 2014-02-18 18:22:53 +0000
message:
  Revert the legacy key grabber. Fixes:

It appears that the Unity team has forked gnome-settings-daemon into unity-settings-daemon (actually this fork happened in Trusty), and as of Utopic gnome-settings-daemon and gnome-control-center have been gutted in favor of unity-settings-daemon and unity-control-center. Which puts us back in the same situation as a year ago.

I don't currently have a solution for this (pretty big) problem. However, I have solutions for some minor issues which did pop up on the upgrade:

  • If your mouse cursor is invisible, try running gsettings set org.gnome.settings-daemon.plugins.cursor active false
  • If you don't like that the GTK file dialog doesn't sort folders first anymore, try running gsettings set org.gtk.Settings.FileChooser sort-directories-first true. (Hat tip)
  • And to reiterate, replace calls to gnome-settings-daemon with unity-settings-daemon, and use unity-control-panel to do general configuration.
Categories: Offsite Blogs

Robin KAY: HsQML released: London Edition

Planet Haskell - Thu, 12/04/2014 - 5:18pm
Last week I gave a talk to the London Haskell User Group on my GUI library HsQML. The slides are available now and a video of the talk will be posted on the group's YouTube channel in due course (I'll post again when that happens).

The most distinctive, some might say contentious, thing about HsQML compared to other Haskell GUI libraries is the split between implementing the back-end logic of an application in Haskell and describing its user interface using the QML domain specific language. This tends to frighten people off and I was at pains to stress that while QML does have inbuilt scripting capabilities, we can build real applications with just a thin layer of QML over our Haskell code.

The talk walked through the implementation of a new sample "sticky notes" application. Here, the Haskell back-end takes care of persisting the user's data in an SQLite database and exposes a data model to QML. Several alternate QML front-ends then show how the same data model can be skinned with different user interfaces.

One of the sticky notes application's front-ends uses Qt Quick Controls for a native look and feel, shown here on three platforms.
Belatedly also, I'm announcing a new version of HsQML, which debuted on Hackage at the week-end. This minor release fixes a couple of bugs. Most notably, the fact that HsQML was leaking the QApplication object, which caused programs to occasionally crash on exit under Linux. HsQML now ships an OnExitHook() which should shutdown the Qt framework when the GHC RTS does, or alternatively you can call the new shutdownQt function to do it manually.

release- - 2014.11.29

* Added function to shutdown the Qt framework.
* Fixed intermittent crash on exit under Linux.
* Fixed reanimated objects being passed to QML as undefined.
* Fixed typo in the names of implicit property signals.
Categories: Offsite Blogs

Alessandro Vermeulen: Orchestration support announced on DockerCon

Planet Haskell - Thu, 12/04/2014 - 3:46pm

The philosophy behind Docker is that, in order to be solved, a large problem has to be divided into its root problems; one can then solve each of these problems step by step. Additionally, all elements of the solution need to communicate through a common interface.

Docker has always been a tool with a single purpose: the creation, transport, and running of images. Until today there were several issues with Docker that made using it somewhat trying at times. It lacked capabilities for orchestration, which breaks down into:

  1. Installation of a docker host from scratch;
  2. Clustering of multiple docker hosts to spread resource utilization over the cluster;
  3. Managing inter-container dependencies at runtime.

Today this changed as Docker Inc. announced a new set of tools.

Provisioning: Machine

Machine provides a one step installer for creating a new docker host on your local machine, a publicly hosted cloud, or a private cloud. It will automatically provision a new machine and set the environment variables such that any following docker command runs on the newly created host. This is very similar to what boot2docker provides.

There are several engines for provisioning on different platforms, such as:

  • VirtualBox
  • VMWare
  • AWS
  • Microsoft Hypervisor
  • etc. (todo link)

More information can be found at github.

Clustering: Swarm

Ideally you want to control a cluster of docker hosts with the same interface as you control a single host. In other words the interface needs to be transparent or standardized. With swarm you can.

All existing docker commands work with the swarm as well. Just point your docker binary at the swarm proxy and you are controlling the cluster instead of a single machine. Swarm is location/data-center aware and also incorporates resource management. The default strategy is to use as few hosts as possible: it places several lighter containers on the same node in order to reserve other nodes for heavier containers.

The main features are:

  1. Resource management
  2. Scheduling honoring constraints
  3. Health checks on the cluster and nodes
  4. Supporting the entire docker interface

Additionally, Mesos can be used to provide the scheduling. Docker Inc. also announced that Mesos will be a first-class citizen in Docker. The goal is to be able to run docker containers alongside other Mesos jobs in the Mesos cluster.

More information also on github.

It appears that swarm is not supported yet by machine, sadly.

Managing inter-container dependencies: Composer

Setting up applications that require multiple containers to function correctly is difficult. Keeping them running is even harder. Docker proposes the Docker Composer.

Traditionally, Docker ran on a single machine and, until today, orchestration needed to be done manually or through external tools.

Docker Hub

Docker Inc. also announced an enterprise version of the Docker Hub. It is able to run wherever the enterprise needs it to run and comes with safe 1-click upgrades. Enterprises are adopting containers as development is up to 30 times faster with half the error rate.

Some fun facts:

  • 100,000 contributors to Docker Hub
  • 157 TB of data transmitted each month
  • 50 TB of data stored

The timeline for 2015:

  1. Increase performance of pulls
  2. Increase transparency by adding and improving on status pages
  3. Engage in a partnership with Microsoft. Most notably this will result in being able to run Linux on Microsoft Azure.
Categories: Offsite Blogs

ANN: Groundhog-inspector, a tool for generating datatypes from database

Haskell on Reddit - Thu, 12/04/2014 - 11:05am

Hello everyone,

I am pleased to announce groundhog-inspector. It analyzes a database schema and creates corresponding datatypes and mapping configuration for Groundhog. It works with PostgreSQL, MySQL, and Sqlite. Composite keys, constraints, references across schemas, and other schema details are reflected in the output. Groundhog-inspector can be used as a library and also provides a standalone tool for simple scenarios.

Here is an example of the standalone tool usage. JSON mapping in the output was manually converted to YAML for brevity.

$ sqlite3 dbfile "CREATE TABLE mytable (id INTEGER PRIMARY KEY NOT NULL, str VARCHAR NOT NULL, ref INTEGER references mytable)"

$ groundhog_inspector sqlite dbfile

data Mytable = Mytable
  { mytableStr :: String
  , mytableRef :: Maybe (AutoKey Mytable)
  }

- entity: "Mytable"
  dbName: "mytable"
  constructors:
    - name: "Mytable"
      fields:
        - name: "mytableStr"
          dbName: "str"
        - name: "mytableRef"
          dbName: "ref"

I would be happy to answer any questions. Enjoy!

Regards, Boris Lykah

submitted by lykahb
[link] [2 comments]
Categories: Incoming News

Haskell course offerings

Haskell on Reddit - Thu, 12/04/2014 - 8:10am

Recently, we at Nilcons have been helping NobleProg to extend their offerings with a set of Haskell courses suitable for corporate clients.

These courses can be booked at various locations around the world or to be delivered on the client's premises if that is preferred. A third option is remote delivery via video conference. If you want to have it in a city not currently offered, please contact me at

Currently we offer two courses, one for beginners, who would like to have an introduction to functional ideas and idioms. The goal here is to give participants new tools which make them more efficient while using other programming languages too, and also to serve as a stepping stone if they want to seriously consider Haskell later.

The second course is for people who already have some knowledge of Haskell, but never had the time to get a deeper understanding of the language and learn techniques that make it possible to tackle bigger projects in Haskell.

Still in the works is a third course that we plan to call "State of the art Haskell" (or "Haskell for the Industry", or some other cool name like that), which would discuss important libraries and frameworks from the community that are rather new or require a deeper understanding (or both). Lens, stream processing (Pipes and Conduit), and web frameworks (Yesod and Snap) come to mind. But we have not fully decided on the content for this third, most advanced course yet. So, if you have any comments on what you would like to see in it, that is most welcome! Of course, criticism of the current outlines is welcome too!

If you have questions regarding the commercial offering itself or have any other requests about the courses, feel free to ask here or email me at


submitted by errge
[link] [11 comments]
Categories: Incoming News

Upcoming GHC optimizations?

Haskell on Reddit - Wed, 12/03/2014 - 8:49pm

We know that GHC performs many interesting optimizations like strictness analysis, deforestation, and list fusion. But what new optimizations are on the horizon over the next five years? I'm interested mainly in optimizations that the GHC team is actually planning on implementing, but also optimizations that could potentially have a major impact, but haven't actually been decided on.

submitted by bitmadness
[link] [25 comments]
Categories: Incoming News

Oliver Charles: 24 Days of GHC Extensions: Record Wildcards

Planet Haskell - Wed, 12/03/2014 - 6:00pm

Occasionally, you come across a little trick or method for doing something that seems somewhat inconsequential - but rapidly becomes an indispensable item in your programming toolbox. For me, the RecordWildcards extension is a prime example of this scenario.

> {-# LANGUAGE OverloadedStrings #-}
> {-# LANGUAGE RecordWildCards #-}
> import Data.Aeson

To start with, let’s recap records in Haskell. A record is usually known to be a data type with a single constructor, and the data type is populated with a collection of fields. Records crop up all the time in programming, often when we try to model the real world:

> data Worker = Worker
>   { workerName :: String
>   , workerPosition :: String
>   , workerFirstYear :: Int
>   }

Of course, data alone isn’t much fun - we probably want to operate on this data too. In this case we’d like to interact with other web services, and we’ll use the common JSON format for communication. If we have a specific schema that we need to conform to, it may be easier to write this by hand:

instance ToJSON Worker where
  toJSON w = object [ "name" .= workerName w
                    , "position" .= workerPosition w
                    , "first-year" .= workerFirstYear w
                    ]

Having to apply each record field getter to the w variable is a little tedious, and RecordWildCards can allow us to eliminate that bit of boilerplate:

> instance ToJSON Worker where
>   toJSON Worker{..} = object [ "name" .= workerName
>                              , "position" .= workerPosition
>                              , "first-year" .= workerFirstYear
>                              ]

Here we see the Worker{..} pattern match - this pattern matches on the Worker constructor, and introduces bindings for all of the fields in Worker. Each of these bindings will be named after the respective field in the record. We can see on the RHS that we are now constructing our JSON object just out of variables, rather than function applications.

If you were expecting a lot of ground breaking new features from RecordWildCards you might be disappointed - that’s about all it does! However, did you know that you can also use RecordWildCards when creating data? For example, we could also write a JSON deserialiser as:

> instance FromJSON Worker where
>   parseJSON = withObject "Worker" $ \o -> do
>     workerName <- o .: "name"
>     workerPosition <- o .: "position"
>     workerFirstYear <- o .: "first-year"
>     return Worker{..}

Personally, I don’t use this feature as much as creating bindings - in this case I’d just use applicative syntax - but it can occasionally be handy.

RecordWildCards For Modules

I’ve presented a fairly “vanilla” overview of RecordWildCards - and I imagine this is probably how most people use them. However, when used with a record of functions, you can do some interesting tricks to emulate localised imports.

In my engine-io project, I have a data type called ServerAPI - here’s a snippet:

data ServerAPI m = ServerAPI
  { srvGetQueryParams :: m (HashMap.HashMap BS.ByteString [BS.ByteString])
  , srvGetRequestMethod :: m BS.ByteString
  }

The intention here is that users provide a ServerAPI value when they initialise engine-io, and I then have an abstraction of a web framework to play with. People can instantiate ServerAPI for Snap or Yesod, and engine-io (should!) just work. In engine-io, by using RecordWildCards, the programming experience is natural, as the abstraction created by ServerAPI stays behind the scenes. For example:

handlePoll :: MonadIO m => ServerAPI m -> Transport -> Bool -> m ()
handlePoll api@ServerAPI{..} transport supportsBinary = do
  requestMethod <- srvGetRequestMethod
  ...

handler :: MonadIO m => EngineIO -> (Socket -> m SocketApp) -> ServerAPI m -> m ()
handler eio socketHandler api@ServerAPI{..} = do
  queryParams <- srvGetQueryParams
  ...

This is very similar to using a type class - however, using type classes would be very tricky in this situation. Either engine-io would have to depend on both Snap and Yesod (though it needs neither), or I would have to use orphan instances. Neither are particularly desirable. Furthermore, who’s to say there is only one choice of ServerAPI for Snap? It’s entirely possible to provide a debugging version that logs what’s happening, or for people to switch out calls however they see fit. This is possible with newtypes in type classes, but pushes a lot of this work onto users.
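As a sketch of that last point - using a cut-down ServerAPI of my own, not engine-io's actual type - a "debugging version" is just another record value wrapping the first:

```haskell
-- A cut-down record of functions, specialised to one method for brevity.
data ServerAPI m = ServerAPI
  { srvGetRequestMethod :: m String
  }

-- Wrap an existing ServerAPI, logging each call before delegating.
-- With a type class there could only be one instance per monad; with a
-- record, swapping implementations is just passing a different value.
debugAPI :: ServerAPI IO -> ServerAPI IO
debugAPI api = ServerAPI
  { srvGetRequestMethod = do
      putStrLn "srvGetRequestMethod called"
      srvGetRequestMethod api
  }

main :: IO ()
main = do
  let base = ServerAPI { srvGetRequestMethod = return "GET" }
  method <- srvGetRequestMethod (debugAPI base)
  putStrLn method
```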

Gabriel Gonzalez has a blog post on this very technique that goes into more details, which is well worth a read.

This post is part of 24 Days of GHC Extensions - for more posts like this, check out the calendar.

Categories: Offsite Blogs

Compose and Well-Typed Tutorials in NY, coming end of Jan, 2015!

Haskell on Reddit - Wed, 12/03/2014 - 12:53pm

I'm excited to announce that the Compose Conference in NY (, will be bookended by paid commercial Haskell tutorials to be given by the one and only Duncan Coutts (/u/dcoutts) of Well-Typed ( , in partnership with Skills Matter.

The tutorials are listed here:

So, should you desire, you can register for all three. First, the Fast-Track course from Duncan on Thursday and Friday, Jan 29-30. Then, on Friday evening through Sunday (Jan 30 - Feb 1), there is Compose (talks to be announced shortly. I promise the submissions have been fantastic), and finally on Monday and Tuesday there will be the Advanced Haskell course from Well-Typed.

submitted by gbaz1
[link] [7 comments]
Categories: Incoming News