News aggregator

Haskell — is it growing?

Haskell on Reddit - Tue, 12/02/2014 - 4:57pm

Just a very simple question. Is Haskell a dying language? I note some events in my area (Australia), such as AusHac, where the last one was in 2011.

submitted by princearthur
[link] [26 comments]
Categories: Incoming News

GHC Weekly News - 2014/12/01

Haskell on Reddit - Tue, 12/02/2014 - 2:43am
Categories: Incoming News

Oliver Charles: 24 Days of GHC Extensions: View Patterns

Planet Haskell - Mon, 12/01/2014 - 6:00pm

I’d like to start this series by focussing on what I call binding extensions. These are extensions that are used in conjunction with forming bindings - such as top-level function definitions, where clauses, and let bindings. Today, we’ll begin by looking at a simple yet powerful extension - view patterns.

View patterns extend our ability to pattern match on variables by also allowing us to pattern match on the result of function application. To take a simple example, let’s work with a Map from Haskell packages on Hackage to the number of downloads. To start with, we’ll look at extracting the number of downloads for the lens library. Ordinarily, we might write something like:

lensDownloadsOld :: Map HaskellPackage Int -> Int
lensDownloadsOld packages =
  case M.lookup "lens" packages of
    Just n  -> n
    Nothing -> 0

Notice that the first thing this function does is to immediately pattern match on a function call. Arguably, this obscures the definition of the lensDownloads function, which we expect to have two equations defining it - one for when the package has a download count, and another for when the package hasn’t been downloaded (for example, when collecting a new batch of statistics). Using view patterns, we can move this lookup from the right-hand side to the left-hand side:

lensDownloads :: Map HaskellPackage Int -> Int
lensDownloads (M.lookup "lens" -> Just n) = n
lensDownloads _ = 0

Now our lookup function is defined by the two equations we would expect. View patterns allow us to “view” the download statistics as a different data type - in this case we view the map as the sum type Maybe Int, by focussing on the value for the key lens.

As we can see, a view pattern is defined by two parts - the view itself, which is a partially applied function; and the pattern match to perform on the result of that function application. In this case, we are given a Map HaskellPackage Int, and our view is M.lookup "lens" :: Map HaskellPackage Int -> Maybe Int. We pattern match on this Maybe Int for the Just case, and this allows us to bind the download count to the variable n. Notice also that if the pattern match against Just fails, we fall through to the next pattern of lensDownloads. GHC will carefully check patterns for exhaustiveness, so we’re still forced to consider all possibilities.
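As a further illustration, the view function needn’t return a Maybe at all - any function can appear to the left of the arrow. Here is a small sketch (an extra example, not tied to the Map above) viewing a String through the standard words function; if the string contains no words, the first equation fails to match and we fall through to the catch-all:

{-# LANGUAGE ViewPatterns #-}

-- View a String through words. If there are no words, the first
-- pattern fails and we fall through to the second equation.
firstWord :: String -> Maybe String
firstWord (words -> (w : _)) = Just w
firstWord _                  = Nothing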

Finally, it would be tedious to have to write a function like this for every package - so we would like to abstract the package name out. With view patterns, our view function is able to depend on variables to the left of the view pattern. Thus we are able to write a general download-lookup function as

downloadsFor :: HaskellPackage -> Map HaskellPackage Int -> Int
downloadsFor pkg (M.lookup pkg -> Just downloads) = downloads
downloadsFor _ _ = 0

View Patterns as a Tool for Abstraction

The functions we’ve seen so far haven’t really benefited from view patterns. The case analysis in the original example isn’t particularly cumbersome, and downloadsFor doesn’t necessarily benefit from the use of view patterns. However, a key benefit to view patterns is that they allow us to view a data type as a definition that is easy to pattern match on, while using a very different data type for the underlying representation.

Take, for example, the finger tree - a general-purpose data structure suitable for a wide variety of applications, one of which is as a sequence. In Haskell, the Prelude gives us a basic list data type, defined essentially as:

data List a = Nil | Cons a (List a)

However, this data structure has poor performance for almost anything beyond operations near the head - it’s just a linked list. Viewing the last element of the list here is O(n) - quite a cost! Seq can be used as a drop-in replacement for lists here, and looking up the last element is O(1) - much better! To achieve such high performance, Seq uses a finger tree, a data type with much better performance characteristics than linked lists. To do so, Seq uses a more complex data definition - a definition that is completely abstract to us, forcing us to use functions to inspect it.

The use of functions moves us away from perhaps more idiomatic Haskell programming, where we would like to define our functions in terms of various equations. By using view patterns, we regain much of this style of programming.

As an example, let’s consider analysing a time series. Our time series is simple: we’ll store the data points in a Seq. To operate on this time series, we’d like to be able to view the last data point in the series - if such a value exists. Intuitively, we know there are two possibilities: the time series is empty, in which case we return Nothing; or the time series is non-empty, in which case we return Just the last value:

last :: Seq a -> Maybe a
last ?? = Nothing
last ?? = Just _

While we can’t pattern match directly on a Seq, we can view it as a list from the right by using viewr:

data ViewR a = EmptyR | (Seq a) :> a

viewr :: Seq a -> ViewR a

Notice that ViewR is similar to a linked list as before, but we have the ability to look at any Seq as a list from the right. Either the sequence is empty, or it’s a smaller sequence with a single element appended. This inductive structure fits perfectly for our purposes:

last :: Seq a -> Maybe a
last (viewr -> xs :> x) = Just x
last (viewr -> EmptyR) = Nothing
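Here is a small, self-contained sketch exercising both equations; the sequences below are just illustrative values:

{-# LANGUAGE ViewPatterns #-}

import Prelude hiding (last)
import Data.Sequence (Seq, ViewR (..), viewr)
import qualified Data.Sequence as Seq

-- The definition from above, repeated so this sketch stands alone.
last :: Seq a -> Maybe a
last (viewr -> _ :> x) = Just x
last (viewr -> EmptyR) = Nothing

main :: IO ()
main = do
  print (last (Seq.fromList [1, 2, 3 :: Int]))  -- Just 3
  print (last (Seq.empty :: Seq Int))           -- Nothing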

This type of separation is very powerful, and you’ll find it used in many of the high-performance data structures on Hackage.

However, one qualm with this approach is that it brings new syntax - a syntax that it took the author a while to get comfortable with. With new syntax there is always a balance between the overhead of the syntax (which adds something of a context switch) and the productivity gains the extension begets. What would be really nice would be to get similar functionality to this extension without the need for new syntax. Thankfully, GHC can do just that. How, you ask? Well, you’ll just have to wait and see…

This post is part of 24 Days of GHC Extensions - for more posts like this, check out the calendar.

Categories: Offsite Blogs

Made a Haskell mindmap

Haskell on Reddit - Mon, 12/01/2014 - 10:04am
Categories: Incoming News

The GHC Team: GHC Weekly News - 2014/12/01

Planet Haskell - Mon, 12/01/2014 - 9:24am

Hi *,

It's that time again for some good ol' fashioned GHC news, this time just after the holidays. Some of the things happening in the past week include:

  • Partial Type Signatures has been merged into HEAD. Many thanks to Thomas Winant, who worked on this feature for several months!
  • As mentioned last week, GHC 7.10 will no longer ship haskell98 and haskell2010, nor old-time or old-locale.

Closed tickets this week include: #9827, #7475, #9826, #7460, #7643, #8044, #8031, #7072, #3654, #7033, #9834, #6098, #6022, #5859, #5763, #9838, #9830, #7243, #9736, #9574, #5158, #9844, #9281, #9818, #4429, #8815, #2182, #4290, #9005, #9828, #9833, #9582, and #9850.

Another huge thanks to Thomas Miedema, who closed an extraordinary number of tickets for us - the above list is still not even complete, and he's made a huge impact on the number of open tickets in the past month or so.

Categories: Offsite Blogs

Announcing Opaleye: SQL-generating embedded domain specific language for Postgres

Haskell on Reddit - Mon, 12/01/2014 - 8:33am

I am pleased to announce the public release of Opaleye, now available on Hackage and at GitHub:

Opaleye is a Haskell library which provides an SQL-generating embedded domain specific language for targeting Postgres. You need Opaleye if you want to use Haskell to write typesafe and composable code to query a Postgres database.

Opaleye allows you to define your database tables and write queries against them in Haskell code, and aims to be typesafe in the sense that if your code compiles then the generated SQL query will not fail at runtime. A wide range of SQL functionality is supported, including inner and outer joins, restriction, aggregation, distinct, sorting and limiting, unions and differences. Facilities to insert into, update and delete from tables are also provided. Code written using Opaleye is composable at a very fine level of granularity, promoting code reuse and high levels of abstraction.
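To give a flavour of what such code looks like, here is a minimal sketch in the style of the Opaleye tutorials; the table, column names and literal value are assumptions for illustration, and the exact API may differ between Opaleye versions:

{-# LANGUAGE Arrows #-}

import Control.Arrow (returnA)
import Data.Profunctor.Product (p2)
import Opaleye

-- A hypothetical two-column table (names assumed for illustration).
personTable :: Table (Column PGText, Column PGInt4)
                     (Column PGText, Column PGInt4)
personTable = Table "person" (p2 (required "name", required "age"))

-- A composable query: the names and ages of everyone over 30.
adults :: Query (Column PGText, Column PGInt4)
adults = proc () -> do
  (name, age) <- queryTable personTable -< ()
  restrict -< age .> pgInt4 30
  returnA -< (name, age)

Queries like adults are ordinary Haskell values, so they can be reused and composed with further restrictions, joins and aggregations before being run against a database.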

For further information please refer to the README and tutorials:

I would be happy to answer any questions anyone might have about this library. Feel free to comment here, send a reddit PM or email me.

submitted by tomejaguar
[link] [80 comments]
Categories: Incoming News

Haskell blog

del.icio.us/haskell - Mon, 12/01/2014 - 4:27am
Categories: Offsite Blogs

FP Complete: Experimental package releases via Stackage Server

Planet Haskell - Mon, 12/01/2014 - 12:00am

Right now, Hackage has no concept of a stable and an unstable release of a package. As a result, authors are hesitant to release code to Hackage unless it's already stable. But it's difficult to get people to test new versions of packages if they're difficult to install. Installing a single new package from Github may not be difficult, but sometimes you want people to test out a new set of versions for multiple packages, which can be tedious. This blog post will demonstrate how you can use Stackage Server to make that easy.

While the primary purpose of Stackage Server is to host the official Stackage snapshots, it has been designed as a completely generic server for hosting any set of packages desired, including custom packages not yet released to Hackage. All you need to do is:

  1. Create an account on Stackage Server (by logging in with Google+ or Mozilla Persona)
  2. Create a tarball in the correct format (described below)
  3. Upload it from the snapshot upload page
Tarball format

You can download a sample bundle file by clicking on the "Bundle" link at the top of any snapshot page. It might be useful to open one up as you look through the rest of this section.

You can view the tarball parsing code in the Stackage Server codebase itself. The format is designed to be simple to replicate and extensible for future functionality. (In fact, the slug file feature I mention below was only recently added.)

The tarball must be tarred in a format that the tar package can read, and then gzipped. Each file in the tarball is treated independently. Directory structure inside the tarball is ignored. Using tar cfz mybundle.tar.gz somedirectory is usually sufficient to meet these criteria.
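As a concrete sketch, the same bundle could be produced programmatically with the tar and zlib packages; the file and directory names below are just placeholders, not requirements of the format:

import qualified Codec.Archive.Tar as Tar
import qualified Codec.Compression.GZip as GZip
import qualified Data.ByteString.Lazy as BL

-- Pack the desc, slug and hackage files from ./somedirectory into a
-- gzipped tarball. Directory structure inside the tarball is ignored
-- by the server, so a flat layout is fine.
makeBundle :: IO ()
makeBundle = do
  entries <- Tar.pack "somedirectory" ["desc", "slug", "hackage"]
  BL.writeFile "mybundle.tar.gz" (GZip.compress (Tar.write entries))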

There are four kinds of files recognized:

  • desc gives the human-readable title and description for the snapshot. Put the title on the first line, and the description on the following lines. (Note that, currently, we only display the title on the site, though we may add the description to the display in the future.)
  • slug is a recommendation for the short name of the snapshot. For example, the most recent GHC 7.8 snapshot as I write this is http://www.stackage.org/snapshot/2014-11-26-ghc78-exc, which has a slug of 2014-11-26-ghc78-exc. Slugs must be globally unique, so if someone else has already taken that slug, Stackage Server will append a randomized token to the end.
  • hackage is a list of all the package/version combos to be included in this snapshot from Hackage. For example, you might have:

    foo-1.0.0
    bar-1.0.1

    You're free to have multiple versions per package.

  • Any file ending in .tar.gz will be treated as a custom sdist tarball, and will be made available for download from stackage.org. This is how you can provide custom versions of a package not released on Hackage. As an example of this, here's a snapshot with two unreleased packages in it.
Custom snapshot

Another use case is customizing an official Stackage snapshot. For example, you may be using a certain snapshot, but want to get a newer version of one of the packages from Hackage, or write a custom patch for one of the package versions and use that. If so, all you need to do is:

  1. Download the bundle file
  2. Tweak its contents
  3. Upload it
  4. Use the new URL
Replace or augment Hackage?

The instructions for using a Stackage snapshot mention replacing the hackage.haskell.org remote-repo line in your cabal config file with the stackage.org URL. This makes sense if you're providing a snapshot that has all the packages from Hackage that you'll need. However, if you're testing out a few new packages, it's simpler to provide just those few extra packages and add an extra remote-repo line to your config file instead of replacing the primary entry. Note that this trick can be used both to add extra packages on top of Hackage and to augment an official Stackage snapshot.
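Concretely, your cabal config would then contain two remote-repo lines rather than one - something like the following, where the second line is a placeholder for whatever URL your snapshot's page tells you to use:

remote-repo: hackage.haskell.org:http://hackage.haskell.org/packages/archive
remote-repo: my-snapshot:<URL given on your snapshot page>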

Caveats

You should keep two things in mind when using Stackage Server in this manner:

  • Snapshots you create live forever. In cases of extreme issues (like accidentally uploading copyrighted data) we will of course assist in removing the snapshot. But generally speaking, a snapshot is forever, just like uploading a package to Hackage makes it available forever.
  • All snapshots are publicly listed, so you don't want to put any sensitive information in there. Of course, the Stackage Server codebase is open source, so you're free to run your own, private instance if you'd like. Alternatively, FP Complete provides private Stackage Server instances as a service; feel free to contact us for more information.
Other uses

Creating a generic tool like this has the advantage that it can be (ab)used for purposes other than the author's original intent. In this case, I've described some intended alternate use cases for this functionality. If people come up with other, unintended use cases, let me know!

Categories: Offsite Blogs

Haskell戦記

del.icio.us/haskell - Sun, 11/30/2014 - 11:59pm
Categories: Offsite Blogs
