News aggregator

Lennart Augustsson: Haskell error reporting with locations

Planet Haskell - Fri, 04/04/2014 - 1:28am
Error reporting in GHC is not always the nicest. For example, I often develop code by using undefined as a placeholder for code I've not written yet. Here's a simple example:
import System.Environment

main = do
    args <- getArgs
    if null args
        then undefined
        else undefined

Running this looks like this:

$ ./H
H: Prelude.undefined

Which undefined caused that? Looking at the error message we have no idea. Wouldn't it be nice to have some location information?

We can actually get location information by using Control.Exception.assert:

import Control.Exception(assert)
import System.Environment

main = do
    args <- getArgs
    if null args
        then assert False undefined
        else assert False undefined

Now running it is much more informative:

$ ./H
H: H.hs:7:9-14: Assertion failed

How is assert able to report the location? If we dig deep enough, we discover that it's because the GHC compiler contains a special hack to recognize this function and give it location information.

A generalized hack

In a Haskell compiler that I've implemented, I've taken this compiler hack and extended it so it can be used for any function. It comes in two parts: location information and location-transparent definitions.

__LOCATION__

The __LOCATION__ identifier is always defined and utterly magical. Its value is a string that describes the location of that very identifier. This is the very opposite of a referentially transparent name. In fact, its value varies with where it is placed in the code! So it's definitely not for purists. But I'm a practical man, so I sometimes have to resort to the ugliness of reality. And in reality we want to report locations in errors.

Enough philosophy, here's an example:

main = do
    print __LOCATION__
    print __LOCATION__

And running it prints:

"test/Test.hs:2:11"
"test/Test.hs:3:13"

And to illustrate the impurity:

main = do
    let loc = __LOCATION__
    print loc
    print loc

And running this:

"test/Test.mu:2:15"
"test/Test.mu:2:15"

Location transparency

The __LOCATION__ identifier gives the location of itself. This is of little use on its own. Imagine the definition we could give for undefined. Somewhere in the Prelude module it could say something like

undefined = error ("undefined: " ++ __LOCATION__)

But if we use this, all it will tell us is where the definition of undefined is, not where it was used.

To get the point of use instead of the definition I've introduced location transparent definitions. In a location transparent definition the __LOCATION__ identifier will not refer to its own position, but to the position of the reference to the definition. Location transparency is introduced with a pragma.

{-# LOCATIONTRANSPARENT undefined #-}
undefined = error ("undefined: " ++ __LOCATION__)

With this definition our initial example looks like this when we run it:

undefined: test/H.hs:6:9

In fact, the real definition of undefined doesn't look like that. The __LOCATION__ identifier is only used in the definition of error, so it looks something like this:

{-# LOCATIONTRANSPARENT error #-}
error :: String -> a
error s = throw (ErrorCall (__LOCATION__ ++ ": " ++ s))

{-# LOCATIONTRANSPARENT undefined #-}
undefined = error "undefined"

Since both error and undefined are location transparent, any use of undefined will be reported with the location of the use.

Furthermore, we can make a few more functions location transparent, e.g.,

{-# LOCATIONTRANSPARENT head #-}
head :: [a] -> a
head [] = error "Empty list"
head (x:xs) = x

A simple example:

main = putStr (head [])

Which will print:

test/Head.hs:1:16: Empty list

which is the location where head was called.

Implementation

There are different ways to implement this feature, and I'm going to sketch two of them.

First: Every function that has the LOCATIONTRANSPARENT pragma will be inlined at the point of use, and the __LOCATION__ identifier in the inlined code will be updated to reflect the call site. The definitions must be processed in a bottom-up fashion for this to work. It's fairly simple to implement, but will cause some code bloat due to inlining.

Second: Every function that has LOCATIONTRANSPARENT pragma will be rewritten (by the compiler) to have an extra location argument, and each use of this function will be rewritten to pass in the current location. For example (using $$f for the location version of f):

main = putStr ($$head __LOCATION__ [])

$$head __LOCATION__ [] = $$error __LOCATION__ "Empty list"
$$head __LOCATION__ (x:xs) = x

$$error __LOCATION__ s = throw (ErrorCall (__LOCATION__ ++ ": " ++ s))

This should be fairly straightforward to implement, but I've not tried it. (It's somewhat like dynamic binding, so maybe GHC could reuse that mechanism for locations.)

And, of course, the global __LOCATION__ identifier has to be recognized by the compiler and replaced by a string that is its location.
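The second approach can also be simulated by hand in ordinary Haskell, threading the location through as a plain string argument. A minimal sketch, not the author's implementation; errorLoc and headLoc are hypothetical names standing in for the rewritten error and head:

```haskell
import Control.Exception (throw, ErrorCall (..))

type Location = String

-- The rewritten error: the caller's location arrives as an ordinary argument.
errorLoc :: Location -> String -> a
errorLoc loc s = throw (ErrorCall (loc ++ ": " ++ s))

-- The rewritten head: it forwards its own caller's location to errorLoc.
headLoc :: Location -> [a] -> a
headLoc loc []      = errorLoc loc "Empty list"
headLoc _   (x : _) = x
```

Evaluating headLoc "test/Head.hs:1:16" [] throws an ErrorCall whose message is prefixed with the supplied location, which is the same report the compiler-supported rewrite would produce automatically.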

Conclusion

I implemented the __LOCATION__ hack quite a while ago, and I like the much improved reporting of error locations. I hope someone will add it to GHC as well.
Categories: Offsite Blogs

.NET Compiler Platform ("Roslyn")

Lambda the Ultimate - Fri, 04/04/2014 - 12:21am

The .NET Compiler Platform (Roslyn) provides open-source C# and Visual Basic compilers with rich code analysis APIs. You can build code analysis tools with the same APIs that Microsoft is using to implement Visual Studio!

In a nutshell: OPEN SOURCE C# COMPILER. Putting aside possible practical implications of this for the .NET ecosystem, I think it is good for programming language geeks to be able to peruse the source code for compilers and language tools.

For the debate about MS being evil, you can head directly to HN where you'll also find an explanation of what bootstrapping a compiler means.

Categories: Offsite Discussion

Future of Programming workshop

Lambda the Ultimate - Thu, 04/03/2014 - 5:36pm

The call for submissions is out. There will be two opportunities this first year to participate: at Strangeloop in September and at SPLASH in October. The call:

We are holding two events. First, in partnership with the Emerging Languages Camp, FPW×ELC will be at Strange Loop on September 17 in St. Louis MO. Second, at SPLASH in Portland OR around Oct. 19 (pending approval).

We want to build a community of researchers and practitioners exploring the frontiers of programming. We are looking for new ideas that could radically improve the practice of programming. Ideas too embryonic for an academic paper yet developed enough to demonstrate a prototype. Show us your stuff!

FPW×ELC will present live demonstrations before an audience. The SPLASH event will be an intense, private writer’s workshop [1, 2]. This process will be a chance to give and take both creative support and incisive criticism.

Submissions will be 15 minute demo screencasts. You can select either or both of the events in your submission. The submission deadline is June 8 and notifications will be sent June 27. After the events participants will have until December 1 to revise their screencasts for archival publication on our website. The submission site is now open. For questions please see the FAQ or ask info@future-programming.org.

Brought to you by Richard Gabriel, Alex Payne, and Jonathan Edwards.

This is a good idea for the more edgy language designers to exhibit their work and receive useful critique to improve presentation, which ultimately helps with our goal of world domination (or at least, pushing the community to take more risks).

Categories: Offsite Discussion

www.youtube.com

del.icio.us/haskell - Thu, 04/03/2014 - 5:25pm
Categories: Offsite Blogs

Restricted monads for free using Codensity + Constraint Kinds

Haskell on Reddit - Thu, 04/03/2014 - 1:47pm

Code here: http://lpaste.net/102201

I wanted to hack up a quick probability distribution monad, but using [(a,Rational)] bugged me because it didn't group identical events.

So here is the data I want to be a monad:

newtype ProbData a = ProbD { runProbD :: M.Map a Rational } deriving (Show,Eq)

It admits a form of return and bind, but it's not a monad:

retProb :: Ord a => a -> ProbData a
retProb a = ProbD $ M.singleton a 1

bindProb :: Ord b => ProbData a -> (a -> ProbData b) -> ProbData b
bindProb m k = ProbD $ M.unionsWith (+) $ map (\(a,p) -> M.map (*p) (runProbD $ k a)) $ M.toList $ runProbD m

However, we can get some magic: the Codensity monad transformer doesn't actually require a monad to transform!

newtype Codensity m a = Cod { runCod :: forall r. (a -> m r) -> m r }

-- same text as ContT's implementation
instance Monad (Codensity m) where
    return a = Cod (\k -> k a)
    m >>= f = Cod $ \k -> runCod m $ \a -> runCod (f a) k

However, this doesn't allow us to implement liftProb, so we have no way to get our distributions in:

-- liftProb :: ProbData a -> Codensity ProbData a
-- liftProb pd = Cod $ \k -> bindProb pd k
-- ERROR: No instance for Ord b ...

On the other hand, ConstraintKinds gives us a restricted version of the codensity monad that is still a monad regardless of what data we want to put into it:

-- a slight tweak from above:
newtype RCodensity c m a = Cod { runCod :: forall r. c r => (a -> m r) -> m r }

-- copy and paste the same monad instance and it works just fine!

liftProb :: ProbData a -> RCodensity Ord ProbData a
liftProb pd = Cod $ \k -> bindProb pd k -- compiles fine!

runProb :: Ord a => RCodensity Ord ProbData a -> ProbData a
runProb p = runCod p retProb

So now we can get yummy monadic syntax out of it:

type Prob a = RCodensity Ord ProbData a

uniform :: Ord a => [a] -> Prob a
uniform xs = liftProb $ ProbD $ M.fromList $ map (\a -> (a,p)) xs
  where p = 1 / fromIntegral (length xs)

dice n d = liftM sum $ replicateM n (uniform [1..d])

However, this is still not ideal; dice 2 6 >>= whatever will call whatever 36 times, not 11. We can fix that, though:

optimize :: Ord a => Prob a -> Prob a
optimize = liftProb . runProb

twod6 = optimize (dice 2 6)

In twod6 >>= whatever, whatever will only get called 11 times, once for each value between 2 and 12!
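For completeness, the snippets in this post assemble into a self-contained module along these lines. This is a sketch rather than the poster's exact code: the Functor and Applicative instances and most type signatures are additions (modern GHC requires Applicative for every Monad), and liftM is replaced by fmap:

```haskell
{-# LANGUAGE RankNTypes, ConstraintKinds #-}
import Control.Monad (ap, replicateM)
import qualified Data.Map as M

newtype ProbData a = ProbD { runProbD :: M.Map a Rational }
  deriving (Show, Eq)

retProb :: Ord a => a -> ProbData a
retProb a = ProbD (M.singleton a 1)

bindProb :: Ord b => ProbData a -> (a -> ProbData b) -> ProbData b
bindProb m k =
  ProbD . M.unionsWith (+) $
    map (\(a, p) -> M.map (* p) (runProbD (k a))) (M.toList (runProbD m))

-- Codensity restricted by a constraint on the answer type.
newtype RCodensity c m a =
  Cod { runCod :: forall r. c r => (a -> m r) -> m r }

instance Functor (RCodensity c m) where
  fmap f m = Cod (\k -> runCod m (k . f))

instance Applicative (RCodensity c m) where
  pure a = Cod (\k -> k a)
  (<*>)  = ap

instance Monad (RCodensity c m) where
  m >>= f = Cod (\k -> runCod m (\a -> runCod (f a) k))

liftProb :: ProbData a -> RCodensity Ord ProbData a
liftProb pd = Cod (bindProb pd)

runProb :: Ord a => RCodensity Ord ProbData a -> ProbData a
runProb p = runCod p retProb

type Prob a = RCodensity Ord ProbData a

uniform :: Ord a => [a] -> Prob a
uniform xs = liftProb . ProbD . M.fromList $ map (\a -> (a, p)) xs
  where p = 1 / fromIntegral (length xs)

dice :: Int -> Int -> Prob Int
dice n d = fmap sum (replicateM n (uniform [1 .. d]))

-- Collapse to the grouped map and lift back in, so later binds
-- see only the distinct outcomes.
optimize :: Ord a => Prob a -> Prob a
optimize = liftProb . runProb

twod6 :: Prob Int
twod6 = optimize (dice 2 6)
```

Loaded into GHCi, runProbD (runProb twod6) is a map with the 11 outcomes 2 through 12 and their probabilities.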

submitted by ryani
[link] [15 comments]
Categories: Incoming News

Thiago Negri: Premature optimization: it's not just about code

Planet Haskell - Thu, 04/03/2014 - 10:46am

When we talk about performance, we shouldn't look only at the performance of code execution. There are other kinds we should care about, like time to market, development time and processes. Actually, we shouldn't look at any of the performance measures from a single perspective. As I see it, all these measures sum up to constitute a single value: business performance.

There are lots of things that can slow down your business, and if you have a slow business, you'll have a hard time keeping pace with your rivals. As you discover that some areas of your company may be slowing you down, you may be tempted to add key performance indicators to each step, trying to squeeze all possible value out of each part of your processes. This can bring up a whole new set of problems: your business can be seen as a complex system, and it will adapt itself to render good results even in chaotic scenarios. [1]

So, to avoid going nuts with indicators all over the place, you decide to avoid bottlenecks from the start. For each and every new thing you need to deliver, you start a precise planning routine, choosing among a plethora of providers, technologies and unicorns to avoid, at all costs, having a new bottleneck in the future. This is what I like to call premature optimization of business performance.

Simply put: you can't have a business if you don't have a business. How can you possibly know all the events that will have an impact on the decision you take today? You can't. I'm not saying that planning is wrong or a Bad Thing™. What I really want to avoid is losing energy on stuff that won't make a difference. You may spend a whole year choosing between apples and oranges, and in the end be crushed by some weirdo playing around with a random technology. Why? Because he was not worried about future bottlenecks. He was really trying to do something cool, and he did it while you were arguing in endless and purposeless meetings.

You're always trading off performance measurements. For example, everyone is talking about service-oriented architecture (SOA), but what does it really mean in terms of business? It means a tradeoff between code execution performance and other benefits to the business as a whole, like decentralized growth and continuous delivery. Or, you may look at the difference between Apple's App Store and Google's Play Store models. There is a huge tradeoff of quality assurance and time to market between these two players: one offers its customers (smartphone owners) quality-assured apps; the other offers its customers (operating system users) fast time to market for their apps.

The big question really is: what are you trying to deliver to your customer? Are you trying to deliver software over time or value over time? I bet most of you are trying to deliver value over time, so why bother with premature optimization of technologies or processes? Let the bad stuff die on its own: failing fast is orders of magnitude better than not doing anything at all. And let the good stuff accompany you to where you want to go.

Don't bother guessing the future; embrace the uncertainty of life. You can't predict how a complex system will behave, and you can't know how your systems, services or whatever you are doing will be used in the real world. Put it into the wild, take the complaints, and tune it (or throw it away) afterwards; this is a constant process. [2]

Start measuring how much you are delivering over time and stop over-planning. Your end goal is to deliver stuff. The world couldn't care less about what you can do, it only cares about what you do.

References

  1. Complex Systems and Key Performance Indicators - Márcio Geovani Jasinski
  2. Complaint-Driven Development - Jeff Atwood

Categories: Offsite Blogs

Daniil Frumin: Heads up: scotty-hastache 0.2.1

Planet Haskell - Thu, 04/03/2014 - 7:59am

To accommodate the release of hastache-0.6, I’ve updated the scotty-hastache package to version 0.2.1.


Tagged: haskell, hastache, scotty, web
Categories: Offsite Blogs

Composing list functions with user-supplied functions

Haskell on Reddit - Thu, 04/03/2014 - 7:06am

First of all, I apologize for the confusing title as I don't know how to phrase this problem in a better way.

I've noticed that I often need to transform a list using functions like filter, maximumBy, sortBy, etc., where I can do arbitrary computation with the elements to decide how to transform the list. The problem is that I find it kind of ugly to "chain" those functions together if they need to share intermediate results.

Let me come up with a concrete example: Suppose I want to determine the closest object hit by a ray within a certain distance from a list of objects, and I have:

-- Returns `Just t` if the ray hits the object at distance t, or `Nothing` otherwise.
rayObject :: Ray -> Object -> Maybe Double

-- The safe version of `minimum`; returns `Nothing` if the list is empty.
minimumMay :: [a] -> Maybe a

ray :: Ray
objects :: [Object]
maxDistance :: Double

and I could write:

f :: Maybe Double
f = mfilter (< maxDistance) . minimumMay $ mapMaybe (rayObject ray) objects

which would return the distance to the closest object struck by the ray wrapped in a Maybe. However, sometimes this is not what I want. I would like to know which object gets struck by the ray, which I can't since I've lost the reference to the object right after I pass it to rayObject. My current solution is simple: Pair the object with the intermediate values in a tuple and extract it in the end. It would end up with something like:

f :: Maybe Object
f = fmap fst . mfilter ((< maxDistance) . snd) . minimumByMay (comparing snd) $
      mapMaybe (\o -> (o,) <$> rayObject ray o) objects

While it works, I feel this is kind of verbose and ugly, with all the fst and snd "noise" littering the line, and I feel it would easily get out of hand as the complexity increases. Therefore, I would like to know a more elegant way to solve this.
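One way to keep the pairing trick while hiding the fst/snd plumbing is to factor it into a small helper. A sketch with toy stand-ins for the poster's types; minimumOnMay, closestHit and the Sphere constructor are hypothetical names invented here:

```haskell
import Data.List (minimumBy)
import Data.Maybe (mapMaybe)
import Data.Ord (comparing)

-- Toy stand-ins for the poster's Ray and Object types.
type Ray = ()
newtype Object = Sphere { sphereDist :: Double } deriving (Show, Eq)

-- Toy intersection test: every sphere is hit at its stored distance.
rayObject :: Ray -> Object -> Maybe Double
rayObject _ (Sphere d) = if d > 0 then Just d else Nothing

-- Hypothetical helper: safe minimum keyed by a projection.
minimumOnMay :: Ord b => (a -> b) -> [a] -> Maybe a
minimumOnMay _ [] = Nothing
minimumOnMay f xs = Just (minimumBy (comparing f) xs)

-- Pairing, filtering and projecting live in one place, so the
-- fst/snd plumbing never leaks into the caller.
closestHit :: Double -> Ray -> [Object] -> Maybe Object
closestHit maxD ray =
  fmap fst
    . minimumOnMay snd
    . filter ((< maxD) . snd)
    . mapMaybe (\o -> (,) o <$> rayObject ray o)
```

With these toy types, closestHit 10 () [Sphere 5, Sphere 2, Sphere 20] yields Just (Sphere 2); the tuples never escape closestHit.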

I feel like there is something in the lens library that would solve this, but unfortunately I'm not very familiar with that library yet. If anyone could show me how to do it, I'd much appreciate it!

submitted by hylomorphism
[link] [1 comment]
Categories: Incoming News

www.cse.unsw.edu.au

del.icio.us/haskell - Thu, 04/03/2014 - 6:50am
Categories: Offsite Blogs

Haskell application architecture best practices

Haskell on Reddit - Thu, 04/03/2014 - 6:24am

I've been playing around with Haskell in my spare time for a while now and have a reasonable grasp on how to implement algorithms and data structures idiomatically, but beyond this there seems to be a lack of good documentation and tutorials on how to actually architect larger systems in Haskell.

I've had great success with Scala and F# on some pretty complex systems, and I'd like to take the next step and start using Haskell in production as well - but this requires I cure my addiction to mutability and OO, and I'm not quite sure exactly how I should architect my applications. I can understand at a basic level how to use monads to help perform computations and maintain state at a localised level, but I'm not sure how to take this to the next step and actually architect a larger solution - i.e. something that maintains state across multiple separate modules with their own encapsulated state.

Are there any good guides that I've overlooked that can provide examples, tutorials, or questions that are focussed more on building real world applications (rather than mathematical algorithms or data structures)?

Edit: Thanks guys - some great advice!

submitted by zoomzoom83
[link] [23 comments]
Categories: Incoming News

NEED HELP!!!

Haskell on Reddit - Thu, 04/03/2014 - 5:36am

This assignment is driving me crazy!!! How do I solve it? http://cs.anu.edu.au/student/comp1100/1-Labs-Assignments.html

I'm a new student to programming. I have only been programming for seven weeks and this is due tomorrow. I'm stuck; I feel like an idiot. My only option is to hand in subpar work.

If you guys have any resources to how to do this, that would be great.

submitted by CanberraStudent
[link] [12 comments]
Categories: Incoming News