# News aggregator

### wren gayle romano: On being the "same" or "different": Introduction to Apartness

Meanwhile, back in math land... A couple-few months ago I was doing some work on apartness relations. In particular, I was looking into foundational issues, and into what an apartness-based (rather than equality-based) dependently-typed programming language would look like. Unfortunately, too many folks think "constructive mathematics" only means BHK-style intuitionistic logic— whereas constructive mathematics includes all sorts of other concepts, and they really should be better known!

So I started writing a preamble post, introducing the basic definitions and ideas behind apartnesses, and... well, I kinda got carried away. Instead of a blog post I kinda ended up with a short chapter. And then, well, panic struck. In the interests of Publish Ever, Publish Often, I thought I might as well share it: a brief introduction to apartness relations. As with my blog posts, I'm releasing it under Creative Commons Attribution-NonCommercial-NoDerivs 4.0; so feel free to share it and use it for classes. But, unlike the other columbicubiculomania files, it is not ShareAlike— since I may actually turn it into a published chapter someday. So do respect that. And if you have a book that needs some chapters on apartness relations, get in touch!

The intro goes a little something like this:

We often talk about values being "the same as" or "different from" one another. But how can we formalize these notions? In particular, how should we do so in a constructive setting?

Constructively, we lack a general axiom for double-negation elimination; therefore, every primitive notion gives rise to both strong (strictly positive) and weak (doubly-negated) propositions. Thus, from the denial of (weak) difference we can only conclude weak sameness. Consequently, in the constructive setting it is often desirable to take difference to be a primitive— so that, from the denial of strong difference we can in fact conclude strong sameness.
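
For reference (these axioms are standard in the constructive literature, though this excerpt does not spell them out), an apartness relation $\#$ is a binary relation satisfying

$$\neg(x \# x) \qquad x \# y \to y \# x \qquad x \# y \to \forall z.\ (x \# z) \lor (z \# y)$$

(irreflexivity, symmetry, and cotransitivity); the sameness induced by it is then $x = y \;:=\; \neg(x \# y)$.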

This ability to "un-negate" sameness is the principal reason for taking difference to be one of our primitive notions. While nice in and of itself, it also causes the strong and weak notions of sameness to become logically equivalent (thm 1.4), enabling us to drop the qualifiers when discussing sameness.

But if not being different is enough to be considered the same, then do we still need sameness to be primitive? To simplify our reasoning, we may wish to have sameness be *defined* as the lack of difference. However, this is not without complications. Sameness has been considered a natural primitive for so long that it has accrued many additional non-propositional properties (e.g., the substitution principle). So, if we eliminate the propositional notion of primitive equality, we will need somewhere else to hang those coats.

The rest of the paper fleshes out these various ideas.

### Clojure's Transducers are Perverse Lenses

/u/tel was playing around with a translation of Clojure's transducers to Haskell here. He introduced a type

```haskell
type Red r a = (r -> a -> r, r)
```

which reminded me of non-van Laarhoven lenses:

```haskell
type OldLens a b = (a -> b -> a, a -> b)
```

We can change tel's Red slightly:

```haskell
type Red r a = (r -> a -> r, () -> r)
```

From this point of view, Red is a perverse form of lens, because the "getter" always returns the same value, which is the value a normal lens would extract a value from! I think the modified "van Laarhoven form" of Red reads

```haskell
type PerverseLens r a = forall f. Functor f => (() -> f a) -> a -> f r
```

but I'm not sure. I suspect that you'll be able to use normal function composition with this encoding somehow, and it will compose "backwards" like lenses do. After about 15 minutes, I haven't gotten anywhere, but I'm a Haskell noob, so I'm curious if someone more experienced can make this work.
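
For intuition, tel's original Red is exactly the argument pair that foldl consumes. This little aside is my own, not from the thread:

```haskell
-- A Red bundles a step function with an initial accumulator; running it
-- over a list is just a left fold. (My illustration, not from the thread.)
type Red r a = (r -> a -> r, r)

runRed :: Red r a -> [a] -> r
runRed (step, z) = foldl step z
```

For example, `runRed ((+), 0) [1, 2, 3]` reduces to `6`.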

/u/tel also defined reducer transformers

```haskell
type RT r a b = PerverseLens r a -> PerverseLens r b
```

From the "perverse lens" point of view, I believe an RT would be equivalent to

```haskell
(. perverseGetter)
```

where a PerverseGetter is PerverseLens specialized to Const, in the same way Getter is Lens specialized to Const.

I'm not sure how helpful or useful any of this is, but it is interesting.

EDIT: Perhaps

```haskell
type Red r a = (r -> a -> r, (forall x. x -> r))

type PerverseLens r a = forall f. Functor f => (forall x. x -> f a) -> a -> f r
```

would be better types for perverse lenses?

submitted by kidnapster

### I Did A Haskell: fizzbuzz

I started the Learn You A Haskell tutorial, and as soon as I hit the boomBangs example I was like "ooh! ooh!" and had to do fizzbuzz. I have no idea if this is anywhere close to idiomatic (or whether there's any benefit to declaring the null case explicitly, here). First working version had fz and bz as functions in a where clause because I couldn't figure out how to inline them, which amuses me in retrospect because, ha ha, in lines -- anyway, I suspect that my original version is more proper, but dammit, I'm proud of myself. :D

Okay, done gushing at the internet for now, but this here thing is *neat.*
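
For readers who want to play along, here is a guess at the shape of a list-comprehension fizzbuzz in the spirit of LYAH's boomBangs example (a reconstruction for illustration; the poster's actual code was not included in this feed):

```haskell
-- Fizzbuzz via a list comprehension, boomBangs-style.
-- (Hypothetical reconstruction; not the poster's original code.)
fizzbuzz :: Int -> [String]
fizzbuzz n = [ pick x | x <- [1 .. n] ]
  where
    pick x
      | x `mod` 15 == 0 = "FizzBuzz"
      | x `mod` 3  == 0 = "Fizz"
      | x `mod` 5  == 0 = "Buzz"
      | otherwise       = show x
```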


### Oliver Charles: Working with postgresql-simple with generics-sop

The least interesting part of my job as a programmer is the act of pressing keys on a keyboard, and thus I actively seek ways to reduce typing. As programmers, we aim for reuse in our programs - abstracting commonality into reusable functions such that our programs get more concise. Functional programmers are aware of the benefits of higher-order functions as one form of generic programming, but another powerful technique is that of data type generic programming.

This variant of generic programming allows one to build programs that work over arbitrary data types, providing they have some sort of known “shape”. We describe the shape of data types by representing them via a code - often we can describe a data type as a sum of products. By sum, we are talking about the choice of a constructor in a data type (such as choosing between Left and Right to construct Either values), and by product we mean the individual fields in a constructor (such as the individual fields in a record).

Last month, Edsko and Löh announced a new library for generic programming: generics-sop. I’ve been playing with this library in the last couple of days, and I absolutely love the approach. In today’s short post, I want to demonstrate how easy it is to use this library. I don’t plan to go into a lot of detail, but I encourage interested readers to check out the associated paper - True Sums of Products - a paper with a lovely balance of theory and a plethora of examples.

#### postgresql-simple

When working with postgresql-simple, one often defines records and corresponding FromRow and ToRow instances. Let’s assume we’re modelling a library. No library is complete without books, so we might begin with a record such as:

```haskell
data Book = Book
  { bookTitle       :: Text
  , bookAuthor      :: Text
  , bookISBN        :: ISBN
  , bookPublishYear :: Int
  }
```

In order to store and retrieve these in our database, we need to write the following instances:

```haskell
instance FromRow Book where
  fromRow = Book <$> field <*> field <*> field <*> field

instance ToRow Book where
  toRow Book{..} =
    [ toField bookTitle
    , toField bookAuthor
    , toField bookISBN
    , toField bookPublishYear
    ]
```

As you can see - that’s a lot of boilerplate. In fact, it’s nearly twice as much code as the data type itself! The definitions of these instances are trivial, so it’s frustrating that I have to type the implementation bodies by hand. It’s here that we turn to generics-sop.

First, we’re going to need a bit of boiler-plate in order to manipulate Books generically:

```haskell
data Book = ...
  deriving (GHC.Generics.Generic)

instance Generics.SOP.Generic Book
```

We derive a generic representation of our Book using GHC.Generics, and in turn use this generic representation to derive the Generics.SOP.Generic instance. With this out of the way, we’re ready to work with Books in a generic manner.

#### generics-sop

The generics-sop library works by manipulating heterogeneous lists of data. If we look at our Book data type, we can see that the following two are morally describing the same data:

```haskell
book = Book "Conceptual Mathematics" "Lawvere, Schanuel" "978-0-521-71916-2" 2009

book = [ "Conceptual Mathematics", "Lawvere, Schanuel", "978-0-521-71916-2", 2009 ]
```

Of course, we can’t actually write such a thing in Haskell - lists are required to have all their elements of the same type. However, using modern GHC extensions, we can get very close to modelling this:

```haskell
data HList :: [*] -> * where
  Nil  :: HList '[]
  (:*) :: x -> HList xs -> HList (x ': xs)

book :: HList '[Text, Text, ISBN, Int]
book = "Conceptual Mathematics" :* "Lawvere, Schanuel" :* "978-0-521-71916-2" :* 2009 :* Nil
```

Once we begin working in this domain, a lot of the techniques we’re already familiar with continue fairly naturally. We can map over these lists, exploit their applicative functor-like structure, fold them, and so on.

generics-sop continues this trend, using kind polymorphism and a few other techniques to maximise generality. We can see what exactly is going on with generics-sop if we ask GHCi for the :kind! of Book’s generic Code:

```
> :kind! Code Book
Code Book = SOP I '[ '[ Text, Text, ISBN, Int ] ]
```

The list of fields is contained within another list of all possible constructors - as Book has only one constructor, there is only one element in the outer list.

#### FromRow, Generically

How does this help us solve the problem of our FromRow and ToRow instances? First, let’s think about what’s happening when we write instances of FromRow. Our Book data type has four fields, so we need to use field four times. field has side effects in the RowParser functor, so we sequence all of these calls using applicative syntax, finally applying the results to the Book constructor.

Now that we’ve broken the problem down, we’ll start by solving our first problem - calling field the correct number of times. Calling field means we need to have an instance of FromField for each field in a constructor, so to enforce this, we can use All to require all fields have an instance of a type class. We also use a little trick with Proxy to specify which type class we need to use. We combine all of this with hcpure, which is a variant of pure that can be used to build a product:

```haskell
fields :: (All FromField xs, SingI xs) => NP RowParser xs
fields = hcpure fromField field
  where fromField = Proxy :: Proxy FromField
```

So far, we’ve built a product of field calls, which you can think of as being a list of RowParsers - something akin to [RowParser ..]. However, we need a single row parser returning multiple values, which is more like RowParser [..]. In the Prelude we have a function to sequence a list of monadic actions:

```haskell
sequence :: Monad m => [m a] -> m [a]
```

There is an equivalent in generics-sop for working with heterogeneous lists - hsequence. Thus if we hsequence our fields, we build a single RowParser that returns a product of values:

```haskell
fields :: (All FromField xs, SingI xs) => RowParser (NP I xs)
fields = hsequence (hcpure fromField field)
  where fromField = Proxy :: Proxy FromField
```

(I is the “do nothing” identity functor).

Remarkably, these few lines of code are enough to construct data types. All we need to do is embed this product in a constructor of a sum, and then switch from the generic representation to a concrete data type. We’ll restrict ourselves to data types that have only one constructor, and this constraint is mentioned in the type below (Code a ~ '[ xs ] forces a to have only one constructor):

```haskell
gfromRow :: (All FromField xs, Code a ~ '[xs], SingI xs, Generic a) => RowParser a
gfromRow = to . SOP . Z <$> hsequence (hcpure fromField field)
  where fromField = Proxy :: Proxy FromField
```

That’s all there is to it! No type class instances, no skipping over meta-data - we just build a list of field calls, sequence them, and turn the result into our data type.

#### ToRow, Generically

It’s not hard to apply the same ideas for ToRow. Recall the definition of ToRow:

```haskell
class ToRow a where
  toRow :: a -> [Action]
```

toRow takes a value of type a and turns it into a list of actions. Usually, we have one action for each field - we just call toField on each field in the record.

To work with data generically, we first need to move from the original data type to its generic representation, which we can do with from and a little bit of pattern matching:

```haskell
gtoRow :: (Generic a, Code a ~ '[xs]) => a -> [Action]
gtoRow a = case from a of SOP (Z xs) -> _
```

Here we pattern match into the fields of the first constructor of the data type. xs is now a product of all fields, and we can begin turning them into Actions. The most natural way to do this is simply to map toField over each field, collecting the resulting Actions into a list. That is, we’d like to do:

```haskell
map toField xs
```

That’s not quite possible in generics-sop, but we can get very close. Using hcliftA, we can lift a method of a type class over a heterogeneous list:

```haskell
gtoRow :: (Generic a, Code a ~ '[xs], All ToField xs, SingI xs) => a -> [Action]
gtoRow a = case from a of SOP (Z xs) -> _ (hcliftA toFieldP (K . toField . unI) xs)
  where toFieldP = Proxy :: Proxy ToField
```

We unwrap from the identity functor I, call toField on the value, and then pack this back up using the constant functor K. The details here are a little subtle, but essentially this moves us from a heterogeneous list to a homogeneous list, where each element of the list is an Action. Now that we have a homogeneous list, we can switch back to a more basic representation by collapsing the structure with hcollapse:

```haskell
gtoRow :: (Generic a, Code a ~ '[xs], All ToField xs, SingI xs) => a -> [Action]
gtoRow a = case from a of SOP (Z xs) -> hcollapse (hcliftA toFieldP (K . toField . unI) xs)
  where toFieldP = Proxy :: Proxy ToField
```

Admittedly this definition is a little more complicated than one might hope, but it’s still extremely concise and declarative - there’s only a little bit of noise added. However, again we should note there was no need to write type class instances, perform explicit recursion, or deal with meta-data - generics-sop stayed out of the way and gave us just what we needed.

#### Conclusion

Now that we have gfromRow and gtoRow it’s easy to extend our application. Perhaps we now want to extend our database with Author objects. We’re now free to do so, with minimal boilerplate:

```haskell
data Book = Book
  { bookId          :: Int
  , bookTitle       :: Text
  , bookAuthorId    :: Int
  , bookISBN        :: ISBN
  , bookPublishYear :: Int
  } deriving (GHC.Generics.Generic)

instance Generics.SOP.Generic Book
instance FromRow Book where fromRow = gfromRow
instance ToRow Book where toRow = gtoRow

data Author = Author
  { authorId      :: Int
  , authorName    :: Text
  , authorCountry :: Country
  } deriving (GHC.Generics.Generic)

instance Generics.SOP.Generic Author
instance FromRow Author where fromRow = gfromRow
instance ToRow Author where toRow = gtoRow
```

generics-sop is a powerful library for dealing with data generically. By using heterogeneous lists, the techniques we’ve learnt at the value level naturally extend, and we can begin to think of working with generic data in a declarative manner. For me, this appeal to familiar techniques makes it easy to dive straight into writing generic functions - I’ve already spent time learning to think in maps and folds, and it’s nice to see the ideas transfer to yet another problem domain.

generics-sop goes a lot further than we’ve seen in this post. For more real-world examples, see the links at the top of the generics-sop Hackage page.

### cabal repl failing silently on missing exposed-modules

### How would you make a falling sand game in Haskell?

Background: a falling sand game is this. Basically, various types of particles have very naive physics.

So basically, the usual/imperative approach would be to have a list of particles and update them one at a time. For speed, there would be a 2d-array to quickly look up if there is a particle at a position.

It seems like there should be a more idiomatic approach in Haskell. I've thought about it, but even something as simple as abstracting the looping construct seems tricky.
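
One direction an answer might take (my own sketch, not from the thread): represent the world as an immutable map from positions to particles, and compute each frame as a pure function of the previous frame, so there is no in-place update at all.

```haskell
import qualified Data.Map.Strict as M

type Pos   = (Int, Int)              -- (column, row); larger row = further down
data Cell  = Sand | Wall deriving (Eq, Show)
type World = M.Map Pos Cell

-- One naive physics step: sand falls one cell when the cell below it
-- (in the *previous* frame) is empty; everything else stays put.
step :: World -> World
step w = M.foldrWithKey move M.empty w
  where
    move (x, y) Sand acc
      | M.notMember (x, y + 1) w = M.insert (x, y + 1) Sand acc
    move p c acc = M.insert p c acc
```

Since sand only moves straight down here, no two grains can land on the same cell in one step; diagonal sliding, water, and the rest would need explicit collision handling, which is where the problem gets interesting.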

submitted by tailcalled

### wren gayle romano: Imagine that this is not an academic debate

A followup to my previous [reddit version]:

The examples are of limited utility. The problem is not a few bad apples or a few bad words; were that the case it would be easier to address. The problem is a subtle one: it's in the tone and tenor of conversation, it's in the things *not* talked about, in the implicitization of assumptions, and in a decentering of the sorts of communities of engagement that Haskell was founded on.

Back in 2003 and 2005, communities like Haskell Cafe were communities of praxis. That is, we gathered because we do Haskell, and our gathering was a way to meet others who do Haskell. Our discussions were centered on this praxis and on how we could improve our own doing of Haskell. Naturally, as a place of learning it was also a place of teaching— but teaching was never the goal, teaching was a necessary means to the end of improving our own understandings of being lazy with class. The assumptions implicit in the community at the time were that Haskell was a path to explore, and an obscure one at that. It is not The Way™ by any stretch of the imagination. And being a small community it was easy to know every person in it, to converse as you would with a friend not as you would online.

Over time the tone and nature of the Cafe changed considerably. It's hard to explain the shift without overly praising the way things were before or overly condemning the shift. Whereas the Cafe used to be a place for people to encounter one another on their solitary journeys, in time it became less of a resting stop (or dare I say: cafe) and more of a meeting hall. No longer a place to meet those who do Haskell, but rather a place for a certain communal doing of Haskell. I single the Cafe out only because I have the longest history with that community, but the same overall shift has occurred everywhere I've seen. Whereas previously it was a community of praxis, now it is more a community of educationalism. In the public spaces there is more teaching of Haskell than doing of it. There's nothing wrong with teaching, but when teaching becomes the thing-being-done rather than a means to an end, it twists the message. It's no longer people asking for help and receiving personal guidance, it's offering up half-baked monad tutorials to the faceless masses. And from tutorialization it's a very short path to proselytizing and evangelizing. And this weaponization of knowledge always serves to marginalize and exclude very specific voices from the community.

One class of voices being excluded is women. To see an example of this, consider the response to Doaitse Swierstra's comment at the 2012 Haskell Symposium. Stop thinking about the comment. The comment is not the point. The point is, once the problematic nature of the comment was raised, how did the community respond? If you want a specific example, this is it. The example is not in what Swierstra said, the example is in how the Haskell community responded to being called out. If you don't recall how this went down, here's the reddit version; though it's worth pointing out that there were many other conversations outside of reddit. A *very* small number of people acquitted themselves well. A handful of people knew how to speak the party line but flubbed it by mansplaining, engaging in flamewars, or allowing the conversation to be derailed. And a great many people were showing their asses all over the place. Now I want you to go through and read every single comment there, including the ones below threshold. I want you to read those comments and imagine that this is not an academic debate. Imagine that this is *your* life. Imagine that *you* are the unnamed party under discussion. That *your* feelings are the ones everyone thinks they know so much about. That you personally are the one each commenter is accusing of overreacting. Imagine that you are a woman, that you are walking down the street in the middle of the night in an unfamiliar town after a long day of talks. It was raining earlier so the streets are wet. You're probably wearing flats, but your feet still hurt. You're tired. Perhaps you had a drink over dinner with other conference-goers, or perhaps not. Reading each comment, before going on to the next one, stop and ask yourself: would *you* feel safe if this commenter decided to follow you home on that darkened street? Do you feel like this person can comprehend that you are a human being on that wet street? 
Do you trust this person's intentions in being around you late at night? And ask yourself, when some other commenter on that thread follows you home at night and rapes you in the hotel, do you feel safe going to the author of this comment to tell them what happened? Because none of this is academic. As a woman you go to conferences and this is how you are treated. And the metric of whether you can be around someone is not whether they seem interesting or smart or anything else, the metric is: do you feel safe? If you can understand anything about what this is like, then reading that thread will make you extremely uncomfortable. The problem is not that some person makes a comment. The problem is that masculinized communities are not safe for women. The problem is that certain modes of interaction are actively hostile to certain participants. The problem is finding yourself in an uncomfortable situation and knowing that noone has your back. Knowing that anyone who agrees with you will remain silent because they do not think you are worth the time and energy to bother supporting. Because that's what silence says. Silence says you are not worth it. Silence says you are not one of us. Silence says I do not think you are entirely human. And for all the upvotes and all the conversation my previous comment has sparked on twitter, irc, and elsewhere, I sure don't hear anyone *here* speaking up to say they got my back.

This is not a problem about women in Haskell. Women are just the go-to example, the example cis het middle-class educated able white men are used to engaging. Countless voices are excluded by the current atmosphere in Haskell communities. I know they are excluded because I personally watched them walk out the door after incidents like the one above, and I've been watching them leave for a decade. I'm in various communities for queer programmers, and many of the folks there use Haskell but none of them will come within ten feet of "official" Haskell communities. That aversion is even stronger in the transgender/genderqueer community. I personally know at least a dozen trans Haskellers, but I'm the only one who participates in the "official" Haskell community. Last fall I got hatemail from Haskellers for bringing up the violence against trans women of color on my blog, since that blog is syndicated to Planet Haskell. Again, when I brought this up, people would express their dismay in private conversations, but noone would say a damn thing in public nor even acknowledge that I had spoken. Ours has never been a great community for people of color, and when I talk to POC about Haskell I do not even consider directing them to the "official" channels. When Ken Shan gave the program chair report at the Haskell symposium last year, there was a similarly unwholesome response as with Swierstra's comment the year before. A number of people have shared their experiences in response to Ken's call, but overwhelmingly people feel like their stories of being marginalized and excluded "don't count" or "aren't enough to mention". Stop. Think about that. A lot of people are coming forward to talk about how they've been made to feel uncomfortable, and *while telling those stories* they feel the need to qualify. 
While actively explaining their own experiences of racism, sexism, heterosexism, cissexism, ablism, sanism, etc, they feel the simultaneous need to point out that these experiences are not out of the ordinary. Experiencing bigotry is so within the ordinary that people feel like they're being a bother to even mention it. This is what I'm talking about. This is what I mean when I say that there is a growing miasma in our community. This is how racism and sexism and ablism work. It's not smacking someone on the ass or using the N-word. It's a pervasive and insidious tone in the conversation, a thousand and one not-so-subtle clues about who gets to be included and who doesn't. And yes the sexual assaults and slurs and all that factor in, but that's the marzipan on top of the cake. The cake is made out of assuming someone who dresses "like a rapper" can't be a hacker. The cake is made out of assuming that "mother" and "professional" are exclusive categories. The cake is made out of well-actuallys and feigned surprise. And it works this way because this is how it avoids being called into question. So when you ask for specific examples you're missing the point. I can give examples, but doing so only contributes to the errant belief that bigotry happens in moments. Bigotry is not a moment. Bigotry is a sustained state of being that permeates one's actions and how one forms and engages with community. So knowing about that hatemail, or knowing about when I had to call someone out for sharing titty pictures on Haskell Cafe, or knowing about the formation of #nothaskell, or knowing about how tepid the response to Tim's article or Ken's report were, knowing about none of these specifics helps to engage with the actual problem.


### How do you apply your knowledge of Haskell to popular OO languages?

I'm just curious; I assume most of us don't have day jobs/typically work directly in Haskell. How much of your Haskell knowledge transfers effectively to your work, particularly if you're using a popular Object-Oriented language like Ruby or Java?

In my case, I've recognized major abstractions when using Javascript (the engine of one of my main projects at work is essentially a bunch of cokleisli composition), but JS has first-class functions which make this kind of thing easy(ish) to recognize and work with.
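
For readers unfamiliar with the term, "cokleisli composition" is composition of functions *out of* a comonad. A minimal self-contained Haskell sketch (my illustration, not the poster's JavaScript):

```haskell
-- A tiny "environment" comonad, with cokleisli composition spelled out.
-- (Illustrative only; the real machinery lives in the comonad package.)
data Env e a = Env e a

-- extend turns a whole-context function into a context transformer.
extendE :: (Env e a -> b) -> Env e a -> Env e b
extendE f w@(Env e _) = Env e (f w)

-- Cokleisli composition: run f over the context, then g over the result.
(=<=) :: (Env e b -> c) -> (Env e a -> b) -> (Env e a -> c)
g =<= f = g . extendE f
```

With `f (Env e a) = e + a` and `g (Env e b) = e * b`, `(g =<= f) (Env 10 3)` gives `10 * (10 + 3) = 130`.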

What kind of things have you learned in Haskell that help you out with working in other (large) systems written heavily in non-functional languages?

submitted by 5outh

### Are there any new developments in the last seven years that would change the advice given in '8 ways to report errors in Haskell'?

### Philip Wadler: An Open Letter to Alex Salmond

It's not working.

There may be reasons to stick to the story that all will be smooth sailing in the event that Scotland becomes independent, but everyone knows it is not true.

Last night's debate makes it clear that refusal to discuss Plan B for currency hurts more than it helps. The same could be said for how negotiations will proceed over Trident, the EU, Nato, pensions, education, and the economy.

You took too long pressing Darling to admit that Scotland could succeed as an independent country, which you did to show that he too has issues with the truth. Last night's audience failed to get a straight answer from either side, and that led to their verdict: the debate was a disappointment.

It's time for an about face. Tell the truth. Independence will be stormy. The transition will not be easy. We cannot be certain of the outcome, save that it will put Scotland in a position to make its own decisions.

The future is indefinite, whether we opt for independence or no. The probable outcome if Scotland remains in the UK is growing social injustice, austerity for the poor and more for the 1%, bedroom tax, referenda on whether to stay in the EU. You made these points last night, but they were overshadowed by your inability to admit that independence has risks too.

Truth will be refreshing. It will knock all loose and reinvigorate the debate. It may restore part of Scottish people's faith in the political process, something we sorely need regardless of which side wins in September.

If you stick to your current strategy, polls make it clear that No will win. It's time for Plan B.

Yours aye, -- P

### Yesod Web Framework: Announcing auto-update

Kazu and I are happy to announce the first release of auto-update, a library to run update actions on a given schedule. To make it more concrete, let's start with a motivating example.

Suppose you're writing a web service which will return the current time. This is simple enough with WAI and Warp, e.g.:

```haskell
{-# LANGUAGE OverloadedStrings #-}
import Data.ByteString.Lazy.Char8 (pack)
import Data.Time (formatTime, getCurrentTime)
import Network.HTTP.Types (status200)
import Network.Wai (responseLBS)
import Network.Wai.Handler.Warp (run)
import System.Locale (defaultTimeLocale)

main :: IO ()
main = run 3000 app
  where
    app _ respond = do
        now <- getCurrentTime
        respond $ responseLBS status200 [("Content-Type", "text/plain")]
                $ pack $ formatTime defaultTimeLocale "%c" now
```

This is all well and good, but it's a bit inefficient. Imagine if you have a thousand requests per second (some people *really* like to know what time it is). We will end up recalculating the string representation of the time 999 more times than necessary! To work around this, we have a simple solution: spawn a worker thread to calculate the time once per second. (Note: it will actually calculate it slightly less often than once per second due to the way threadDelay works; we're assuming we have a little bit of latitude in returning a value that's a few milliseconds off.)
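That worker-thread approach can be sketched in a few lines, using an IORef as the shared cell (`mkCachedTime` is a hypothetical helper for illustration, not part of any library):

```haskell
import Control.Concurrent (forkIO, threadDelay)
import Control.Monad (forever)
import Data.IORef (newIORef, readIORef, writeIORef)
import Data.Time (getCurrentTime)

-- Spawn a thread that refreshes a cached rendering of the current time
-- roughly once per second; readers just read the IORef.
mkCachedTime :: IO (IO String)
mkCachedTime = do
    ref <- newIORef . show =<< getCurrentTime
    _ <- forkIO $ forever $ do
        threadDelay 1000000  -- about one second, plus scheduling slop
        writeIORef ref . show =<< getCurrentTime
    return (readIORef ref)
```

The WAI handler would then call the returned action instead of getCurrentTime on every request.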

Now we will calculate the current time once per second, which is far more efficient... right? Well, it depends on server load. Previously, we talked about a server getting a thousand requests per second. Let's instead reverse it: a server that gets one request every thousand seconds. In that case, our optimization turns into a pessimization.

This problem doesn't just affect getting the current time. Another example is flushing logs. A hot web server could be crippled by flushing logs to disk on every request, whereas flushing once a second on a less popular server simply keeps the process running for no reason. One option is to put the power in the hands of users of a library to decide how often to flush. But often, we won't know until runtime how frequently a service will be requested. Or even more complicated: traffic will come in spikes, with both busy and idle times.

(Note that I've only given examples of running web servers, though I'm certain there are plenty of other examples out there to draw from.)

This is the problem that auto-update comes to solve. With auto-update, you declare an update function, a frequency with which it should run, and a threshold at which it should "daemonize". The first few times you request a value, it's calculated in the main thread. Once you cross the daemonize threshold, a dedicated worker thread is spawned to recalculate the value. If the value is not requested during an update period, the worker thread is shut down, and we go back to the beginning.

Let's see how our running example works out with this:

```haskell
{-# LANGUAGE OverloadedStrings #-}
import Control.AutoUpdate (defaultUpdateSettings, mkAutoUpdate, updateAction)
import Data.ByteString.Lazy.Char8 (ByteString, pack)
import Data.Time (formatTime, getCurrentTime)
import Network.HTTP.Types (status200)
import Network.Wai (responseLBS)
import Network.Wai.Handler.Warp (run)
import System.Locale (defaultTimeLocale)

getCurrentTimeString :: IO ByteString
getCurrentTimeString = do
    now <- getCurrentTime
    return $ pack $ formatTime defaultTimeLocale "%c" now

main :: IO ()
main = do
    getTime <- mkAutoUpdate defaultUpdateSettings
        { updateAction = getCurrentTimeString
        }
    run 3000 (app getTime)
  where
    app getTime _ respond = do
        time <- getTime
        respond $ responseLBS status200 [("Content-Type", "text/plain")] time
```

If you want to see the impact of this change, add a putStrLn call to getCurrentTimeString and make a bunch of requests to the service. You should see just one recalculation per second, once you get past the initial threshold period (default of 3).

Kazu and I have started using this library in a few places:

- fast-logger no longer requires explicit flushing; it's handled for you automatically.
- wai-logger and wai-extra's request logger, by extension, inherit this functionality.
- Warp no longer has a dedicated thread for getting the current time.
- The Yesod scaffolding was able to get rid of an annoying bit of commentary.

Hopefully others will enjoy and use this library as well.

#### Control.Reaper

The second module in auto-update is Control.Reaper. This provides something similar to, but slightly different from, Control.AutoUpdate. The goal is to spawn reaper/cleanup threads on demand. These threads can handle such things as:

- Recycling resources in a resource pool.
- Closing out unused connections in a connection pool.
- Terminating threads that have overstayed a timeout.

This module is currently being used in Warp for slowloris timeouts and file descriptor cache management, though I will likely use it in http-client in the near future as well for its connection manager management.
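The pattern Control.Reaper captures can be sketched with nothing but the standard library: a shared workload, and a cleanup thread that is spawned only when work arrives and exits when its list drains (illustrative code, not the library's actual API):

```haskell
import Control.Concurrent (forkIO, threadDelay)
import Control.Monad (unless, void)
import Data.IORef (IORef, atomicModifyIORef', newIORef, readIORef)

-- A reaper owns a shared workload; its thread exists only while
-- there is something left to clean up.
data Reaper a = Reaper
    { reaperItems :: IORef [a]
    , reaperAlive :: IORef Bool
    }

newReaper :: IO (Reaper a)
newReaper = Reaper <$> newIORef [] <*> newIORef False

-- Add an item, spawning the cleanup thread if it is not already running.
reaperAdd :: (a -> Bool) -> Reaper a -> a -> IO ()
reaperAdd isStale r x = do
    atomicModifyIORef' (reaperItems r) (\xs -> (x : xs, ()))
    alreadyAlive <- atomicModifyIORef' (reaperAlive r) (\alive -> (True, alive))
    unless alreadyAlive $ void $ forkIO $ reaperLoop isStale r

-- Sweep periodically; once the workload drains, mark ourselves dead and exit.
reaperLoop :: (a -> Bool) -> Reaper a -> IO ()
reaperLoop isStale r = do
    threadDelay 100000  -- 0.1s between sweeps
    remaining <- atomicModifyIORef' (reaperItems r)
        (\xs -> let keep = filter (not . isStale) xs in (keep, keep))
    if null remaining
        then atomicModifyIORef' (reaperAlive r) (const (False, ()))
        else reaperLoop isStale r
```

The `isStale` predicate stands in for whatever check a real pool would make (idle timeout expired, connection closed, and so on).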

### Dominic Steinitz: Fun with (Kalman) Filters Part II

Suppose we have a particle moving at constant velocity in 1 dimension, where the velocity is sampled from a distribution. We can observe the position of the particle at fixed intervals and we wish to estimate its initial velocity. For generality, let us assume that the positions and the velocities can be perturbed at each interval and that our measurements are noisy.

A point of Haskell interest: using type level literals caught a bug in the mathematical description (one of the dimensions of a matrix was incorrect). Of course, this would have become apparent at run-time but proof checking of this nature is surely the future for mathematicians. One could conceive of writing an implementation of an algorithm or proof, compiling it but never actually running it purely to check that some aspects of the algorithm or proof are correct.
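To see how type-level dimensions catch such bugs, here is a toy phantom-typed matrix (illustrative only, not the hmatrix API): multiplication only type-checks when the inner dimensions agree, so a mixed-up matrix dimension is a compile-time error rather than a run-time one.

```haskell
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE KindSignatures #-}

import Data.List (transpose)
import GHC.TypeLits (Nat)

-- A matrix carrying its dimensions as phantom type-level naturals.
newtype Mat (m :: Nat) (n :: Nat) = Mat [[Double]]

-- The inner dimensions must agree: Mat m n times Mat n p gives Mat m p.
matMul :: Mat m n -> Mat n p -> Mat m p
matMul (Mat a) (Mat b) =
    Mat [ [ sum (zipWith (*) row col) | col <- transpose b ] | row <- a ]

good :: Mat 1 1
good = matMul (Mat [[1, 2]] :: Mat 1 2) (Mat [[3], [4]] :: Mat 2 1)
-- matMul (Mat [[1, 2]] :: Mat 1 2) (Mat [[3, 4]] :: Mat 1 2) is rejected
-- by the type checker, which is exactly the kind of bug described above.
```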

#### The Mathematical Model

We take the position as $x_i$ and the velocity as $v_i$:

$$
\begin{aligned}
x_i &= x_{i-1} + \Delta T\, v_{i-1} + \psi^{(x)}_i \\
v_i &= v_{i-1} + \psi^{(v)}_i \\
y_i &= a x_i + \upsilon_i
\end{aligned}
$$

where $\psi^{(x)}_i$, $\psi^{(v)}_i$ and $\upsilon_i$ are all IID normal with means of 0 and variances of $\sigma^2_x$, $\sigma^2_v$ and $\sigma^2_y$.

We can re-write this as

$$
\begin{aligned}
\mathbf{x}_i &= A \mathbf{x}_{i-1} + \boldsymbol{\psi}_i \\
\mathbf{y}_i &= H \mathbf{x}_i + \boldsymbol{\upsilon}_i
\end{aligned}
$$

where

$$
\mathbf{x}_i = \begin{pmatrix} x_i \\ v_i \end{pmatrix}, \quad
A = \begin{pmatrix} 1 & \Delta T \\ 0 & 1 \end{pmatrix}, \quad
H = \begin{pmatrix} a & 0 \end{pmatrix}, \quad
\boldsymbol{\psi}_i \sim \mathcal{N}(\mathbf{0}, \Sigma_X), \quad
\boldsymbol{\upsilon}_i \sim \mathcal{N}(\mathbf{0}, \Sigma_Y)
$$

Let us denote the mean and variance of $\mathbf{x}_i$ given the observations $\mathbf{y}_1, \ldots, \mathbf{y}_{i-1}$ as $\hat{\mathbf{x}}^\flat_i$ and $\hat{\Sigma}^\flat_i$ respectively, and note that

$$
\mathbf{x}_i \mid \mathbf{y}_1, \ldots, \mathbf{y}_{i-1} \sim \mathcal{N}(\hat{\mathbf{x}}^\flat_i, \hat{\Sigma}^\flat_i)
$$

Since $\mathbf{x}_i$ and $\mathbf{y}_i$ are jointly Gaussian and recalling that $\Sigma^\top = \Sigma$ as covariance matrices are symmetric, we can calculate their mean and covariance matrix as

$$
\begin{pmatrix} \mathbf{x}_i \\ \mathbf{y}_i \end{pmatrix} \sim
\mathcal{N}\left(
\begin{pmatrix} \hat{\mathbf{x}}^\flat_i \\ H \hat{\mathbf{x}}^\flat_i \end{pmatrix},
\begin{pmatrix} \hat{\Sigma}^\flat_i & \hat{\Sigma}^\flat_i H^\top \\ H \hat{\Sigma}^\flat_i & H \hat{\Sigma}^\flat_i H^\top + \Sigma_Y \end{pmatrix}
\right)
$$

We can now use standard formulæ which say that if

$$
\begin{pmatrix} \mathbf{u} \\ \mathbf{w} \end{pmatrix} \sim
\mathcal{N}\left(
\begin{pmatrix} \boldsymbol{\mu}_u \\ \boldsymbol{\mu}_w \end{pmatrix},
\begin{pmatrix} \Sigma_{uu} & \Sigma_{uw} \\ \Sigma_{uw}^\top & \Sigma_{ww} \end{pmatrix}
\right)
$$

then

$$
\mathbf{u} \mid \mathbf{w} \sim
\mathcal{N}\left(
\boldsymbol{\mu}_u + \Sigma_{uw} \Sigma_{ww}^{-1} (\mathbf{w} - \boldsymbol{\mu}_w),\;
\Sigma_{uu} - \Sigma_{uw} \Sigma_{ww}^{-1} \Sigma_{uw}^\top
\right)
$$

and apply this to the joint distribution of $\mathbf{x}_i$ and $\mathbf{y}_i$ above to give the distribution of $\mathbf{x}_i$ conditional on $\mathbf{y}_i$. This is called the measurement update; more explicitly

$$
\begin{aligned}
\hat{\mathbf{x}}_i &= \hat{\mathbf{x}}^\flat_i + \hat{\Sigma}^\flat_i H^\top \left( H \hat{\Sigma}^\flat_i H^\top + \Sigma_Y \right)^{-1} \left( \mathbf{y}_i - H \hat{\mathbf{x}}^\flat_i \right) \\
\hat{\Sigma}_i &= \hat{\Sigma}^\flat_i - \hat{\Sigma}^\flat_i H^\top \left( H \hat{\Sigma}^\flat_i H^\top + \Sigma_Y \right)^{-1} H \hat{\Sigma}^\flat_i
\end{aligned}
$$

Sometimes the measurement residual $\mathbf{v}_i$, the measurement prediction covariance $S_i$ and the filter gain $K_i$ are defined

$$
\begin{aligned}
\mathbf{v}_i &= \mathbf{y}_i - H \hat{\mathbf{x}}^\flat_i \\
S_i &= H \hat{\Sigma}^\flat_i H^\top + \Sigma_Y \\
K_i &= \hat{\Sigma}^\flat_i H^\top S_i^{-1}
\end{aligned}
$$

and the measurement update is written as

$$
\begin{aligned}
\hat{\mathbf{x}}_i &= \hat{\mathbf{x}}^\flat_i + K_i \mathbf{v}_i \\
\hat{\Sigma}_i &= \hat{\Sigma}^\flat_i - K_i S_i K_i^\top
\end{aligned}
$$

We further have that

$$
\mathbf{x}_{i+1} = A \mathbf{x}_i + \boldsymbol{\psi}_{i+1}
$$

We thus obtain the Kalman filter prediction step:

$$
\begin{aligned}
\hat{\mathbf{x}}^\flat_{i+1} &= A \hat{\mathbf{x}}_i \\
\hat{\Sigma}^\flat_{i+1} &= A \hat{\Sigma}_i A^\top + \Sigma_X
\end{aligned}
$$

Further information can be found in (Boyd 2008), (Kleeman 1996) and (Särkkä 2013).

#### A Haskell Implementation

The hmatrix library now uses type-level literals via the DataKinds extension in ghc to enforce compatibility of matrix and vector operations at the type level. See here for more details. Sadly a bug in the hmatrix implementation means we can't currently use this excellent feature, so we content ourselves with comments describing what the types would be were it possible to use it.

```haskell
{-# OPTIONS_GHC -Wall #-}
{-# OPTIONS_GHC -fno-warn-name-shadowing #-}
{-# OPTIONS_GHC -fno-warn-type-defaults #-}
{-# OPTIONS_GHC -fno-warn-unused-do-bind #-}
{-# OPTIONS_GHC -fno-warn-missing-methods #-}
{-# OPTIONS_GHC -fno-warn-orphans #-}

{-# LANGUAGE DataKinds #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE RankNTypes #-}

module FunWithKalmanPart1a where

import Numeric.LinearAlgebra.HMatrix hiding ( outer )
import Data.Random.Source.PureMT
import Data.Random hiding ( gamma )
import Control.Monad.State
import qualified Control.Monad.Writer as W
import Control.Monad.Loops
```

Let us make our model almost deterministic but with noisy observations.

```haskell
stateVariance :: Double
stateVariance = 1e-6

obsVariance :: Double
obsVariance = 1.0
```

And let us start with a prior normal distribution with a mean position and velocity of 0, with moderate variances and no correlation.

```haskell
-- muPrior :: R 2
muPrior :: Vector Double
muPrior = vector [0.0, 0.0]

-- sigmaPrior :: Sq 2
sigmaPrior :: Matrix Double
sigmaPrior = (2 >< 2) [ 1e1, 0.0
                      , 0.0, 1e1
                      ]
```

We now set up the parameters for our model as outlined in the preceding section.

```haskell
deltaT :: Double
deltaT = 0.001

-- bigA :: Sq 2
bigA :: Matrix Double
bigA = (2 >< 2) [ 1, deltaT
                , 0, 1
                ]

a :: Double
a = 1.0

-- bigH :: L 1 2
bigH :: Matrix Double
bigH = (1 >< 2) [ a, 0 ]

-- bigSigmaY :: Sq 1
bigSigmaY :: Matrix Double
bigSigmaY = (1 >< 1) [ obsVariance ]

-- bigSigmaX :: Sq 2
bigSigmaX :: Matrix Double
bigSigmaX = (2 >< 2) [ stateVariance, 0.0
                     , 0.0, stateVariance
                     ]
```

The implementation of the Kalman filter using the hmatrix package is straightforward.

```haskell
-- outer :: forall m n . (KnownNat m, KnownNat n) =>
--          R n -> Sq n -> L m n -> Sq m -> Sq n -> Sq n -> [R m] -> [(R n, Sq n)]
outer :: Vector Double
      -> Matrix Double
      -> Matrix Double
      -> Matrix Double
      -> Matrix Double
      -> Matrix Double
      -> [Vector Double]
      -> [(Vector Double, Matrix Double)]
outer muPrior sigmaPrior bigH bigSigmaY bigA bigSigmaX ys = result
  where
    result = scanl update (muPrior, sigmaPrior) ys

    -- update :: (R n, Sq n) -> R m -> (R n, Sq n)
    update (xHatFlat, bigSigmaHatFlat) y =
        (xHatFlatNew, bigSigmaHatFlatNew)
      where
        -- v :: R m
        v = y - bigH #> xHatFlat
        -- bigS :: Sq m
        bigS = bigH <> bigSigmaHatFlat <> (tr bigH) + bigSigmaY
        -- bigK :: L n m
        bigK = bigSigmaHatFlat <> (tr bigH) <> (inv bigS)
        -- xHat :: R n
        xHat = xHatFlat + bigK #> v
        -- bigSigmaHat :: Sq n
        bigSigmaHat = bigSigmaHatFlat - bigK <> bigS <> (tr bigK)
        -- xHatFlatNew :: R n
        xHatFlatNew = bigA #> xHat
        -- bigSigmaHatFlatNew :: Sq n
        bigSigmaHatFlatNew = bigA <> bigSigmaHat <> (tr bigA) + bigSigmaX
```

We create some random data using our model parameters.

```haskell
singleSample :: (Double, Double)
             -> RVarT (W.Writer [(Double, (Double, Double))]) (Double, Double)
singleSample (xPrev, vPrev) = do
    psiX <- rvarT (Normal 0.0 stateVariance)
    let xNew = xPrev + deltaT * vPrev + psiX
    psiV <- rvarT (Normal 0.0 stateVariance)
    let vNew = vPrev + psiV
    upsilon <- rvarT (Normal 0.0 obsVariance)
    let y = a * xNew + upsilon
    lift $ W.tell [(y, (xNew, vNew))]
    return (xNew, vNew)

streamSample :: RVarT (W.Writer [(Double, (Double, Double))]) (Double, Double)
streamSample = iterateM_ singleSample (1.0, 1.0)

samples :: ((Double, Double), [(Double, (Double, Double))])
samples = W.runWriter (evalStateT (sample streamSample) (pureMT 2))
```

Here are the actual values of the randomly generated positions.

```haskell
actualXs :: [Double]
actualXs = map (fst . snd) $ take nObs $ snd samples

test :: [(Vector Double, Matrix Double)]
test = outer muPrior sigmaPrior bigH bigSigmaY bigA bigSigmaX
             (map (\x -> vector [x]) $ map fst $ snd samples)
```

And using the Kalman filter we can estimate the positions.

```haskell
estXs :: [Double]
estXs = map (!!0) $ map toList $ map fst $ take nObs test

nObs :: Int
nObs = 1000
```

And we can see that the estimates track the actual positions quite nicely.

Of course we really wanted to estimate the velocity.

```haskell
actualVs :: [Double]
actualVs = map (snd . snd) $ take nObs $ snd samples

estVs :: [Double]
estVs = map (!!1) $ map toList $ map fst $ take nObs test
```

#### Bibliography

Boyd, Stephen. 2008. “EE363 Linear Dynamical Systems.” http://stanford.edu/class/ee363.

Kleeman, Lindsay. 1996. “Understanding and Applying Kalman Filtering.” In *Proceedings of the Second Workshop on Perceptive Systems, Curtin University of Technology, Perth Western Australia (25-26 January 1996)*.

Särkkä, Simo. 2013. *Bayesian Filtering and Smoothing*. Vol. 3. Cambridge University Press.