# metaperl's blog

## The Gematria of Satan Explored Through Haskell, in #haskell

[16:40:15] ihope/ sum [1..sum [1..8]]

[16:40:17] lambdabot 666

[16:40:27] GeniXPro/ nice

[16:40:40] GeniXPro/ You created the best number!

[16:40:44] GeniXPro/ beast*

[16:40:58] ihope/ What a typo.

[16:43:38] ihope/ "This calls for wisdom. If anyone has insight, let him calculate the number of the beast, for it is man's number. His number is 666."
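For the curious, the arithmetic behind lambdabot's answer: `sum [1..n]` is the nth triangular number n(n+1)/2, so `sum [1..8]` is 36 and `sum [1..36]` is 36·37/2 = 666. A quick check (the helper name `triangle` is mine):

```haskell
-- Triangular numbers: sum [1..n] == n * (n + 1) `div` 2
triangle :: Integer -> Integer
triangle n = n * (n + 1) `div` 2

-- triangle 8 == 36 and triangle 36 == 666,
-- which is why sum [1..sum [1..8]] evaluates to 666.
```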

## circular definitions in "Monads as Containers"

We first see

What bind does is to take a container of type (m a) and a function of type (a -> m b). It first maps the function over the container (which would give an m (m b)) and then applies join to the result to get a container of type (m b).

But then we see

Joining is equivalent to binding a container with the identity map. This is indeed still called join in Haskell.

So then the question becomes: if bind is defined using join, and join is defined using bind, don't we have a serious circularity problem?
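The circularity is only apparent: any given monad supplies one of the two operations primitively, and the other is then derived from it. Lists make this concrete, since join can be given directly as concat (the names `joinList` and `bindList` are mine, standing in for the real class methods):

```haskell
-- For lists, join is given directly: it is just concat.
joinList :: [[a]] -> [a]
joinList = concat

-- bind derived from fmap (map, for lists) and join ...
bindList :: [a] -> (a -> [b]) -> [b]
bindList xs f = joinList (map f xs)

-- ... and join re-derived from bind and the identity map.
joinList' :: [[a]] -> [a]
joinList' xss = bindList xss id
```

Either definition can serve as the starting point; the pair of equations relating them is a fact about monads, not a definition that has to bottom out by itself.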

## Sometimes imperative works out better?

From the OCaml book we have the following:

Certain algorithms are easier to write in this (imperative) programming style. Take for instance the computation of the product of two matrices. Even though it is certainly possible to translate it into a purely functional version, in which lists replace vectors, this is neither natural nor efficient compared to an imperative version.
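For comparison, here is the list-based functional version the book alludes to, sketched in Haskell rather than OCaml (`matMul` is my name). The inner sum of products walks lists instead of indexing into a mutable array, which is exactly the naturalness and efficiency cost being described:

```haskell
import Data.List (transpose)

type Matrix = [[Double]]

-- Product of two matrices, with rows and columns as lists.
-- Transposing b turns its columns into rows we can zip against.
matMul :: Matrix -> Matrix -> Matrix
matMul a b = [ [ sum (zipWith (*) row col) | col <- transpose b ]
             | row <- a ]
```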

## non-deterministic monad of streams easier in OCaml?

I was reading Cale Gibbard's "Monads as Containers" and thought "now this is what I learned Haskell for." Then I began to wonder about OCaml and monads, which led me to Google, which led me to a post on the non-deterministic monad of streams, which the author says "cannot be (naively) done in either Prolog or in Haskell's MonadPlus monad, both of which would go into an infinite loop on this example."

## oo versus fp : it all boils down to where you get your extensibility

[19:33:05] /Cale/ But in FP, things are usually sort of 'dual' to OO in a strange way. Data is inextensible, but the operations on it are very extensible, which is sort of the reverse of the situation in OO-land.
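A toy illustration of that duality (my own example, not Cale's): with a Haskell data type, adding a new operation is trivial, but adding a new constructor means editing every existing function, while in OO the situation is reversed.

```haskell
data Shape = Circle Double | Square Double

area :: Shape -> Double
area (Circle r) = pi * r * r
area (Square s) = s * s

-- A new operation needs no change to Shape or to area:
perimeter :: Shape -> Double
perimeter (Circle r) = 2 * pi * r
perimeter (Square s) = 4 * s

-- But a new constructor (say, Triangle) would force edits to
-- both area and perimeter; in OO, a new subclass is cheap
-- while a new method touches every existing class.
```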


## simply put, but oh so true:

[19:26:57] /Cale/ loufoque: another thing is that it's just fun to program in Haskell -- you don't feel so much like you're writing boilerplate code all the time, and if it compiles, it usually works, since the typesystem catches 80 or 90 percent of all the stupid mistakes which the compilers in other languages wouldn't.



## how the (m a) of m (m a) can represent several containers for a data type


## analyzing the stringification of a tree data structure

In section 8.3 of the discussion on stringifying tree structures Hudak says:

Because (++) has time complexity linear in the length of its left argument, showTree is potentially quadratic in the size of the tree.

in response to this code:

```haskell
showTree (Leaf x)     = show x
showTree (Branch l r) = "<" ++ showTree l ++ "|" ++ showTree r ++ ">"
```

So this brings up two questions:

- Why does `(++)` have time complexity linear in the length of its left argument?
- Why is showTree potentially quadratic in the size of the tree?
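To the first question: `(++)` recurses down its left argument, rebuilding each of its cons cells, and never inspects its right argument, so its cost is proportional to the left list's length alone. To the second: in showTree the recursive calls appear as left arguments of `(++)`, so for a left-leaning tree the string for each subtree is re-traversed once per level above it, roughly n + (n-1) + ... + 1 steps. Hudak's section goes on to fix this with the (String -> String) accumulator style; a self-contained sketch (the `Tree` declaration is added here so it compiles):

```haskell
data Tree a = Leaf a | Branch (Tree a) (Tree a)

-- (++) is roughly defined in the Prelude as:
--   []     ++ ys = ys
--   (x:xs) ++ ys = x : xs ++ ys
-- one step per element of the LEFT list only.

showTree :: Show a => Tree a -> String
showTree (Leaf x)     = show x
showTree (Branch l r) = "<" ++ showTree l ++ "|" ++ showTree r ++ ">"

-- The fix: thread the rest of the output as an accumulator,
-- so every character is emitted exactly once (linear time).
showsTree :: Show a => Tree a -> String -> String
showsTree (Leaf x)     s = shows x s
showsTree (Branch l r) s =
  '<' : showsTree l ('|' : showsTree r ('>' : s))
```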

## Novice Questions

- For a function `fn (x:xs) = ...`, what happens if it is called like this: `fn []`? An error.
- A convenient way to alias a type is how? `type String = [Char]`
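Both answers can be checked in a file (the `Name` synonym is my example; `String = [Char]` is already defined in the Prelude, so redefining it yourself would clash):

```haskell
-- Calling fn [] fails at runtime with a pattern-match error,
-- since no equation matches the empty list; GHC's
-- -Wincomplete-patterns flag warns about this at compile time.
fn :: [Int] -> Int
fn (x:_) = x

-- 'type' introduces a synonym, exactly as the Prelude
-- does with: type String = [Char]
type Name = String

greet :: Name -> String
greet n = "hello, " ++ n
```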
