News aggregator

Dominic Steinitz: Girsanov’s Theorem

Planet Haskell - Sat, 10/31/2015 - 11:23am
Introduction

We previously used importance sampling in the case where we did not have a sampler available for the distribution from which we wished to sample. There is an even more compelling case for using importance sampling.

Suppose we wish to estimate the probability of a rare event. For example, suppose we wish to estimate P(X > 5) where X ~ N(0, 1). In this case we can look up the answer: P(X > 5) ≈ 2.87e-7. But suppose we couldn't look up the answer. One strategy that might occur to us is to sample from N(0, 1) and then estimate the probability by counting the number of times out of the total that the sample was bigger than 5. The flaw in this is obvious but let's try it anyway.

> module Girsanov where
>
> import qualified Data.Vector as V
> import Data.Random.Source.PureMT
> import Data.Random
> import Control.Monad.State
> import Data.Histogram.Fill
> import Data.Histogram.Generic ( Histogram )
> import Data.Number.Erf
> import Data.List ( transpose )
>
> samples :: (Foldable f, MonadRandom m) =>
>            (Int -> RVar Double -> RVar (f Double)) ->
>            Int ->
>            m (f Double)
> samples repM n = sample $ repM n $ stdNormal
>
> biggerThan5 :: Int
> biggerThan5 = length (evalState xs (pureMT 42))
>   where
>     xs :: MonadRandom m => m [Double]
>     xs = liftM (filter (>= 5.0)) $ samples replicateM 100000

As we might have expected, even if we draw 100,000 samples, we estimate this probability quite poorly.

ghci> biggerThan5
0

Using importance sampling we can do a lot better.

Let X be a normally distributed random variable with zero mean and unit variance under a probability measure P. As usual we can then define a new probability measure Q, the law of a normal with mean 5 and unit variance, by

  dQ/dP (x) = exp (5 * x - 5^2 / 2)

Thus

  P(X > 5) = E_P[I_{X > 5}] = E_Q[I_{X > 5} * g(X)]

where we have defined

  g(x) = dP/dQ (x) = exp (5^2 / 2 - 5 * x)

Thus we can estimate P(X > 5) either by sampling from a normal distribution with mean 0 and counting the number of samples that are above 5, or we can sample from a normal distribution with mean 5 and calculate the appropriately weighted mean.

Let’s try this out.

> biggerThan5' :: Double
> biggerThan5' = sum (evalState xs (pureMT 42)) / (fromIntegral n)
>   where
>     xs :: MonadRandom m => m [Double]
>     xs = liftM (map g) $
>          liftM (filter (>= 5.0)) $
>          liftM (map (+5)) $
>          samples replicateM n
>     g x = exp $ (5^2 / 2) - 5 * x
>     n = 100000

And now we get quite a good estimate.
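For comparison, the exact answer can be computed directly from the normal CDF. A small standalone sketch, using normcdf from the erf package (the same import the code above already uses):

```haskell
import Data.Number.Erf (normcdf)

-- The exact tail probability P(X > 5) for X ~ N(0, 1);
-- the importance-sampling estimate should land close to this.
exactTail :: Double
exactTail = 1 - normcdf 5

main :: IO ()
main = print exactTail  -- roughly 2.87e-7
```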

ghci> biggerThan5'
2.85776225450217e-7

Random Paths

The probability of another rare event we might wish to estimate is that of Brownian Motion crossing a boundary. For example, what is the probability of Brownian Motion crossing the level a? Let's try sampling 100 paths (we restrict the number so the chart is still readable).

> epsilons :: (Foldable f, MonadRandom m) =>
>             (Int -> RVar Double -> RVar (f Double)) ->
>             Double ->
>             Int ->
>             m (f Double)
> epsilons repM deltaT n = sample $ repM n $ rvar (Normal 0.0 (sqrt deltaT))
>
> bM0to1 :: Foldable f =>
>           ((Double -> Double -> Double) -> Double -> f Double -> f Double)
>        -> (Int -> RVar Double -> RVar (f Double))
>        -> Int
>        -> Int
>        -> f Double
> bM0to1 scan repM seed n =
>   scan (+) 0.0 $
>   evalState (epsilons repM (recip $ fromIntegral n) n)
>             (pureMT (fromIntegral seed))

We can see by eye in the chart below that again we do quite poorly.

We know that P(sup_{0 <= s <= t} W_s >= a) = 2 * P(W_t >= a) = 2 * (1 - Phi(a / sqrt t)), where Phi is the standard normal cumulative distribution function.

> p :: Double -> Double -> Double
> p a t = 2 * (1 - normcdf (a / sqrt t))

ghci> p 1.0 1.0
0.31731050786291415
ghci> p 2.0 1.0
4.550026389635842e-2
ghci> p 3.0 1.0
2.699796063260207e-3

But what if we didn't know this formula? Define

  alpha = E_P[I_{sup_{0 <= t <= 1} W_t >= a}]

where P is the measure which makes W_t Brownian Motion.

We can estimate this expectation as

  alpha_hat = (1/M) * sum_{i=1}^M I_i

where I_i is 1 if the i-th sampled Brownian Motion path hits the barrier and 0 otherwise, and M is the total number of simulations. We know from visual inspection that this gives poor results but let us try some calculations anyway.

> n = 500
> m = 10000
>
> supAbove :: Double -> Double
> supAbove a = fromIntegral count / fromIntegral n
>   where
>     count = length $
>             filter (>= a) $
>             map (\seed -> maximum $ bM0to1 scanl replicateM seed m) [0..n - 1]
>
> bM0to1WithDrift seed mu n =
>   zipWith (\m x -> x + mu * m * deltaT) [0..] $
>   bM0to1 scanl replicateM seed n
>   where
>     deltaT = recip $ fromIntegral n

ghci> supAbove 1.0
0.326
ghci> supAbove 2.0
7.0e-2
ghci> supAbove 3.0
0.0

As expected for a rare event we get an estimate of 0.

Fortunately we can use importance sampling for paths. If we take theta_t = a, where a is a constant, in Girsanov's Theorem below then we know that W~_t = W_t + a*t is Q-Brownian Motion.

We observe that

  alpha = E_Q[I_{sup_{0 <= t <= 1} W~_t >= a}]
        = E_P[I_{sup_{0 <= t <= 1} (W_t + a*t) >= a} * exp (-a * W_1 - a^2 / 2)]

So we can also estimate the expectation under P as

  alpha_hat = (1/M) * sum_{i=1}^M I_i * exp (-a * x_i - a^2 / 2)

where I_i is now 1 if Brownian Motion with the specified drift hits the barrier and 0 otherwise, and x_i is Brownian Motion sampled at t = 1.

We can see from the chart below that this is going to be better at hitting the required barrier.

Let’s do some calculations.

> supAbove' a = (sum $ zipWith (*) ns ws) / fromIntegral n
>   where
>     deltaT = recip $ fromIntegral m
>
>     uss = map (\seed -> bM0to1 scanl replicateM seed m) [0..n - 1]
>     ys  = map last uss
>     ws  = map (\x -> exp (-a * x - 0.5 * a^2)) ys
>
>     vss  = map (zipWith (\m x -> x + a * m * deltaT) [0..]) uss
>     sups = map maximum vss
>     ns   = map fromIntegral $ map fromEnum $ map (>= a) sups

ghci> supAbove' 1.0
0.31592655955519156
ghci> supAbove' 2.0
4.999395029856741e-2
ghci> supAbove' 3.0
2.3859203473651654e-3

The reader is invited to try the above estimates with 1,000 samples per path to see that even with this respectable number, the calculation goes awry.

In General

If we have a probability space (Omega, F, P) and a non-negative random variable Z with E[Z] = 1 then we can define a new probability measure Q on the same sigma-algebra by

  Q(A) = integral_A Z dP

For any two probability measures, when such a Z exists it is called the Radon-Nikodym derivative of Q with respect to P and denoted dQ/dP.

Given that we managed to shift a Normal Distribution with non-zero mean in one measure to a Normal Distribution with another mean in another measure by producing the Radon-Nikodym derivative, might it be possible to shift Brownian Motion with a drift under one probability measure to be pure Brownian Motion under another probability measure by producing the Radon-Nikodym derivative? The answer is yes, as Girsanov's theorem below shows.
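On a finite sample space the change of measure is nothing mysterious: it is just a reweighting of the point masses. Here is a small illustrative sketch (all the names are invented for this example) checking the defining identity E_Q[f] = E_P[f * Z]:

```haskell
-- A finite probability space as a list of (outcome, probability) pairs.
type Measure a = [(a, Double)]

-- Reweight measure p by a non-negative density z with E_P[Z] = 1,
-- giving the new measure q with dQ/dP = z.
changeMeasure :: (a -> Double) -> Measure a -> Measure a
changeMeasure z p = [ (x, z x * px) | (x, px) <- p ]

-- Expectation of f under a measure.
expectation :: (a -> Double) -> Measure a -> Double
expectation f p = sum [ f x * px | (x, px) <- p ]

main :: IO ()
main = do
  let p = [(0 :: Int, 0.5), (1, 0.5)]     -- a fair coin under P
      z x = if x == 1 then 1.6 else 0.4   -- density with E_P[Z] = 1
      q = changeMeasure z p               -- a biased coin under Q
  print (expectation fromIntegral q)                  -- E_Q[X]      = 0.8
  print (expectation (\x -> fromIntegral x * z x) p)  -- E_P[X * Z]  = 0.8
```

The two printed expectations agree, which is exactly the Radon-Nikodym identity in miniature.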

Girsanov’s Theorem

Let W_t be Brownian Motion on a probability space (Omega, F, P), let {F_t} be a filtration for this Brownian Motion, and let theta_t be an adapted process such that the Novikov Sufficiency Condition holds

  E_P[exp ((1/2) * integral_0^T theta_t^2 dt)] < infinity

then there exists a probability measure Q such that

  • Q is equivalent to P, that is, Q(A) = 0 if and only if P(A) = 0.

  • dQ/dP = exp (- integral_0^T theta_t dW_t - (1/2) * integral_0^T theta_t^2 dt).

  • W~_t = W_t + integral_0^t theta_s ds is Brownian Motion on the probability space (Omega, F, Q) also with the filtration {F_t}.

In order to prove Girsanov's Theorem, we need a condition which allows us to infer that the exponential process M_t = exp (- integral_0^t theta_s dW_s - (1/2) * integral_0^t theta_s^2 ds) is a strict martingale. One such useful condition, to which we have already alluded, is the Novikov Sufficiency Condition.

Proof

Define by

Then, temporarily overloading the notation and writing for expectation under , and applying the Novikov Sufficiency Condition to , we have

Whence we see that

And since this characterizes Brownian Motion, we are done.

The Novikov Sufficiency Condition

Statement

Let theta_t be an adapted process and further let it satisfy the Novikov condition

  E[exp ((1/2) * integral_0^T theta_t^2 dt)] < infinity

then the process M_t defined by

  M_t = exp (- integral_0^t theta_s dW_s - (1/2) * integral_0^t theta_s^2 ds)

is a strict martingale.

Before we prove this, we need two lemmas.

Lemma 1

Let M_t, for 0 <= t <= T, be a non-negative local martingale; then M_t is a super-martingale, and if further E[M_T] = E[M_0] then M_t is a strict martingale.

Proof

Let be a localizing sequence for then for and using Fatou’s lemma and the fact that the stopped process is a strict martingale

Thus is a super-martingale and therefore

By assumption we have thus is a strict martingale.

Lemma 2

Let M_t be a non-negative local martingale. If {tau_n} is a localizing sequence such that sup_n E[M_{T ∧ tau_n}^p] < infinity for some p > 1 then M_t is a strict martingale.

Proof

By the super-martingale property and thus by dominated convergence we have that

We also have that

By Chebyshev’s inequality (see note (2) below), we have

Taking limits first over and then over we see that

For and we have . Thus

Again taking limits over and then over we have

These two conclusions imply

We can therefore conclude (since is a martingale)

Thus by the preceding lemma it is a strict as well as a local martingale.

The Novikov Sufficiency Condition Proof

Step 1

First we note that is a local martingale for . Let us show that it is a strict martingale. We can do this if for any localizing sequence we can show

using the preceding lemma.

We note that

Now apply Hölder’s inequality with conjugates and where is chosen to ensure that the conjugates are both strictly greater than 1 (otherwise we cannot apply the inequality).

Now let us choose

then

In order to apply Hölder’s inequality we need to check that and that but this amounts to checking that and that . We also need to check that but this amounts to checking that for and this is easily checked to be true.

Re-writing the above inequality with this value of we have

By the first lemma, since is a non-negative local martingale, it is also a supermartingale. Furthermore . Thus

and therefore

Step 2

Recall we have

Taking logs gives

or in differential form

We can also apply Itô’s rule to

where denotes the quadratic variation of a stochastic process.

Comparing terms gives the stochastic differential equation

In integral form this can also be written as

Thus it is a local martingale (it is defined by a stochastic differential equation) and by the first lemma it is a supermartingale; hence its expectation is at most 1.

Next we note that

to which we can apply Hölder’s inequality with conjugates to obtain

Applying the supermartingale inequality then gives

Step 3

Now we can apply the result in Step 2 to the result in Step 1.

We can replace by for any stopping time . Thus for a localizing sequence we have

From which we can conclude

Now we can apply the second lemma to conclude that is a strict martingale.

Final Step

We have already calculated that

Now apply Hölder’s inequality with conjugates and .

And then we can apply Jensen’s inequality to the last term on the right hand side with the convex function .

Using the inequality we established in Step 2 and the Novikov condition then gives

If we now let we see that we must have . We already know that by the first lemma and so we have finally proved that is a martingale.

Notes
  1. We have already used importance sampling and also touched on changes of measure.

  2. Chebyshev’s inequality is usually stated for the second moment but the proof is easily adapted: for any non-negative increasing function f and any a with f(a) > 0,

     P(X >= a) = E[I_{X >= a}] <= E[f(X) * I_{X >= a}] / f(a) <= E[f(X)] / f(a)

  3. We follow Handel (2007); a similar approach is given in Steele (2001).
Bibliography

Handel, Ramon von. 2007. “Stochastic Calculus, Filtering, and Stochastic Control (Lecture Notes).”

Steele, J.M. 2001. Stochastic Calculus and Financial Applications. Applications of Mathematics. Springer New York. https://books.google.co.uk/books?id=fsgkBAAAQBAJ.


Categories: Offsite Blogs

testing for exceptions

haskell-cafe - Sat, 10/31/2015 - 11:16am
Hello, I have made this exercise, which can be found in the Craft of Functional Programming book.
Categories: Offsite Discussion

Where to find Haskell stickers?

Haskell on Reddit - Fri, 10/30/2015 - 11:27pm

I want to plaster a bunch of Haskell stickers on the back of my laptop in a silly shape like an idiot.

They are not difficult to find (Google will very quickly find me some Haskell-related merchandise) but I would be particularly interested in outlets that are non-profit or such that if I purchase from them I would be somehow contributing to Haskell world domination.

There's the Haskell wiki page on merchandise but it seems a little outdated, last update was in 2009.

Can you help me?

Edit: I'm located in SF Bay Area

submitted by noeda
[link] [20 comments]
Categories: Incoming News

Determining the length of a Foldable Applicative.

haskell-cafe - Fri, 10/30/2015 - 9:03pm
Hi all, I thought I had a simple way to determine the “length" (i.e. - number of elements in) of a Foldable Applicative container:

import Prelude hiding (sum)
import Data.Foldable (Foldable(..), sum)
import Control.Applicative

-- Calculate the "length" (i.e. - number of elements in) an Applicative container.
app_len :: (Applicative f, Foldable f) => f a -> Int
app_len = sum $ pure 1

but I didn’t:

app_len_test.hs:9:11:
    Could not deduce (Foldable t0) arising from a use of ‘sum’
    from the context (Applicative f, Foldable f)
      bound by the type signature for
                 app_len :: (Applicative f, Foldable f) => f a -> Int
      at app_len_test.hs:8:12-52
    The type variable ‘t0’ is ambiguous
    Note: there are several potential instances:
      instance Foldable ((,) a) -- Defined in ‘Data.Foldable’
      instance GHC.Arr.Ix i => Foldable (GHC.Arr.Array i) -- Defined in ‘Data.Foldable’
      instance Foldable (Const m) -- Defined in ‘D
Categories: Offsite Discussion

Joey Hess: a tiling region manager for the console

Planet Haskell - Fri, 10/30/2015 - 7:44pm

Building on top of concurrent-output, and some related work Joachim Breitner did earlier, I now have a kind of equivalent to a tiling window manager, except it's managing regions of the console for different parts of a single program.

Here's a really silly demo, in an animated gif:

Not bad for 23 lines of code, is it? Seems much less tedious to do things this way than using ncurses. Even with its panels, ncurses requires you to think about layout of various things on the screen, and many low-level details. This, by contrast, is compositional: just add another region and a thread to update it, and away it goes.

So, here's an apt-like download progress display, in 30 lines of code.

Not only does it have regions which are individual lines of the screen, but those can have sub-regions within them as seen here (and so on).

And, log-type messages automatically scroll up above the regions. External programs run by createProcessConcurrent will automatically get their output/errors displayed there, too.

What I'm working on now is support for multiline regions, which automatically grow/shrink to fit what's placed in them. The hard part, which I'm putting the finishing touches on, is to accurately work out how large a region is before displaying it, in order to lay it out. Requires parsing ANSI codes among other things.

STM rules

There's so much concurrency, with complicated interrelated data being updated by different threads, that I couldn't have possibly built this without Software Transactional Memory.

Rather than a nightmare of locks behind locks behind locks, the result is so well behaved that I'm confident that anyone who needs more control over the region layout, or wants to do funky things, can dive into the STM interface and update the data structures, and nothing will ever deadlock or be inconsistent; as soon as an update completes, it'll display on-screen.

As an example of how powerful and beautiful STM is, here's how the main display thread determines when it needs to refresh the display:

data DisplayChange
        = BufferChange [(StdHandle, OutputBuffer)]
        | RegionChange RegionSnapshot
        | TerminalResize (Maybe Width)
        | EndSignal ()

...

change <- atomically $
        (RegionChange <$> regionWaiter origsnapshot)
                `orElse`
        (RegionChange <$> regionListWaiter origsnapshot)
                `orElse`
        (BufferChange <$> outputBufferWaiterSTM waitCompleteLines)
                `orElse`
        (TerminalResize <$> waitwidthchange)
                `orElse`
        (EndSignal <$> waitTSem endsignal)
case change of
        RegionChange snapshot -> do
                ...
        BufferChange buffers -> do
                ...
        TerminalResize width -> do
                ...

So, it composes all these STM actions that can wait on various kinds of changes, to get one big action, that waits for all of the above, and builds up a nice sum type to represent what's changed.
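A stripped-down version of that pattern (the names here are invented for illustration, not taken from concurrent-output): two change sources, each a TVar, composed with `orElse` into one blocking action that fires on whichever changes first.

```haskell
import Control.Concurrent (forkIO, threadDelay)
import Control.Concurrent.STM

-- A sum type of the kinds of change we can wait on.
data Change = NumberChange Int | TextChange String
  deriving Show

main :: IO ()
main = do
  nvar <- newTVarIO (Nothing :: Maybe Int)
  tvar <- newTVarIO (Nothing :: Maybe String)
  -- Some other thread eventually produces a change.
  _ <- forkIO $ do
    threadDelay 10000
    atomically $ writeTVar tvar (Just "hello")
  -- Block until either source fires; retry on Nothing means "keep waiting".
  change <- atomically $
        (NumberChange <$> (readTVar nvar >>= maybe retry return))
    `orElse`
        (TextChange <$> (readTVar tvar >>= maybe retry return))
  print change  -- TextChange "hello"
```

The `orElse` combinator tries the first alternative; if it calls `retry`, the second is tried, and if both retry the whole transaction blocks until one of the read variables changes.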

Another example is that the whole support for sub-regions only involved adding 30 lines of code, all of it using STM, and it worked 100% the first time.

Available in concurrent-output 1.1.0.

Categories: Offsite Blogs

Douglas M. Auclair (geophf): October 2015 1HaskellADay Problems and Solutions

Planet Haskell - Fri, 10/30/2015 - 6:53pm
October 2015
  • October 29th, 2015: This is a perfect introduction to today's #haskell problem: dynamic predictions http://lpaste.net/6734184072140029952 because cats. And today's #haskell problem has the added benefit of containing the longest epic mid-type-declaration-comment of epic epicness. Epically. ... but what you didn't see for today's #haskell problem is the preparation #fallingasleepoverthekeyboard #again And the 'S' in the anSwer is not for 'S'tatistician, but for geophf waiting for a 'S'uper heroine to give the anSwer http://lpaste.net/8373695753289203712
  • October 28th, 2015: Today's #haskell problem, we DEFINE WHAT 'AVERAGE' IS! Nope. But we do take on predictive analytics! http://lpaste.net/6882676007984168960 So there's that. And here's the predictions-distributions. One day we'll even do ROC-analysis. Or not. http://lpaste.net/4234314648314183680
  • October 27th, 2015: For today's #haskell problem we say "HEY! YOU! GET YOU SOME RANDOM, YO!" and then define a random number generator http://lpaste.net/5973373084290252800 A (random) solution (not really) to yesterday's random (really) #haskell problem http://lpaste.net/3547465428253016064
  • October 26th, 2015: Well, bleh! It only took all day to compose, but here's today's #haskell problem! "Learning R...in Haskell!" http://lpaste.net/2843567468654362624 Okay, that (randomly) hurt! -- one possible solution to this problem is posted at http://lpaste.net/4854130028864077824
  • October 23rd, 2015: Today's #haskell problem is thanks to Jim Webber's keynote at @GraphConnect and is about triadic closure http://lpaste.net/2004709237044805632
  • October 22nd, 2015: Today's #haskell problem is thanks to Jim Webber's keynote at the @neo4j @GraphConnect: WWI Alliances http://lpaste.net/4042786156616613888 WWI-Alliances http://lpaste.net/4413387094903226368 … and as a @neo4j-graph
  • October 16th, 2015: Today's #haskell problem asks you to create MAJYCK! with LENSES over MATRICES using SCIENCE! (lens = magic ICYMI) http://lpaste.net/4391386661800378368
  • October 15th, 2015: Today's #haskell problem is a real (silly) problem: 'efficientize' row and col definitions for Data.Matrix http://lpaste.net/7329174284021006336 Zippidy Doo-Dah! Zippidy day! My, oh, my we've 'efficientized' Matrix RowCol (that scans. Kinda) http://lpaste.net/4076800205951860736
  • October 14th, 2015: For today's #haskell problem we look at multiplying matrices, because SCIENCE! http://lpaste.net/2775082411233378304 Today criss-cross is gonna JUMP-JUMP! ... and sauce the apples http://lpaste.net/6379071958448865280 (What this has to do with matrix-multiplication, I do not know)
  • October 13th, 2015: A rose by any other name would smell as sweet. A matrix-transpose by any other name is still today's #haskell problem http://lpaste.net/7639242339784851456 Today we transpose matrices ... LIKE A GANGSTA! http://lpaste.net/4495861517937278976
  • October 12th, 2015: We go from eh-matrices to ÜBERMATRICES for today's #haskell problem http://lpaste.net/3386266226073272320 And we übered those matrices at http://lpaste.net/4104557754952187904
  • October 8th, 2015: We haven't touched Data.Matrix in a while, and it didn't age well. Let's fix this for today's #haskell problem http://lpaste.net/4256620462181711872 Matrices, REBORN! http://lpaste.net/5942967859750633472 (or at least, prenatal, but we'll get there)
  • October 7th, 2015: So, after all that work making DList Foldable/Traversable/Monadible (eh?) TODAY's #haskell problem relaxes MultiMap http://lpaste.net/2435920706567929856 That MultiMap is now hella-relaxed, yo! http://lpaste.net/8471070633348825088
  • October 6th, 2015: So YESTERDAY we looked at Foldable. @argumatronic said "Step it up: do Traversable!" So for TODAY'S #haskell problem http://lpaste.net/4868288594713772032 So we WuTang Traversible CLANNED that solution! http://lpaste.net/4953464088320540672
  • October 5th, 2015: For today's #haskell problem we go from Monadical to Foldable, thanks to @argumatronic http://lpaste.net/2602721740102565888 Wait. Is 'monadical' a word? DList. Foldable instance. Done. http://lpaste.net/8044707151110733824
  • October 2nd, 2015: For today's #haskell problem we make multimaps fast with difference lists ... OR. DO. WE! http://lpaste.net/8174150842572603392 And today we find out HOW FAST MULTIMAPS ARE WITH DLISTS! (in all caps, no less) http://lpaste.net/341094126416035840
  • October 1st, 2015: Yesterday we made Difference Lists Applicative, for today's #haskell problem we make them monadic http://lpaste.net/1062399498271064064 So, difference lists are monadic now ... so there's that ... http://lpaste.net/4828960021565931520
Categories: Offsite Blogs

Equivalent async code (F#) in Haskell?

Haskell on Reddit - Fri, 10/30/2015 - 4:44pm

I'd like to delve deeper into Haskell but I had no luck to write the following code (F#).

async {
    let! html = Http.AsyncRequestString("https://www.dartlang.org/f/jabberwocky.txt")
    html.Split('\n')
    |> Array.filter (fun x -> x.ToLower().Contains("jabberwock"))
    |> Array.iter (fun x -> printfn "%s" x)
}
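[Editor's note: a rough Haskell counterpart, as a sketch only — it assumes the http-conduit package (Network.HTTP.Simple) and bytestring, and has not been run against the live URL:]

```haskell
{-# LANGUAGE OverloadedStrings #-}
import qualified Data.ByteString.Char8 as B
import Data.Char (toLower)
import Network.HTTP.Simple (httpBS, getResponseBody)

main :: IO ()
main = do
  -- Fetch the document body (httpBS blocks; wrap in Control.Concurrent.Async
  -- if you want it concurrent with other work).
  resp <- httpBS "https://www.dartlang.org/f/jabberwocky.txt"
  let ls   = B.lines (getResponseBody resp)
      hits = filter (B.isInfixOf "jabberwock" . B.map toLower) ls
  mapM_ B.putStrLn hits
```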

I hope someone can help me.

submitted by __aurelius__
[link] [9 comments]
Categories: Incoming News

[Noob] Cabal question

Haskell on Reddit - Fri, 10/30/2015 - 4:41pm

How do you change where cabal installs libraries? I spent at least 2 hours researching my problem, where when I try to install wget using cabal, I get an error saying '/usr/bin/ar: permission denied.' Apparently, on El Capitan, this is a common error, but I'm still not sure how to fix it. Any thoughts?

submitted by TheRealRichardParker
[link] [9 comments]
Categories: Incoming News

New release Csound-expression 4.9 is out

haskell-cafe - Fri, 10/30/2015 - 3:41pm
*The 4.9.0 is out! New features:* csound-expression - Functions for creation of FM-synthesizers. We can create the whole graph of FM-units (with feedback). Check out the module Csound.Air.Fm - Support for Monosynth patches. See atMono in the module Csound.Air.Patch see the function atMono and atMonoSharp. - Easy to use Binaural panning. See the module Csound.Air.Pan It’s like: headPan :: (Sig, Sig) -> Sig -> Sig2headPan (azimuth, elevation) asig = (aleft, aright) the compiler can supply the right extra files by reading the header of .csd - Construction of patches for sound fonts (sfPatch, sfPatchHall). - Table of tables. We can create a table that contains tables. - Harmonic oscillators for subtractive synth: buz and gbuz (the functions are adapted from the Csound ones) - Reverbs for patches. It’s very easy to add a reverb to your patch (withSmallHall patch, withLargeHall patch, etc) - Some bug-fixes csound-catalog - Many
Categories: Offsite Discussion

Why any pattern is allowed in a top level declaration

Haskell on Reddit - Fri, 10/30/2015 - 11:23am

Update:

  • Sorry for the confusing title, it should be “why any pattern is allowed in the left hand side of a top level declaration.”
  • This also works in normal where clauses, but is forbidden in where clauses in instance/class.

In Haskell 2010 Language Report’s Syntax Reference, I noticed the following definition in the 10.5 Context-Free Syntax section:

decl → gendecl | (funlhs | pat) rhs

Which means on the left hand side of a top level function declaration there can be any pattern, including, for example, literals:

-- GHC is watching you.
5 = 2 + 2

And it compiles! Though typing 5 in ghci still gives me 5.

What’s the point of this strange design? Is there any historical reason behind it? I am currently working on a parser of a Haskell-like syntax and considering removing it.
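[Editor's note: for context, the same grammar rule is what allows binding several names at once with a pattern; a literal binding like `5 = 2 + 2` is legal because pattern bindings are lazy and this one is never demanded. An illustrative standalone file:]

```haskell
-- A useful top-level pattern binding: both names come from one pair.
(low, high) = (minimum xs, maximum xs)
  where xs = [3, 1, 4, 1, 5 :: Int]

-- A useless but legal one: lazy, never matched, so no runtime error.
5 = 2 + 2

main :: IO ()
main = print (low, high)  -- (1,5)
```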

Thank you.

submitted by minsheng
[link] [26 comments]
Categories: Incoming News

A link error in HaskellR

Haskell on Reddit - Fri, 10/30/2015 - 11:15am

When building HaskellR--or anecdotally, a project that has an inline-r dependency--I get the following error:

/usr/bin/ld: .stack-work/dist/x86_64-linux/Cabal-1.22.4.0/build/IHaskell/Display/InlineR.dyn_o: relocation R_X86_64_PC32 against symbol `ihaskzuGpNmu8SkGG3G5wblPMLDRu_IHaskellziDisplayziInlineR_rprint5_closure' can not be used when making a shared object; recompile with -fPIC
/usr/bin/ld: final link failed: Bad value
collect2: error: ld returned 1 exit status

The work around (on Ubuntu 15.04) is to change the default ld to ld.gold--as in the following:

sudo update-alternatives --install /usr/bin/ld ld /usr/bin/ld.gold 20
sudo update-alternatives --install /usr/bin/ld ld /usr/bin/ld.bfd 10
sudo update-alternatives --config ld

I can work with the above but I'd rather have a hack that's restricted to a project. Is there way through a cabal file to specify a linker in the ghc tool-chain?

submitted by metaml
[link] [4 comments]
Categories: Incoming News

PhD position starting in April 2016

General haskell list - Fri, 10/30/2015 - 11:01am
*Department of Computing and Communication Technologies*
*3 year full-time funded PhD Scholarship*

3 years full-time fees will be paid by the University.
Bursary: £10,500 pa (without inflation increase).
Start date: April 2016
Eligibility: Home/EU and International students
*Deadline: 14 December 2015*
*Interview date:* Interviews will be held in the week beginning 11 January 2016.

The Department of Computing and Communication Technologies at Oxford Brookes University is pleased to offer a three year full-time PhD Scholarship to a new student commencing in April of 2016. The successful applicant will receive an annual bursary of £10,500 for three years (with no inflation increase) with fees paid by the University.

Topic of Research: *Adaptive Technology for Healthcare*

The topic of this PhD is the development of ontologies for healthcare that capture diverse protocols, terminologies and standards. The vocabulary associated with any branch of medicine is huge, including symptoms, scientific name
Categories: Incoming News

what do I have to do exactly with these exercises

haskell-cafe - Fri, 10/30/2015 - 7:20am
Hello, I'm self-studying Haskell with the Craft of Functional Programming book. Now I see two exercises where I do not understand what the purpose really is. The two exercises are:

4.21 Given a function f of type Integer -> Integer give a recursive definition of a function of type Integer -> Integer which on input n returns the maximum of the values f 0, f 1, ..., f n. You might find the max function defined in Section 3.4 useful. To test this function, add to your script a definition of some values of f thus:

f 0 = 0
f 1 = 44
f 2 = 17
f _ = 0

and so on; then test your function at various values.

4.22 Given a function f of type Integer -> Integer give a recursive definition of a function of type Integer -> Bool which on input n returns True if one or more of the values f 0, f 1, ..., f n is zero and False otherwise.
Categories: Offsite Discussion

Writing (->) from scratch

Haskell on Reddit - Fri, 10/30/2015 - 5:55am

It is very common to see handmade versions of Lists, Trees, etc... in Haskell tutorials everywhere, which gives an impression that almost everything is possible to remake from scratch.

What about the function type (->)? Is it part of the compiler/language spec or is it possible to reproduce it with the remaining parts of the language? I can't see how it could be possible, so I'm curious.

submitted by guaraqe
[link] [31 comments]
Categories: Incoming News