News aggregator

Side conditions on Data.Profunctor.Choice?

haskell-cafe - Sat, 01/31/2015 - 10:31am
Hi, I was expecting to see coherence conditions for instances of Data.Profunctor.Choice along the lines of: dimap (either (Left . f) Right) id . right' === dimap id (either (Left . f) Right) . right' and similarly for left', but there's nothing like that in the docs. Does anyone know whether (a) those conditions aren't true in general, or (b) they are true but provable from other theorems so not worth mentioning? Many thanks, David
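Written out with explicit types (taking f :: c' -> c; note that either (Left . f) Right is Control.Arrow.left f at function type), the two sides of the right' condition are:

import Data.Profunctor

lhs, rhs :: Choice p => (c' -> c) -> p a b -> p (Either c' a) (Either c b)
lhs f = dimap (either (Left . f) Right) id . right'
rhs f = dimap id (either (Left . f) Right) . right'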
Categories: Offsite Discussion

JP Moresmau: Searching for food using a LambdaNet neural network

Planet Haskell - Sat, 01/31/2015 - 9:22am
From time to time I have a fit and start doing a bit of AI. I saw that a new version of LambdaNet was released, so I thought I would take it for a spin and try something (a little) bit more complicated than their XOR example.

The problem is simple. In a rectangular world, there is food in one place. The food "smells", so each position in the world has a smell associated with it: the stronger the smell, the closer the food. Can we have a neural network that can navigate to the food?

A few definitions:

-- | Direction to go to (note: Left and Right clash with Prelude's Either
-- constructors, so the module needs e.g. import Prelude hiding (Left, Right))
data Direction = TopLeft | Left | BottomLeft | Top | Center | Bottom | TopRight | Right | BottomRight
  deriving (Show,Read,Eq,Ord,Bounded,Enum)

-- | Strength of the smell
type Smell = Int
-- | Input information
type Input = [(Direction,Smell)]
-- | Position in world
type Position = (Int,Int)
-- | Size of the world
type Size = (Int,Int)
-- | Maximum number of steps to take
type Max = Int

-- | Number of directions
dirLength :: Int
dirLength = 1 + fromEnum (maxBound :: Direction)

-- | The world
data World = World
    { wSize   :: Size -- ^ size
    , wSmell  :: Smell -- ^ Smell of the food position
    , wSmells :: DM.Map Position Smell -- ^ All smell strengths by position (DM is Data.Map, imported qualified)
    }
    deriving (Show,Read,Eq,Ord)

-- | Function deciding in which direction to move
type StepFunction = Position -> Input -> Direction

The concept of Direction is fundamental, since we want to move. From a given position in the world, we can look in nine directions and get the smell associated with each (staying in the same place, Center, counts as one of the nine). The function that decides what to do at a given position, given all the smells of the neighbouring positions, is called a StepFunction.
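Given the world's smell map, building the Input for a position could look like this (a sketch, assuming x grows to the right, y grows downward, and the smell is 0 outside the world; the real code in the GitHub project may differ):

-- | Pair each Direction with the smell of the corresponding neighbour;
-- the offsets follow the constructor order of Direction
neighbourInput :: World -> Position -> Input
neighbourInput w (x,y) =
  [ (d, DM.findWithDefault 0 (x+dx, y+dy) (wSmells w))
  | (d,(dx,dy)) <- zip [minBound .. maxBound] offsets ]
  where offsets = [ (dx,dy) | dx <- [-1,0,1], dy <- [-1,0,1] ]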

The algorithm is easy to write for a human brain:

-- | The base algorithm: just go toward the highest smell
-- (maximumBy comes from Data.List, comparing from Data.Ord)
baseAlg :: StepFunction
baseAlg _ = fst . maximumBy (comparing snd)

Note that we ignore the current position; we only work with the input structure.
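For instance, with made-up smells on an abbreviated input (any position will do, since it is ignored):

*Main> baseAlg (0,0) [(Top,5),(Center,4),(Bottom,1)]
Top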

On top of that, we need functions to build the world with the proper smell indicators, run the algorithm until we find the food, and so on (both are sketched after algStepExplain below). All this code can be found in the GitHub project but is not really critical for our understanding of neural networks. One function of interest runs one step of the algorithm and shows the intermediate structures generated:

-- | Perform one step and return the information generated: direction/smell input, direction output
algStepExplain :: World -> StepFunction -> Position -> (Position,([(Direction,Smell)],Direction))

We get the new position back, and the second element of the tuple gives the input and the output of the StepFunction.
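For illustration, here is one possible shape for those helper functions: a world builder that assumes the smell simply decreases by one per step of Chebyshev distance from the food, and a driver loop built on algStepExplain. Both are sketches under those assumptions; the real code in the GitHub project may differ.

-- | Build a world of the given size with food at the given position
-- (assumption: smell decreases by 1 per step of Chebyshev distance)
mkWorld :: Size -> Position -> World
mkWorld (sx,sy) (fx,fy) = World (sx,sy) maxSmell smells
  where
    maxSmell = max sx sy
    smells   = DM.fromList
      [ ((x,y), max 0 (maxSmell - max (abs (x-fx)) (abs (y-fy))))
      | x <- [0..sx-1], y <- [0..sy-1] ]

-- | Step until we stand on the food (maximal smell) or run out of moves,
-- returning the path taken
runAlg :: Max -> World -> StepFunction -> Position -> [Position]
runAlg mx w f = go mx
  where
    go n p
      | DM.findWithDefault 0 p (wSmells w) == wSmell w = [p]
      | n <= 0    = [p]
      | otherwise = p : go (n - 1) (fst (algStepExplain w f p))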

What we want to do is train a neural network and then use it as an implementation of StepFunction. This should be easy, since we already have an algorithm that we know finds the best position to move to.

The hardest part of neural network programming is designing the input and output structures, so that they adequately represent the information about your problem in a format the network can deal with. Here, we have a fixed input size: the smells of the nine neighbouring positions. The StepFunction returns a Direction, and Direction is an enum of nine values, so the output of the network can also be nine values, the highest of which indicates the direction chosen by the network.
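For example (with illustrative numbers): if the network outputs the vector [0.10, 0.02, 0.05, 0.81, 0.01, 0.03, 0.20, 0.07, 0.04], the maximum sits at index 3, and toEnum 3 gives Top.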

The networks in LambdaNet require Vectors as their input and output data, so let's format the inputs:

-- | Format the inputs suitable for the network
formatInputs :: World -> [(Direction,Smell)] -> Vector Float
formatInputs w = fromList . map (\i -> fromIntegral (snd i) / fromIntegral (wSmell w))

So an input of 1 means we're on the food itself, and the value decreases toward 0 the further we are from the food, always staying between 0 and 1. For instance, with wSmell w == 10, a neighbouring smell of 7 becomes 0.7.

If we have a network, the implementation of StepFunction is straightforward:

-- | Use the network to give the answer
neuralAlg :: World -> Network Float -> StepFunction
neuralAlg w n _ is = toEnum $ maxIndex $ predict (formatInputs w is) n

We format the input, run predict, retrieve the index for the maximum value in the output vector, and use that as the index in the Direction enum. We just need a trained network!

To get one, we generate the training data from a given world: we list all possible positions in the world, calculate the corresponding inputs, and run the basic algorithm on each input to get the optimal answer. For the resulting direction we set the output value to 1, and to 0 for all the others:

-- | Training data: for each position in the world, use the base algorithm to get the training answer
trainData :: World -> [(Vector Float, Vector Float)]
trainData w = map onePos $ allPositions w
  where
    onePos p =
      let (_,(is,dir)) = algStepExplain w baseAlg p
          os = map (\(d,_) -> if dir == d then 1 else 0) is
      in (formatInputs w is, fromList os)

From here, we unimaginatively reuse the LambdaNet tutorial code to build a network...

-- | Create the network: three fully connected layers (input, hidden, output)
-- of dirLength sigmoid neurons each
buildNetwork :: RandomGen g => g -> Network Float
buildNetwork g = createNetwork normals g $ replicate 3 $ LayerDefinition sigmoidNeuron dirLength connectFully

And train it:

-- | Train a network on several given worlds
train :: Network Float -> [World] -> Network Float
train n ws = 
  let t = BackpropTrainer (3 :: Float) quadraticCost quadraticCost'
      dat = concatMap trainData ws
  in trainUntilErrorLessThan n t online dat 0.01

What is critical here is that we train the network on several different worlds. I tried training on only one world: the resulting network would perform well on worlds of the same size or smaller, but not on bigger worlds, because it was overfitted to the actual smell values. Training on even just two quite different worlds brought big improvements in the intelligence of the network, at the cost of a longer learning time.
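For instance, using the mkWorld sketch from above, training one network on two differently-sized worlds could look like this (the name trainedNet and the food positions are illustrative):

-- | Train one network on two worlds of different sizes
trainedNet :: RandomGen g => g -> Network Float
trainedNet g = train (buildNetwork g)
                     [ mkWorld (10,10) (4,4)
                     , mkWorld (25,15) (20,3) ]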

Once the network is trained, you can run it on several different worlds and see how it finds the food. There is a simple visualization module that lets you see the moves clearly, for example:

Iteration 1
##########
#........#
#........#
#........#
#...X....#
#........#
#........#
#........#
#.......@#
#........#
#........#
########## 

(X being the food, @ the current position)

Iteration 3
##########
#........#
#........#
#........#
#...X....#
#........#
#........#
#......@.#
#........#
#........#
#........#
##########

Iteration 6
##########
#........#
#........#
#........#
#...@....#
#........#
#........#
#........#
#........#
#........#
#........#
##########

Yummy!

If you're interested, the full source code with tasty unit tests is on GitHub.

This is of course very basic, and it begs to be enhanced with more complicated worlds (maybe with walls, several sources of food, places with no smell at all, etc.). What do you do when you don't know the best algorithm yourself? Maybe I'll come back later to find out!


Categories: Offsite Blogs

[ANN] dtw - Dynamic Time Warping

Haskell on Reddit - Sat, 01/31/2015 - 7:57am

I am pleased to announce the first public release of my dynamic time warping (DTW) implementation: https://github.com/fhaust/dtw

DTW is used in several fields to measure the similarity of time series. Both the standard and the fast (approximate) DTW algorithms are implemented.
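For readers new to the algorithm, the standard version is a classic dynamic programme over the two series. A naive reference implementation, for intuition only (this is not the package's API, which also offers the fast approximate variant):

import Data.Array

-- | Naive O(n*m) DTW distance between two series, given a local cost function
naiveDtw :: (a -> a -> Double) -> [a] -> [a] -> Double
naiveDtw cost xs ys = table ! (n, m)
  where
    n  = length xs
    m  = length ys
    xa = listArray (1, n) xs
    ya = listArray (1, m) ys
    -- lazily self-referencing table: classic Haskell DP knot-tying
    table = listArray ((0,0), (n,m)) [ cell i j | i <- [0..n], j <- [0..m] ]
    cell 0 0 = 0
    cell _ 0 = 1/0  -- no alignment possible: infinite cost
    cell 0 _ = 1/0
    cell i j = cost (xa ! i) (ya ! j)
             + minimum [ table ! (i-1, j), table ! (i, j-1), table ! (i-1, j-1) ]

For example, naiveDtw (\a b -> abs (a - b)) [0,1,3] [0,2,3,4] aligns the two series and sums the cheapest local costs along the warping path.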

I am publishing this as version 0.9, with the plan to bump it up to 1.0 once I get some feedback here. So please have a look at the code and tell me if there are any obvious flaws, glaring no-gos, or ignored best practices.

As this is a fundamental algorithm, I am especially interested in problems regarding long-term maintainability.

submitted by goliatskipson
Categories: Incoming News

ANN: processor-creative-kit

haskell-cafe - Sat, 01/31/2015 - 3:10am
Dear Haskellers, I am pleased to announce the first release of the processor-creative-kit package, for playing with processors. You can create your own processors with your own instruction sets. It simulates a simple microprocessor (CPU) with some development tools.

https://hackage.haskell.org/package/processor-creative-kit
https://github.com/takenobu-hs/processor-creative-kit

Features:
* easy to try, easy to modify
* a purely functional CPU core (without IO)
* includes a very simple prototype assembler
* includes a very simple prototype profiler
* includes a very simple prototype debugger

Thank you, Takenobu
Categories: Offsite Discussion

Restricted Template Haskell

glasgow-user - Sat, 01/31/2015 - 1:39am
Hello GHC friends! I am starting up a proposal for variants of Template Haskell that restrict what operations are available. The goal is to make TH easier for users to reason about and to allow for an easier compilation story. Here is the proposal page: https://ghc.haskell.org/trac/ghc/wiki/TemplateHaskell/Restricted Right now the proposal does not have any details and the goal is to write out a clear specification. If this sounds interesting to you, let me know or leave some feedback on the wiki. Thanks, Greg Weber
Categories: Offsite Discussion

Reverse dependencies for parallel

del.icio.us/haskell - Sat, 01/31/2015 - 12:50am
Categories: Offsite Blogs

Control.Parallel.Strategies

del.icio.us/haskell - Sat, 01/31/2015 - 12:48am
Categories: Offsite Blogs

Using m4 as a preprocessor

haskell-cafe - Fri, 01/30/2015 - 10:26pm
Hello, today I tried to compile a Haskell program using m4 as a preprocessor, with the {-# OPTIONS_GHC -F -pgmF m4 #-} pragma at the top of a .hs file. I didn't add anything else; there are no ''' or '`' in the source. Upon compiling, ghc (7.8.3) complains: m4: cannot open `/tmp/ghc10655_0/ghc10655_1.hspp': No such file or directory. I searched the net for a solution, but no dice (apparently there are no Haskell programs using -F -pgmF m4?), so I am writing here. Any ideas on what is wrong?
Categories: Offsite Discussion

Turtle.Tutorial

del.icio.us/haskell - Fri, 01/30/2015 - 6:48pm
Categories: Offsite Blogs

Upcoming GHC version (was: Re: Drastic Prelude changes imminent)

libraries list - Wed, 01/28/2015 - 1:16am
If BBP is going to be released as-is, can we please call the next release GHC 8.0? Good stuff: http://semver.org/ Thanks, Greg
Categories: Offsite Discussion

Drastic Prelude changes imminent

libraries list - Tue, 01/27/2015 - 12:32pm
The next version (7.10) of GHC is slated to have a drastically changed Prelude. This message is very late in the release process, but I would urge caution before changing. The changes are (aptly) named the Burning Bridges Proposal (BBP). Even though the work has been going on for a while, it seems that this change is coming as a surprise to many people (including Simon Peyton Jones). In summary, it generalizes many list operations, e.g., foldr, to be overloaded.

There is much to welcome in BBP, but changing the Prelude cannot be done lightly, since it really is like changing the language. So I think it's really important for a large number of people to be able to try out such changes before they come into effect, and to have time to let the changes stabilize (you rarely get it right the first time). I've discussed this with a number of people, including Simon PJ, and we have concrete proposals.

Proposal 1:
* Add a new pragma {-# LANGUAGE Prelude=AlternativePrelude #-}
* This is a new fe
Categories: Offsite Discussion

Problem using Text.Regex on OS X.

libraries list - Mon, 01/26/2015 - 1:29am
Hi, I'm new to Haskell so I'm not sure how to solve this issue yet. I was directed to this list from the IRC channel. Here's the issue, using GHCi on OS X:

Prelude Data.Text> :m Text.Regex
Prelude Text.Regex> let r = mkRegex "_"
... eliding successful "Loading package ..." lines ...
Loading package regex-compat-0.95.1 ... can't load .so/.DLL for: /Library/Haskell/ghc-7.8.3-x86_64/lib/regex-compat-0.95.1/libHSregex-compat-0.95.1-ghc7.8.3.dylib
(dlopen(/Library/Haskell/ghc-7.8.3-x86_64/lib/regex-compat-0.95.1/libHSregex-compat-0.95.1-ghc7.8.3.dylib, 9): Library not loaded: @rpath/libHSmtl-2.1.3.1-ghc7.8.3.dylib
  Referenced from: /Library/Haskell/ghc-7.8.3-x86_64/lib/regex-compat-0.95.1/libHSregex-compat-0.95.1-ghc7.8.3.dylib
  Reason: image not found)

After running otool over the regex-compat dylib I can see lines like:

...
Load command 24
  cmd LC_RPATH
  cmdsize 80
  path /Users/mark/Projects/hp/build/package/mtl-2.1.3.1/build/dist/build (offset 12)
...

Which is not a path that is
Categories: Offsite Discussion

New gtk2hs 0.12.4 release

gtk2hs - Wed, 11/21/2012 - 12:56pm

Thanks to John Lato and Duncan Coutts for the latest bugfix release! The latest packages should be buildable on GHC 7.6, and the cairo package should behave a bit nicer in ghci on Windows. Thanks to all!

~d

Categories: Incoming News