News aggregator

Ken T Takusagawa: [zljfhron] Lagged Fibonacci digits

Planet Haskell - 16 hours 46 min ago

Part 1: Two newest taps

The Fibonacci sequence modulo 10 starting with 0 and 1 repeats with a period of 60, the Babylonians' favorite number: 0 1 1 2 3 5 8 3 1 4 5 9 4 3 7 0 7 7 4 1 5 6 1 7 8 5 3 8 1 9 0 9 9 8 7 5 2 7 9 6 5 1 6 7 3 0 3 3 6 9 5 4 9 3 2 5 7 2 9 1. One could imagine a very confusing digital clock in which two digits slide by every minute (or second).

Other starting pairs give other periods:
Period 20: 0 2 2 4 6 0 6 6 2 8 0 8 8 6 4 0 4 4 8 2.
Period 12: 2 1 3 4 7 1 8 9 7 6 3 9. (Also usable for a clock.)
Period 4: 4 2 6 8.
Period 3: 0 5 5.
Period 1: 0 0.
These periods sum to 100, exhausting all 100 possible ordered pairs of starting digits.
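The brute-force search behind these periods is simple to sketch (illustrative code, not the post's own): iterate the pair-to-pair step map until the starting pair recurs. The map (a, b) -> (b, a + b) is invertible mod 10, so the seed pair always does recur.

```haskell
-- Period of a[i] = (a[i-1] + a[i-2]) mod 10 for a given starting pair.
fibPeriod :: (Int, Int) -> Int
fibPeriod start = go (step start) 1
  where
    -- Slide the two-digit window forward by one step.
    step (a, b) = (b, (a + b) `mod` 10)
    go pair n
      | pair == start = n
      | otherwise     = go (step pair) (n + 1)

main :: IO ()
main = print (map fibPeriod [(0, 1), (0, 2), (2, 1), (4, 2), (0, 5), (0, 0)])
-- prints [60,20,12,4,3,1], the six periods listed above
```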

Part 2: Lagged Fibonacci, newest and oldest taps

Start with the digits 0 0 0 0 0 0 1 at the top of a 7-column tableau. Fill in the following rows left-to-right by adding, modulo 10, the digit to the left and the digit immediately above. For the leftmost digit of each row, the digit "to the left" is the last digit of the previous row. This is equivalent to the recurrence a[i] = (a[i-1] + a[i-7]) mod 10. The sequence repeats with an impressively long period of 2480437. The original motivation was a method for a person to produce, by hand and with little effort, a page of seemingly random digits. The tableau begins

0 0 0 0 0 0 1
1 1 1 1 1 1 2
3 4 5 6 7 8 0
3 7 2 8 5 3 3
6 3 5 3 8 1 4

and ends 2480437 digits later:

5 1 7 4 0 4 8
3 4 1 5 5 9 7
0 4 5 0 5 4 1
1 5 0 0 5 9 0
1 6 6 6 1 0 0
1 7 3 9 0 0 0
1 8 1 0 0 0 0
1 9 0 0 0 0 0
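The tableau can be reproduced by a short sketch (again illustrative, not the linked source code) that slides a 7-digit window along the sequence and chunks the result into rows:

```haskell
-- Infinite lag-7 sequence a[i] = (a[i-1] + a[i-7]) mod 10, seeded 0 0 0 0 0 0 1.
lagged7 :: [Int]
lagged7 = seed ++ go seed
  where
    seed = [0, 0, 0, 0, 0, 0, 1]
    -- The window holds a[i-7] .. a[i-1]; the next digit taps both ends.
    go window = next : go (tail window ++ [next])
      where next = (head window + last window) `mod` 10

-- The first n rows of the 7-column tableau.
tableau :: Int -> [[Int]]
tableau n = take n (chunk lagged7)
  where chunk xs = take 7 xs : chunk (drop 7 xs)

main :: IO ()
main = mapM_ print (tableau 5)  -- reproduces the five opening rows shown above
```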

Periods of other column widths (lag lengths) and their factorizations, starting with 0 0 ... 1:
1 4 = 2*2
2 60 = 2*2*3*5
3 217 = 7*31
4 1560 = 2*2*2*3*5*13
5 168 = 2*2*2*3*7
6 196812 = 2*2*3*3*7*11*71
7 2480437 = 127*19531
8 15624 = 2*2*2*3*3*7*31
9 28515260 = 2*2*5*73*19531
10 1736327236 = 2*2*7*19*31*127*829
11 249032784 = 2*2*2*2*3*7*11*13*71*73
12 203450520 = 2*2*2*3*5*7*13*31*601
13 482322341820 = 2*2*3*5*11*17*31*71*19531

The sequence of periods 4, 60, 217, 1560, 168 does not appear in OEIS. For the first four terms, I have confirmed that these are the longest possible periods among all starting seeds, not just 0 0 ... 1. It is curious that the factor 19531 occurs multiple times.
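For small lags these periods can be checked by the same seed-recurrence approach (a sketch under my own naming, not the post's code). For lag k >= 2 the window map is invertible mod 10 (the dropped digit can be recovered from the new one), so the orbit is a pure cycle through the seed and no pre-period arises:

```haskell
-- Period of a[i] = (a[i-1] + a[i-k]) mod 10 from the seed 0 ... 0 1, for k >= 2.
lagPeriod :: Int -> Int
lagPeriod k = go (step seed) 1
  where
    seed = replicate (k - 1) 0 ++ [1]
    -- Tap the oldest and newest digits of the length-k window.
    step st = tail st ++ [(head st + last st) `mod` 10]
    go st n
      | st == seed = n
      | otherwise  = go (step st) (n + 1)

main :: IO ()
main = print (map lagPeriod [2 .. 5])  -- small lags only; lag 7 takes 2480437 steps
```

This should print [60,217,1560,168], matching the table's entries for lags 2 through 5.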

Part 3: Two oldest taps

We consider the recurrence a[i] = (a[i-6] + a[i-7]) mod 10, that is, the two "oldest" values. To calculate the next digit on a seven-column tableau, add the digit above to the digit above and to the right (northeast). Alternatively, this can be done on a more compact six-column tableau by adding the digit above and the digit above and to the left (northwest). This recurrence repeats with a period of 661416, smaller than that of the corresponding lag-7 sequence in Part 2.

Periods for other lag lengths are given below. They seem neither uniformly longer nor uniformly shorter than those in Part 2.

2 60 = 2*2*3*5
3 168 = 2*2*2*3*7
4 1560 = 2*2*2*3*5*13
5 16401 = 3*7*11*71
6 196812 = 2*2*3*3*7*11*71
7 661416 = 2*2*2*3*7*31*127
8 15624 = 2*2*2*3*3*7*31
9 8894028 = 2*2*3*11*13*71*73
10 1736327236 = 2*2*7*19*31*127*829
11 3712686852 = 2*2*3*7*31*73*19531
12 203450520 = 2*2*2*3*5*7*13*31*601
13 25732419240 = 2*2*2*3*5*11*17*31*71*521

The sequence 60, 168, 1560 does not appear in OEIS. But the analogous sequence modulo 2 (instead of modulo 10) is A046932, and curiously, the analogous sequence from Part 2 taken modulo 2 appears to be exactly the same sequence. Also, there is the factor 19531 again. Searching for that number on OEIS gives a hint of what is going on: the VIC cipher used this recurrence in base 10 with width 10.
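The oldest-taps periods admit the same sketch, changing only which end of the window is tapped (illustrative code; `oldestPeriod` is a name of my choosing). This window map is also invertible for k >= 2, so the seed state recurs:

```haskell
-- Period of a[i] = (a[i-k+1] + a[i-k]) mod 10 from the seed 0 ... 0 1, for k >= 2.
oldestPeriod :: Int -> Int
oldestPeriod k = go (step seed) 1
  where
    seed = replicate (k - 1) 0 ++ [1]
    -- Tap the two oldest digits of the length-k window.
    step st@(a : b : _) = tail st ++ [(a + b) `mod` 10]
    step _              = error "lag must be at least 2"
    go st n
      | st == seed = n
      | otherwise  = go (step st) (n + 1)

main :: IO ()
main = print (map oldestPeriod [2 .. 5])
```

This should print [60,168,1560,16401], matching the table's entries for lags 2 through 5.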

Source code

Here is a Haskell implementation including Floyd's cycle-detection algorithm. It is only a partial implementation of cycle detection because it does not go back and verify that the period found is not a multiple of a shorter period. I'm hoping this second step wasn't necessary.
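For reference, the full algorithm, including the minimal-period refinement described as omitted, can be sketched as follows (this is my own sketch, not the linked source code):

```haskell
-- Floyd's tortoise-and-hare cycle detection on the orbit of f starting at x0.
-- Returns (mu, lambda): the pre-period length and the minimal cycle length.
floyd :: Eq a => (a -> a) -> a -> (Int, Int)
floyd f x0 = (mu, lambda)
  where
    -- Phase 1: tortoise (1 step) and hare (2 steps) meet inside the cycle.
    meet t h | t == h    = t
             | otherwise = meet (f t) (f (f h))
    m = meet (f x0) (f (f x0))
    -- Phase 2: walk in lockstep from x0 and the meeting point to find mu.
    findMu t h n | t == h    = n
                 | otherwise = findMu (f t) (f h) (n + 1)
    mu = findMu x0 m 0
    -- Phase 3: one more lap from the cycle start yields the minimal lambda.
    start = iterate f x0 !! mu
    findLam h n | h == start = n
                | otherwise  = findLam (f h) (n + 1)
    lambda = findLam (f start) 1

main :: IO ()
main = print (floyd (\x -> (2 * x) `mod` 10) (1 :: Int))
-- doubling mod 10 from 1 gives 1,2,4,8,6,2,...: prints (1,4)
```

Phase 3 guarantees that the reported period is minimal, which is exactly the verification step the post mentions skipping.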

Because the generated sequences were implemented as giant lists, it was very easy to introduce catastrophic space leaks. I've left in old versions of the code demonstrating a few such failures. Even printing out a list followed by its length required some care. Compared to imperative programming languages, this is an aspect of Haskell I strongly dislike.

The code is far from optimized, most notably because it was an exercise in learning to use Data.Sequence to hold the state. Unboxed arrays would probably have been better.

Categories: Offsite Blogs

Haskell Platform 2014.2.0.0 Release Candidate 3

haskell-cafe - Thu, 07/31/2014 - 9:48pm
Small update to the Haskell Platform 2014.2.0.0 release: We have new Release Candidate 3 versions of the source tarball... and a new generic-linux bindist of the platform! - source tarball: haskell-platform-2014.2.0.0-srcdist-RC3.tar.gz <> - generic linux: haskell-platform-2014.2.0.0-unknown-linux-x86_64-RC3.tar.gz <> *Windows and OS X users: There are no RC3 versions - as the RC2 versions seem to be holding up fine!* *General* - hptool (and hence ./ script) take a new --prefix parameter that is used for generic (non-OS X, non-Windows) builds: It sets the root under which haskell installations are located. Defaults to /usr/local/haskell. Everything will be placed in a directory named ghc-7.8.3-<arch> under this prefix. - activate-hs script for default Posix-like builds - sma
Categories: Offsite Discussion

[ANN] hsimport 0.5: configurable pretty printing and placing of imports

haskell-cafe - Thu, 07/31/2014 - 8:59pm
Hi all, hsimport[1] is a command line program for extending the import list of a Haskell source file. There's an integration[2] for the vim editor, which can automatically extend the import list for the symbol under the cursor. hsimport 0.5 changes: - configurable[3] pretty printing of imports - configurable placing of new imports - support of multi line imports - better handling of incomplete/invalid Haskell source files Greetings, Daniel [1] [2] [3]
Categories: Offsite Discussion

Seeking comments on proposed fusion rules

libraries list - Thu, 07/31/2014 - 7:38pm
The rules: The boring explanation: As I mentioned earlier, we currently can't fuse things like foldr (x : build g) The foldr can't see the build. Earlier, I considered a simple cons/build rule that would rewrite x : build g to build (\c n -> x `c` g c n) I wasn't quite satisfied with this approach, largely because it treats foldr c n (x1 : x2 : ... : xn : build g) one way, and foldr c n ([x1, x2, ..., xn) : build g) another way. The former expands out the foldr, while the latter (via various rules involving augment) translates into a foldr over [x1, x2, ..., xn]. I therefore came up with the idea of translating x1:x2:...:xn:build g into [x1:x2:..:xn]++build g, and then letting the current rules do their thing. dolio and carter (in #ghc) were concerned about what would happen if fusion *didn't* occur, in which case the resulting code appears to be *worse*. So then I wrote some rules to fix that up, which actually look likely to be good rules in general. They turn [x1:x2:...:xn
Categories: Offsite Discussion

Does ghc leak memory or is something else going on?

Haskell on Reddit - Thu, 07/31/2014 - 5:33pm

If I do "cabal install cabal-install" then while it is building Cabal I get:

[46 of 78] Compiling Distribution.Simple.Setup ( Distribution/Simple/Setup.hs, dist/build/Distribution/Simple/Setup.o ) ghc: out of memory (requested 1048576 bytes)

But if I do a "cabal unpack Cabal" and then go into the directory and do a "cabal build" I get the same error, but I can just run "cabal build" again and it picks up where it left off and finishes just fine. This seems to happen with any package that contains a lot of files to be compiled. Is this a memory leak in GHC or is something else going on? Is there any way to tell cabal to compile each file with a fresh ghc process instead of reusing the same ghc process for the whole thing, to at least work around the problem more easily?

submitted by haskellnoob
Categories: Incoming News

wren gayle romano: Transitioning is a mindfuck.

Planet Haskell - Thu, 07/31/2014 - 3:48pm

[Content warning: discussion of rape culture and child abuse]

Transitioning is a mindfuck. Doesn't matter how prepared you are, how sure you are, how long and deeply you've thought about gender/sexuality issues. Outside of transitioning we have no way of inhabiting more than one position in any given discourse. Sure, we can understand other positions on an intellectual level, we may even sympathize with them, but we cannot empathize with what we have not ourselves experienced, and even having experienced something in the past does not mean we can continue to empathize with it in the present. Julia Serano emphasizes this epistemic limit in her books. And it's no wonder that no matter how prepared you may be, completely uprooting your sense of self and reconfiguring the way the world sees, interprets, and interacts with you is going to fundamentally alter whatever notions you had going into it all.

Since transitioning none of the major details of my identity have changed. I'm still a woman. Still feminine. Still a flaming lesbo. Still kinky, poly, and childfree. Still attracted to the same sorts of people. Still into the same sorts of fashion (though now I can finally act on that). Still interested in all the same topics, authors, and academic pursuits. And yet, despite —or perhaps because of— all this consistency, transitioning is still a mindfuck.

Read more...

Categories: Offsite Blogs

wren gayle romano: New skin

Planet Haskell - Thu, 07/31/2014 - 3:47pm

Hello all,

I just changed the theme/skin for my blog and have been playing around with new fonts, css, handling of footnotes, etc. Let me know what you think and whether you run into any issues (especially on older posts). It's been years since I've done webdev, long before CSS3 and HTML5, so things are a bit different from what I'm used to.

In other news, I haven't heard back from the Haskell Planet admins about getting the feed switched over to Haskell/coding/math content only. So, if you've been annoyed by the OT, sorry bout that!

Categories: Offsite Blogs

FP Complete: FP Haskell Center is Going Free

Planet Haskell - Thu, 07/31/2014 - 2:11pm
FP Haskell Center Open Publish Announcement

As you know, FP Complete’s mission is to drive the wide-scale adoption of Haskell functional programming. As our next step in this mission, I’m pleased to announce we are releasing our full FP Haskell Center (FPHC) free for developers of open projects. This model will be very familiar to Haskellers and GitHub users.

FPHC has been live online for a little less than a year and has been very successful. I’m proud of our hard-working team who have achieved so much so quickly, using many important open-source components as well as some very good commercial services. Because of this progress and the support and activity from our users, we have reached the point where we can give even more back to the Haskell community and do more to promote the overall advancement of Haskell.

As of October 1, users of our free Community or “Open Publish” edition will have access to features previously offered only to FPHC Commercial customers. This includes complete remote git support, complete git command line integration for Emacs clients, as well as git “mega repos,” and inter-project dependencies.

With these increased free features, the paid Personal edition is no longer needed. If you have a Personal license, we’ll stop charging you, and will renew you to a free Open Publish license. Similarly, Academic accounts will also convert to Open Publish accounts. For more details see the schedule below and our FAQ.

Unlike Commercial licenses, Open Publish accounts will automatically publish all projects on the FPHC site with each commit. Open Publish accounts aren’t available on private servers, and won’t include a shared or private FP Application Server. Of course we continue to offer the paid Commercial license for those who need it — but most people won’t.

We’re confident that access to more capable tools will inspire others to be more involved with pushing Haskell forward, and we believe this move will better suit open source projects and independent developers.

Now that our corporate clients are expanding their work orders with us, we can offer even more for open source developers. This has also afforded us the opportunity to focus on our own innovative Haskell projects which we will be rolling out over time. Our corporate clients are excited by what we’ve been able to deliver with our Haskell-built tools, and this is just the beginning. Of course we also continue to contribute to Yesod, Fay, GHC, Stackage, and other important Haskell community projects beyond FP Haskell Center itself.

Innovation is about motivation, and we hope our free tools for open source Haskell developers provide the resources and motivation to build more projects. What has already been amassed in our School of Haskell is proof that the Haskell community knows how to build great resources given the right tools and forums. Keep up the amazing work and let us know what else we can do to help.

Aaron Contorer, CEO, FP Complete

Planned Release Schedule:
  • As of July 31, we will no longer be offering the yearly Personal edition. The monthly edition will continue to be available to allow Personal subscribers to continue using git while we prepare the new FPHC Open Publish license. At this time we will continue to offer FPHC Professional and Community licenses.
  • On August 30, we will stop accepting new monthly Personal edition subscriptions.
  • October 1, we will release the new enhanced free Open Publish Community edition and discontinue online purchases and monthly subscriptions. At this time, all previous Community and Personal licenses will become FPHC Open Publish licenses. New yearly paid FPHC Commercial licenses may still be ordered by contacting FP Complete Sales.

Categories: Offsite Blogs

[ANN] hsimport 0.5: configurable pretty printing and placing of imports

Haskell on Reddit - Thu, 07/31/2014 - 2:04pm

hsimport is a command line program for extending the import list of a Haskell source file.

There's an integration for the vim editor, which can automatically extend the import list for the symbol under the cursor.

hsimport 0.5 changes:

  • configurable pretty printing of imports
  • configurable placing of new imports
  • support of multi line imports
  • better handling of incomplete/invalid Haskell source files
submitted by dan00
Categories: Incoming News

Bad interaction of inlinePerformIO and mutablevectors

haskell-cafe - Thu, 07/31/2014 - 11:57am
I'm trying to understand why some code isn't behaving as I'd expect, and to determine whether it's a bug or not (and where that bug might be). Here's the simplest version of the code: import Data.ByteString.Internal (inlinePerformIO) import qualified Data.Vector as V import qualified Data.Vector.Mutable as VM main :: IO () main = do vm <- 1 VM.write vm 0 'A' let x = inlinePerformIO $ VM.write vm 0 'B' x `seq` (V.freeze vm >>= print) A more complete example is available on lpaste[1]. The problem is that I would expect the output to be "B", but in fact "A" is still printed. From the longer paste that I linked to, you can see that: * When using unsafePerformIO and unsafeDupablePerformIO, the semantics work as I would have expected: "B" is printed after forcing evaluation of the result. * If I add a ` vm 0` call after the write, it also works. * Using IORef, the behavior is also as I would have expected: as soon as the result is evaluated, th
Categories: Offsite Discussion

Newtype derivation understanding problem

haskell-cafe - Thu, 07/31/2014 - 11:55am
Excuse me if the question is silly. Thought this MonadCatchIO will be removed in next version of snap, current version has it and I can't understand this derivation: [0] L.Lensed seems like just a newtype wrapper around a function [1], and there is no instance of MonadCatchIO for a function. How is it derived? [0]: [1]:
Categories: Offsite Discussion

Feedback on my probability distribution package?

Haskell on Reddit - Thu, 07/31/2014 - 9:49am

Hello Reddit!

I made a package for manipulating finite discrete probability distributions, and before submitting it to Hackage I would love to get some feedback from you.

The code is available on github:

The package has modules for creating and transforming distributions, measuring them, efficiently sampling random values from them and plotting them to file.

I'm wondering if including the plotting feature is worthwhile considering the number of dependencies. Would you put that module in a package of its own, or would you leave it as is?

I am also unsure about the Data.Distribution.Aggregator module. The naming of the module and some of its functions doesn't feel right. The point of the module is to provide functions that can modify the probabilities in lists of (value, probability) pairs. This is useful, for instance, for computing cumulative probability distributions.

Thanks for any feedback you might have!

submitted by wargotad
Categories: Incoming News

Gabriel Gonzalez: Scalable program architectures

Planet Haskell - Thu, 07/31/2014 - 7:30am

Haskell design patterns differ from mainstream design patterns in one important way:

  • Conventional architecture: Combine several components of type A together to generate a "network" or "topology" of type B

  • Haskell architecture: Combine several components of type A together to generate a new component of the same type A, indistinguishable in character from its constituent parts

This distinction affects how the two architectural styles evolve as code bases grow.

The conventional architecture requires layering abstraction on top of abstraction:

Oh no, these Bs are not connectable, so let's make a network of Bs and call that a C.

Well, I want to assemble several Cs, so let's make a network of Cs and call that a D.


Wash, rinse, and repeat until you have an unmanageable tower of abstractions.

With a Haskell-style architecture, you don't need to keep layering on abstractions to preserve combinability. When you combine things together the result is still itself combinable. You don't distinguish between components and networks of components.

In fact, this principle should be familiar to anybody who knows basic arithmetic. When you combine a bunch of numbers together you get back a number:

3 + 4 + 9 = 16

Zero or more numbers go in and exactly one number comes out. The resulting number is itself combinable. You don't have to learn about "web"s of numbers or "web"s of "web"s of numbers.

If elementary school children can master this principle, then perhaps we can, too. How can we make programming more like addition?

Well, addition is simple because we have (+) and 0. (+) ensures that we can always convert more than one number into exactly one number:

(+) :: Int -> Int -> Int

... and 0 ensures that we can always convert less than one number into exactly one number by providing a suitable default:

0 :: Int

This will look familiar to Haskell programmers: these type signatures resemble the methods of the Monoid type class:

class Monoid m where
    -- `mappend` is analogous to `(+)`
    mappend :: m -> m -> m

    -- `mempty` is analogous to `0`
    mempty :: m

In other words, the Monoid type class is the canonical example of this Haskell architectural style. We use mappend and mempty to combine 0 or more ms into exactly 1 m. The resulting m is still combinable.
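As a small illustration (my example, not from the post), mconcat folds any number of monoidal values into exactly one, and the result can keep being combined:

```haskell
main :: IO ()
main = do
  -- Lists form a Monoid under (++) and [].
  print (mconcat [[1, 2], [3], [], [4, 5]] :: [Int])
  -- So do Strings; the combined result is itself still combinable.
  putStrLn (mconcat ["scal", "able"] ++ "!")
```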

Not every Haskell abstraction implements Monoid, nor does it need to, because category theory takes this basic Monoid idea and generalizes it to more powerful domains. Each generalization retains the same basic principle of preserving combinability.

For example, a Category is just a typed monoid, where not all combinations type-check:

class Category cat where
    -- `(.)` is analogous to `(+)`
    (.) :: cat b c -> cat a b -> cat a c

    -- `id` is analogous to `0`
    id :: cat a a

... a Monad is like a monoid where we combine functors "vertically":

-- Slightly modified from the original type class
class Functor m => Monad m where
-- `join` is analogous to `(+)`
join :: m (m a) -> m a

-- `return` is analogous to `0`
return :: a -> m a

... and an Applicative is like a monoid where we combine functors "horizontally":

-- Greatly modified, but equivalent to, the original type class
class Functor f => Applicative f where
-- `mult` is is analogous to `(+)`
mult :: f a -> f b -> f (a, b)

-- `unit` is analogous to `0`
unit :: f ()

Category theory is full of generalized patterns like these, all of which try to preserve that basic intuition we had for addition. We convert more than one thing into exactly one thing using something that resembles addition and we convert less than one thing into exactly one thing using something that resembles zero. Once you learn to think in terms of these patterns, programming becomes as simple as basic arithmetic: combinable components go in and exactly one combinable component comes out.

These abstractions scale limitlessly because they always preserve combinability, so we never need to layer further abstractions on top. This is one reason to learn Haskell: you learn how to build flat architectures.

Categories: Offsite Blogs