I’ve just updated my web page with links to some new papers that are now available:
- “Homotopical Patch Theory” by Carlo Angiuli, Ed Morehouse, Dan Licata, and Robert Harper. To appear, ICFP, Gothenburg, October 2014. We’re also preparing an expanded version with a new appendix containing material that didn’t make the cut for ICFP. (Why do we still have such rigid space limitations? And why do we have such restrictive pre-publication deadlines as we go through the charade of there being a “printing” of the proceedings? One day soon CS will step into its own bright new future.) The point of the paper is to show how to apply basic methods of homotopy theory to various equational theories of patches for various sorts of data. One may see it as an application of functorial semantics in HoTT, in which theories are “implemented” by interpretation into a universe of sets. The patch laws are necessarily respected by any such interpretation, since they are just cells of higher dimension, and functors must behave functorially at all dimensions.
- “Cache Efficient Functional Algorithms” by Guy E. Blelloch and Robert Harper. To appear, Comm. ACM Research Highlight this fall. Rewritten version of the POPL 2013 paper meant for a broad CS audience. Part of a larger effort to promote integration of combinatorial theory with logical and semantic theory, two theory communities that, in the U.S. at least, ignore each other completely. (Well, to be plain about it, it seems to me that the ignoring goes more in one direction than the other.) Cost semantics is one bridge between the two schools of thought, abandoning the age-old “reason about the compiled code” model used in algorithm analysis. Here we show that one can reason about spatial locality at the abstract level, without having to drop down to the low-level details of how data structures are represented and allocated in memory.
- “Refining Objects (Preliminary Summary)” by Robert Harper and Rowan Davies. To appear, Luca Cardelli 60th Birthday Celebration, Cambridge, October, 2014. A paper I’ve been meaning to write sometime over the last 15 years, and I finally saw the right opportunity, with Luca’s symposium coming up and Rowan Davies visiting Carnegie Mellon this past spring. Plus it was a nice project to get me started working again after I was so rudely interrupted this past fall and winter. It provides a different take on typing for dynamic dispatch that avoids the ad hoc methods introduced for OOP, instead deploying standard structural and behavioral typing techniques to do more with less. This paper is a first cut, a proof of concept, but it is clear that much more can be said here, all within the framework of standard proof-theoretic and realizability-theoretic interpretations of types. It would help to have read the relevant parts of PFPL, particularly the under-development second edition, which provides the background elided in the paper.
- “Correctness of Compiling Polymorphism to Dynamic Typing” by Kuen-Bang Hou (Favonia), Nick Benton, and Robert Harper, draft (summer 2014). Classically, polymorphic type assignment starts with untyped λ-terms and assigns types to them as descriptions of their behavior. Viewed as a compilation strategy for a polymorphic language, type assignment is rather crude in that every expression is compiled in uni-typed form, complete with the overhead of run-time classification and class checking. A more subtle strategy is to maintain as much structural typing as possible, resorting to the use of dynamic typing (recursive types, naturally) only for variable types. The catch is that polymorphic instantiation requires computation to resolve the incompatibility between, say, a bare natural number, which you want to compute with, and its encoding as a value of the one true dynamic type, which you never want but are stuck with in dynamic languages. In this paper we work out an efficient compilation scheme that maximizes statically available information and makes use of dynamic typing only insofar as the program demands it. Of course there are better ways to compile polymorphism, but this style is essentially forced on you by virtual machines such as the JVM, so it is worth studying the correctness properties of the translation, which we do here making use of a combination of structural and behavioral typing.
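To see the overhead in question, here is a toy sketch of a uni-typed target language, my own illustration and not the paper’s actual translation: a bare number injected into the one true dynamic type must pass a run-time class check before you can compute with it again.

```haskell
-- A toy universe of dynamic values, as in a uni-typed target language.
-- (Illustrative sketch only; the paper's compilation scheme is more subtle.)
data Dyn = DNum Integer | DFun (Dyn -> Dyn)

-- Embedding a bare number into Dyn is free of checks...
embed :: Integer -> Dyn
embed = DNum

-- ...but projecting it back out requires a run-time class check, the
-- overhead a careful translation would incur only at variable types.
project :: Dyn -> Integer
project (DNum n) = n
project _        = error "class check failed: expected a number"

-- The polymorphic identity, compiled in uni-typed form: it works on Dyn only.
idDyn :: Dyn -> Dyn
idDyn d = d

-- Instantiating it at the naturals forces an embed/project round trip.
idNat :: Integer -> Integer
idNat n = project (idDyn (embed n))
```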
I hope to comment here more fully on these papers in the near future, but I also have a number of other essays queued up to go out as soon as I can find the time to write them. Meanwhile, other deadlines loom large.
[Update: added fourth item neglected in first draft. Revised formatting. Added links to people. Added a brief summary of the patch theory paper. Minor typographical corrections.]
[Update: the promised expanded version of the forthcoming ICFP paper is now available.]
Someone asked recently whether it's bad to seek "signs" of being trans from the past, and why or why not. This question deserves to be more widely circulated. Within trans circles a fair number of people have an understanding of the situation and its complexity, but it's something I think non-trans circles should also be aware of, especially given the recent publicity surrounding trans lives.
The problems are twofold:
A lot of people look for signs because they're seeking some sort of validation. The problem here is that you end up misinterpreting and overanalyzing your own life in search of that validation. It's not that the past cannot provide validation for your present; it's that doing so misses the point. What we want (more often than not) is acceptance of who we are now and recognition of our current experience. There's more to current identities, pains, and experiences than the past that gave rise to them, so validation can come from sources other than the past. Moreover, it's all too easy for people to "validate" your past while simultaneously invalidating your present, so validation from the past is not stable. None of this is trans-specific: it's a general problem with seeking retrospective validation, and it also applies to people who've suffered abuse, experience mental illness, have changed careers, etc.
The second problem is that, in overanalyzing our pasts in search of validation, we all too often end up reinscribing "standard" trans narratives. If our pasts do not fit the "standard" narrative then we will not find the validation we seek, thus we will call our current understanding even further into question, and this sense of invalidation will only make us feel worse. If our pasts only partially fit the "standard" narrative then, in search of validation, we will highlight those memories and background the others; thus denying ourselves the full actualization of our personal history, and invalidating at least in part who we are. And if our pasts (somehow) completely fit the "standard" narrative then, in holding that history up as "proof" of our legitimacy, we end up marginalizing and invalidating everyone with different narratives. Again, this isn't a trans-specific problem (cf. "standard" narratives of gay lives or depression prior to, say, the 1970s); though it's especially problematic for trans people because of the dearth of public awareness that our narrative tapestries are as rich and varied as cis narrative tapestries.
There's nothing wrong with seeking support for your current self in your past memories. Doing so is, imo, crucial in coming to understand, respect, and take pride in our selves. The problems of retrospection are all in the mindset with which it is pursued. We shouldn't need to rely on "born this way" narratives: however we were born, we are here now, and by virtue of our presence alone we are worthy of respect and validation.
Fwiw, I do very much value my "signs", and often share them as amusing anecdotes, both to foster understanding and to destabilize people's preconceived notions. But I do not seek validation in these signs; they're just collateral: symptoms of, not support for, who I am.
A common pattern for me is iterating over and writing to every slot in a 2D array stored in a vector (having the 2D coordinate is a requirement, so no simple V.map etc.). One question for me is always: what's the fastest way to write such a loop? Since I just came across this very pattern again, I thought I'd quickly do some measurements and also give the newly released 'loop' package a try:

    -- 9.6 / 10.3 FPS
    forM_ [(x, y) | y <- [0..h - 1], x <- [0..w - 1]] $ \(px, py) ->

    -- 2.7 / 10.7 FPS
    numLoop 0 (h - 1) $ \py ->
      numLoop 0 (w - 1) $ \px ->

    -- 9.6 / 11.0 FPS
    forLoop 0 (< h) (+1) $ \py ->
      forLoop 0 (< w) (+1) $ \px ->

    -- 10.7 / 10.8 FPS
    forM_ [0..h - 1] $ \py ->
      forM_ [0..w - 1] $ \px ->
Sorry for the stupid 'frames per second' performance numbers: this loop has a quickly thrown-together Mandelbrot fractal for its body and loops over a texture image buffer to be uploaded to OpenGL for display, so that's the number I have at hand. The first number is for running in 'StateT ReaderT IO', the second is for plain IO. There clearly is a difference between these loops, even with a non-trivial payload in the body. The 'numLoop' one is rather strange; it seems to be way more affected by the transformer overhead than the other solutions. This kind of code tends to perform a lot better with -fllvm, so the fastest way might depend on the backend.
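For reference, the nested-loop pattern under discussion can be made self-contained roughly like this (my own sketch: it uses an IOUArray from the array package bundled with GHC as a stand-in for the texture buffer, and a trivial payload in place of the Mandelbrot computation):

```haskell
import Control.Monad (forM_)
import Data.Array.IO

-- Visit every (px, py) slot of a w-by-h buffer stored flat, writing a
-- value computed from the 2D coordinate; this is the forM_/forM_ variant
-- from the measurements above.
fillBuffer :: Int -> Int -> IO (IOUArray Int Int)
fillBuffer w h = do
  buf <- newArray (0, w * h - 1) 0
  forM_ [0 .. h - 1] $ \py ->
    forM_ [0 .. w - 1] $ \px ->
      -- Flat index py * w + px; the payload (px + py) stands in for the
      -- per-pixel fractal computation.
      writeArray buf (py * w + px) (px + py)
  return buf
```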
Is there some accepted best way of doing this?

submitted by SirRockALot1
Does every monad transformer t have a map t (t I) a -> t I a?
I'm using I as an abbreviation for Identity.
I've checked this is indeed the case for MaybeT, EitherT/ErrorT, WriterT, StateT, ContT. Is there a sensible monad transformer that doesn't support this transformation?
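For a concrete instance, here is what that map looks like for MaybeT over Identity (a self-contained sketch: the MaybeT newtype below is a local stand-in for the one in the transformers package):

```haskell
import Control.Monad (join)
import Data.Functor.Identity (Identity(..))

-- Local stand-in for Control.Monad.Trans.Maybe.MaybeT, to keep this
-- sketch dependency-free.
newtype MaybeT m a = MaybeT { runMaybeT :: m (Maybe a) }

-- The map in question, t (t Identity) a -> t Identity a, for t = MaybeT:
-- peel both wrappers and collapse the doubled Maybe with the ordinary
-- Maybe join.
squash :: MaybeT (MaybeT Identity) a -> MaybeT Identity a
squash (MaybeT (MaybeT (Identity mma))) = MaybeT (Identity (join mma))
```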
(NB: I previously submitted this with an incorrect type signature, so I resubmitted.)

submitted by tomejaguar
Lindley, Wadler, and Yallop published The Arrow Calculus way back in 2009, and though this work appears to significantly advance reasoning about and expression of computation with arrows beyond Paterson's refinements (represented in the current Control.Arrow module) on Hughes' foundational work, I can't find any Haskell implementations of it. Does such code exist?

submitted by ribald_eagle
An aside (get used to them, by the way): people are never sure if I'm serious.
I'm, like, seriously? C'mon!
I am always silly (well, oftentimes silly, compared to my dour workmates), but I am always serious in my silliness.
Chesterton said it well: the opposite of funny is not serious: the opposite of funny is not funny.
Okay, that aside is done. Now onto the topic at hand.
Hands up, those who have ever used your education in your jobs. Hands up, those who needed proof of your degree to prove competency as a prerequisite of employment.
(sound of crickets.)
Actually, more of you might raise your hands to me than to most of my colleagues, because why? Because I have close ties to academe, that's why. So there are more than a few of you who are required to have your Master's degree or, those of you who are post-docs, to have your Ph.D. to get that research position or grant that you're working on.
The rest of the world?
Education, in the real world, is a detriment to doing your job.
Across the board.
Do you know how many Ph.D.s we fire? Do you know how many Ph.D.s and matriculated students we turn away, because of their education and their lack of real-world experience?
We did a survey: a sure bellwether for a prospective employee? The amount of their education: the more they have, the more likely they are to be useless on the job.
It's the Ph.D. disease: Ph.D., "piled high and deep." People get ed-ju-ma-kated, and then they think that because they have a sheepskin, or a certain GPA at a certain school, and know a certain ... educated way of doing things, they know how to do this-and-that other thing, totally unrelated, on the job.
"I've studied Martin-Löf's intuitionistic type theory."
"Great, how do you connect to our database?"
You'll bridle at this, but you know who agrees most strongly with me?
I went to ICFP 2006 specifically looking to roll dependent types into the programming language I was using in industry, and I could not get the time of day from a single implementer of the type theory in Twelf, and you know why?
"Why would you be interested in this? This is purely theoretical."
Uh, huh. As in "not applicable to industry."
I reread the paper again, later: "Dependent types make programs easier to understand and more reliable."
And why would I want that in my programs in industry, where it mattered.
I spent four years at the United States Coast Guard Academy: 10,000 students apply, only 300 are allowed in, only 150 graduate, each year. It cost, at the time, $250,000 to graduate a student from the Academy, making it the most expensive school in the nation.
Thank you, taxpayers, for my education.
I graduated with a dual major: mathematics and computer science.
How much of my education did I use on the job to save 150 lives from a capsized Arctic research exploration vessel (boy, they surely used their education, didn't they! ... to capsize their ship), act as the translator when we boarded Japanese trawlers, provide civil rights education and mediate EEO complaints and ...
None. Zip. Zilch.
After my stint, how much of my college education did I use on the job.
My job was encoding matrices in FORTRAN. How much FORTRAN did I study in college?
Zip. Zilch. Nada.
How much of the Advanced Calculus II did I use on the job.
People, it was frikken matrix manipulation! Something you can (now) look up on Wikipedia and pick up in, oh, two hours if you're really slow.
All these things (Python, Ruby on Rails) can be taught in college; they could even be taught in high school. But did I learn them in school?
No, I did not. I learned them on my own, thank you very much.
Design patterns, frameworks, data structures. Do educated people know these things?
Some of them do. Most people with a 'computer science' degree DO NOT, people.
They do not. They woefully do not, as comp sci teachers lament over and over again, and as I, the hiring manager, scratch my head wondering what, precisely, did these kids learn in school, because, insofar as I see, they did not learn abstraction, polymorphism, typing, or data structures.
They learned the if-statement.
They learned the if-statement that n-dispatches within a for-loop from 1 to n off an array. That's the data structure they know: the array.
We got you covered: the if-statement.
Well, there's always main(String[] args), with a big-ole if-statement.
The word on the street is that education is a canard at best, a detriment more often than not, and at worst a project-sinker.
That's a shame, because there are educated people who are productive and smart and effective in their field, and can help.
How to Solve It: Modern Heuristics claims that one billion dollars is wasted on software because it's written in the absence of very simple techniques, such as linear programming.
Our authors, Michalewicz and Fogel, are off. Way off. I know. By a factor of at least one-hundred.
We wasted a billion dollars on a file-indexing system for the FBI. Oh, it had email, too. Government project. Never used because it was never delivered in a useable state. Do you know how many go through that cycle?
I don't. I do know I've seen project after project just ...
God. The waste.
And I have friends. And they tell me stories.
But, you get MITRE in there, or you get a Ph.D. or two in there, and what happens?
They study the issue.
They study it for six-plus months, and then they write you a nice little white paper that states the obvious search criteria you knew from day one. But what do they have to say? "Where are your ROC analyses?" So your Bayesian system that was cranking out results month after month gets killed by the bean-counting pointy-heads, and do they submit a working solution that could go into production? Oh, no. They submit a white paper calling for a research grant to allow for a year of surveying and further study of the issue.
Then they get fired, or they move on to more interesting research areas, leaving us behind to clean up the mess and get a working system out the door in some serviceable shape that uses zero percent of their research.
You see, they modeled the situation, but the model doesn't fit the data, which is raw and dirty, so their solution solved the model, not your problem, not even close.
How much have you used your degree on your job?
If you're a researcher, you probably use quite a bit of what you've studied in your research, and you are contributing more to your field of study.
If you're not, then you're told this, day one, on your job: "Friend, put those books away, you're never going to use them again."
I mean, seriously: did you really bring your college books to your job thinking you'd use them?
This, here, is the real world. The ivory tower is for the academics. In the real world, you roll up your sleeves, get to work, and get some results; because if you don't, the door is right over there.
You were expecting to use your degree on your job? This is America, people. We don' need no edjumakashun.
Now, if this were Soviet Russia, your degree uses you.
So, silliness, and serious silliness aside.
You were expecting to use your degree on your job?
English major. This is America, we don't talk English, we talk American, so, that's nice that you have that degree.
Mathematics major. This is America, we don't do 'maths,' nor trig, nor geometric algebras, nor category theory, how would you use any of that on your job?
I was seriously asked that on my interview for a job overseeing a 1.7 petabyte-sized database.
I said: 'uh, map and reduce come straight from category theory.'
"Yes, but how do you use that on your job?"
We both blinked at each other dumbly.
You don't go to school to get trained to do a job well, ladies and gentlemen.
I mean, too many of you do that, and too many others go to school to party some, to sex some, to blaze some, and then get to work after your folks financed your four-plus-year bacchanal.
College is not a technical training-institute and has nothing to do with acquiring skills or proficiency on repetitive stress disorder, oh, I meant: 'your job.' Your job, almost without exception, can be just as proficiently performed by nearly anyone they drag off the street and put in your chair for eight hours a day. They sit in your chair for a few days, and everyone else won't even know you're gone.
My wife beat the pants off her entire payroll division with an excel spreadsheet because they didn't have simple accounting principles and deductive reasoning. Why? Because they were well-regulated at their jobs, proficient at it, in fact, and their job was to make continuous clerical errors because they had absolutely no rigor. Why would they? They weren't paid for rigor. They were paid for doing their jobs, which was: don't make waves.
I regularly go into situations where other software engineers (a misnomer, they are more like computer programmers, not engineers) say such-and-so cannot be done in programming language X.
Then, I implement a little bit of category theory, in programming language X, do some simple mappings and natural transformations, and, voilà! those 50,000 lines of code that didn't solve the problem but only made things worse? I replace all that with 500 lines of code that actually delivers the solution.
Unit tested: all the edge cases.
And meeting their requirements, because I've translated the requirements into a declarative DSL on top of their programming language X.
Of course they couldn't solve the insurmountable problem in programming language X, not because they were using programming language X (although it helped, the colossal fail being object-disoriented and unprovably/mutatively impure), but because they couldn't think outside the box of 'you can only do this and that' as a software engineer. They were caught in their own domain and couldn't even see that they had boxed themselves in.
Because they were educated that way. Comp Sci 101: this is how you write a program. This is the 'if'-statement. This is the for-loop. If that doesn't work, add more if-statements wrapped by more for-loops, and this statement is perfectly acceptable:
x = x + 1
Go to town.
That's what their education gave them: they went to school to acquire a trade and a proficiency at the if-statement, and gave up their ability to see and to think.
And some, many, academics are the most bigoted, most blundering blinders-on fools out there, because they see it their way, and they see their way as the only way, which requires a six-month research grant and further study after that.
With co-authorship on the American Mathematical Society journal article.
And the uneducated are the worst, most pigheaded fools out there, so sure that the educated have nothing to offer, that they have no dirt under their perfectly manicured fingernails attached to silky-smooth hands that have never seen an honest day's work nor, God forbid! a callus, so what do they know, these blowhards? So the uneducated ignore the advances of research into type theory, category theory, object theory (polymorphism does help at times), any theory, and just code and code and code until they have something that 'looks good.'
How to solve this?
Start with you.
Not with your education, that is: not with your education that tells you who you are.
Start with how you can help, and then help.
- Project one: I saw how fractal dimensionality would solve a spectrum analysis problem. Did I say the words 'fractal' or 'dimensions'? No. I was working with real-programmers. If I asked if I could try this, do you know what they would say?
Pfft. Yeah, right. Get back to work, geophf!
But, instead, I implemented the algorithm. I sat with a user who had been working on those signals and knew what he needed, and iterated through the results with him for a week.
Just a week. While I did my job-job full time. I did the fractal spectrum analysis on my own time.
My 'thing' floored the software management team. They had only seen straight-line approximations before. They thought I was doing actual signal analysis. I mean: with actual signals.
They showed my 'thing' to the prospective customer. And got funded.
- Another project: data transformation and storage. I built a system that encompassed six hundred data elements, using a monadic framework to handle the semideterminism. That was an unsolvable problem in Java.
I used Java.
Java with my monadic framework, yes, but Java, to solve the problem.
- Third project: calculating a 'sunset date' over a data vector of dimension five over a time continuum.
Unsolvable problem. Three teams of software developers tackled it over six months. Nobody could get close to the solution.
I used a comonadic framework.
It took me, along with a tester who was the SME on the problem and a front-end developer who got the presentation layer just right, about a month, and we solved that baby and put it to bed.
Unit tested. All edge cases.
Did I tell them I used a comonadic framework?
Nah, they tripped over themselves when they saw the word 'tuple.'
No joke, my functional programming language friends: they, 'software engineers,' were afraid of the word 'tuple.'
So I explained as much as anyone wanted to know when anyone asked. I wrote design documents, showing unit test case results, and they left me alone. They knew I knew what I was doing, and I got them their results. That's what they needed.
They didn't need my degree.
They didn't need to know I used predicate logic to optimize SQL queries that took four fucking hours to run to a query that took forty-five seconds.
They didn't need to know I refactored using type theory: that A + B is a disjoint (sum) type, A * B is a product type, and B ^ A is a function type, so that I could look at a program, construct a mathematical model of it, and get rid of 90% of it, because it was all redundantly-duplicated code inside if-clauses. I simply factored (2A + 2B ... ad nauseam) into 2(A + B ...), and then used a continuation, for God's sake, with 'in the middle of the procedure' code, or, heaven help me, parameterization over a simple functional decomposition exercise, to reduce a nightmare of copy-and-paste to something that had a story to tell that made sense.
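The factoring step alluded to above, 2A + 2B = 2(A + B), with 2 read as Bool, can be spelled out as a pair of mutually inverse functions (my own illustrative sketch, not code from any of the projects described):

```haskell
-- Type algebra in Haskell: A + B is Either a b, A * B is (a, b), and 2 is
-- Bool. Distributivity says a Bool duplicated across both branches of a
-- sum can be factored out, exactly the copy-and-paste removal described.

-- 2*A + 2*B  ->  2*(A + B)
forward :: Either (Bool, a) (Bool, b) -> (Bool, Either a b)
forward (Left  (flag, a)) = (flag, Left a)
forward (Right (flag, b)) = (flag, Right b)

-- 2*(A + B)  ->  2*A + 2*B
backward :: (Bool, Either a b) -> Either (Bool, a) (Bool, b)
backward (flag, Left  a) = Left  (flag, a)
backward (flag, Right b) = Right (flag, b)
```

Because the two sides carry exactly the same information, the branch bodies only need to be written once after factoring.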
How do you connect to a database?
Do you need a college degree for that?
Kids with college degrees don't know the answer to that simple interview question.
And they don't know the Spring framework, making 'how to connect to a database' a stupid-superfluous question.
They don't know what unit tests give them. They don't know what unit tests don't give them. Because they, college kids and crusty old 'software engineers,' don't write them, so they have no consistency nor security in their code: they can't change anything here because it might break something over there, and they have no unit tests as a safety net to provide that feedback to them, and since they are programming in language X, a nice, strict, object-oriented programming language, they have no programs-as-proofs to know that what they are writing is at all good or right or anything.
A college degree gives you not that. A not college degree gives you not that.
A college degree is supposed to what, then?
It's supposed to open your mind to the possibility of a larger world, and it's supposed to give you the tools to think, and to inquire, so that you can discern.
"This, not that. That, and then this. This causes that. That is a consequence of this. I choose this for these reasons. These reasons are sound because of these premises. This reason here. Hm. I wonder about that one. It seems unsound. No: unfamiliar. Is it sound or unsound? Let me find out and know why."
English. Mathematics. Art. Literature. Music. Philosophy. All of these are the humanities. The sciences as well. Law. Physics. All of these lead one to the tools of inquiry.
In school, you are supposed to have been given tools to reason.
Then, you're sent back out into the world.
And then you are supposed to reason.
And with your reason, you make the world a better place, or a worse place.
These things at school, these are the humanities, and they are there to make you human.
Not good at your job, not so you can 'use' your degree as a skill at work, but to make you human.
And, as human, are you good at your job?
And, as human, do you make your world a place such that others are good and happy at their jobs?
The end of being human is not to be skilled, nor proficient ... 'good' at your job.
But it's an accident of it, a happy accident.
The 'end' of being human?
Well: that's your inquiry.
That's what school, that's what everything, is for: for you to answer the unanswered question.
And, if you accept that, and are fully realized as a human being, then your way is the best way in the world, and your way has the ability to change lives. First, your own, then others. Perhaps your coworkers.
Perhaps hundreds of others.
Perhaps you will change the entire world.
But you won't know that until you take that first step of inquiry.
Then the next.
Then the next.
And you look back, and you see how far you've come, and ... wow.
That's what school is for. Not for your job.