Rounds 1 – 5

A summary of what has happened so far, our lessons, and our plans for the second half of the 2015 impact purchase.

The story so far

We have finished 5 rounds of the impact purchase.

The net transfers that have occurred so far:

  • We paid $200 for 17% of Ben Kuhn’s post on donation matching.
  • We paid $4700 for 4.46% of Ryan Carey and Brayden McLean’s work organizing EA Melbourne.
  • We paid $100 for 0% of Oliver’s organization of HPMOR wrap parties.
  • Owen Cotton-Barratt paid $400 for 33% of Ben’s post on donation matching.
  • Owen paid $200 for 0.4% of Ryan and Brayden’s work on EA Melbourne.
  • Larks paid $2k for 19.8% of Oliver’s organization of HPMOR wrap parties.

These purchases imply prices of:

  • $1200 for Ben’s post on donation matching
  • About $10,000 for Oliver’s work organizing HPMOR wrap parties
  • About $50,000 to $100,000 for Brayden and Ryan’s work organizing EA Melbourne
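
For readers who want to check the arithmetic, each implied price is just the amount paid divided by the fraction purchased. A quick sketch of that calculation using the net-transfer figures above (small discrepancies with the rounded prices come from rounding):

    # Implied full-project price = amount paid / fraction purchased.
    purchases = [
        ("Ben's donation matching post", 200, 0.17),
        ("EA Melbourne (our purchases)", 4700, 0.0446),
        ("EA Melbourne (Owen's purchase)", 200, 0.004),
        ("HPMOR wrap parties (Larks's purchase)", 2000, 0.198),
    ]
    for name, paid, fraction in purchases:
        print(f"{name}: ~${paid / fraction:,.0f}")
    # Prints roughly $1,176, $105,381, $50,000, and $10,101 respectively.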

These prices are likely to change, and in some cases have already changed, as the projects age and more information is available about their long-term impacts. Overall our evaluations are intended to be conservative, such that we expect them to increase over time as information becomes available (and such that older projects will tend to have higher prices).

Going forward

We intend to take a break for a few months, and make three more $2,000 rounds of funding at the end of October, November, and December. We’ll make a post with our revised plans at the beginning of October.

We’d love any feedback on the process. What do you think we should do differently in the upcoming rounds? (See “lessons” below for some of our thoughts.)

What to make of these prices

From a seller’s perspective, we estimate that these prices imply values of time on the order of $50-$200 per hour.

We made crude estimates of each intervention’s impact (here, here), suggesting that these activities cost about $2 per dollar of EA donations stimulated. Our estimates were very noisy and tended to be pessimistic.
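
As a rough illustration of where a figure like $2 per dollar comes from, you can compare the implied prices above with the round-one impact estimates quoted later in this post (about $600 of stimulated donations for Ben’s post and about $5,000 for the wrap parties):

    # Ratio of implied certificate price to estimated EA donations stimulated,
    # using the round-one evaluations quoted later in this post.
    projects = [
        ("Ben's donation matching post", 1200, 600),   # (implied price, stimulated donations)
        ("HPMOR wrap parties", 10000, 5000),
    ]
    for name, price, stimulated in projects:
        print(f"{name}: ${price / stimulated:.2f} per stimulated dollar")
    # Both come out at roughly $2 of price per dollar of stimulated donations.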

These prices are not directly comparable to estimates of “marginal good done per dollar” for most charities, which value labor at non-profit salaries rather than at its opportunity cost. A more conservative approach would use foregone wages instead, which can easily be several times larger. Given that an EA working at a non-profit decided it wasn’t worth earning to give, even this much higher estimate presumably understates the real opportunity cost of labor at EA organizations.

By contrast, if recipients reported honestly (it’s not clear if they did), then the prices in the last section are at least the all-things-considered costs of the activities. In most cases they are substantially above those costs, since we used a second-price auction and had only a few submissions.
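
To spell out why a second-price rule pushes prices above costs when there are few submissions: the winning seller is paid according to the next-best competing offer rather than their own ask. A minimal sketch of that rule, with hypothetical numbers and a simplified version of the mechanism (not our exact rules):

    # Second-price rule for buying impact: sellers ask a price per unit of
    # estimated impact; the cheapest offer wins but is paid at the runner-up's
    # rate, so a truthful seller is always paid at least their own ask.
    offers = [
        ("seller_a", 1.0),  # asks $1.00 per unit of estimated impact
        ("seller_b", 2.5),  # asks $2.50 per unit of estimated impact
        ("seller_c", 4.0),
    ]
    offers.sort(key=lambda o: o[1])
    winner, runner_up_rate = offers[0][0], offers[1][1]
    print(f"{winner} wins and is paid ${runner_up_rate:.2f} per unit of impact")
    # With only a few offers, the runner-up's rate can sit far above the winner's cost.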

We expect that these prices are higher than the average all-things-considered costs of accomplishing similar good in existing EA organizations, and are probably higher than the marginal costs. If you believe that, then the only reason to fund certificates at this stage is to explore an alternative funding mechanism. But we don’t think the comparison is a slam dunk, and we are optimistic that costs will fall further, at which point purchases could be competitive with marginal spending at other organizations.

Lessons

Guaranteeing incentive compatibility in dynamic two-sided markets is essentially impossible. We shouldn’t have tried, and going forward we will dismantle this guarantee as quickly as we can without undermining our existing commitments. We will continue to make our own decisions altruistically, we will encourage other funders to do the same, and we will avoid using our information to extract the lowest possible price. But we won’t try to make formal guarantees.

We were very grateful to Larks and Owen for participating in the experiment. We’re optimistic that a thicker market for certificates would be able to attract additional funding, so we see the largest bottleneck as finding additional sellers. Our guess is that the current sellers represent a small fraction of in-principle-available projects.

We felt guilty publishing sloppy evaluations of interventions that people care about. That said, we don’t think it did much damage, and we don’t feel too guilty (which was a concern in advance), so overall this was a positive update in favor of the current level of transparency. However, participants seemed at least a bit unhappy about the prospect of transparency for unfunded projects, so we will give people a clearer opportunity to submit without committing to publication if their project goes unfunded.

The majority of our submissions were for outreach activities, and we think it would have been helpful if more people had been willing to submit research. We don’t have any clear ideas for improving this, though some of the other changes may help.

At least a handful of people were deterred by the cost of applying coupled with uncertainty about our evaluations. We will probably alter the process to involve lower-effort “expressions of interest” by would-be sellers, followed by a more extensive discussion if we are interested. We may also be more proactive in asking people if they are interested in selling. We will probably also start to talk with people prospectively about our likely willingness to buy work that they are considering doing.

As expected, we didn’t have enough time to make reasonable evaluations and so had to make very sloppy ones. But we felt happy enough distributing money on the basis of these rough evaluations that, overall, we updated in favor of the viability of funding work on the basis of crude evaluations.

Round 2

Round 2 of the impact purchase is over. At the deadline, we had twelve submissions.

This round, we are buying a certificate for 1/70th of Ryan Carey and Brayden McLean’s founding of and involvement in EA Melbourne during 2013, for $1000.

The deadline for applications to round 3 is May 25th. Apply here!

Below is our evaluation of EA Melbourne. As usual, this is a very rough and quick evaluation. No one involved with EA Melbourne has endorsed this as an account of their impact, and we expect that this evaluation would change significantly if we put in more time.

Evaluation of Founding EA Melbourne

We think the EA community in Melbourne has had a significant impact, with many people either making large donations or working on EA projects full time who would not have done so otherwise. We’re evaluating this impact at around $100k per year, even counting only donations plus replacement costs at EA organizations, which we expect to significantly understate the real impact.

Many of these effects seem likely to be long-lasting, and we feel comfortable extending the benefits over at least 5 years.

It is much harder to attribute these impacts to the formal EA Melbourne group, as compared to the informal community, the LW group, TLYCS, the actions of individual EAs in Melbourne, and so on. We had a few conversations to try to get a better sense of this allocation of responsibility. In the end we certainly didn’t get a confident answer, but we got a vague intuitive feeling for the situation.

Based on this impression, we allocated 10% of the responsibility to the formal organization of EA Melbourne. Interpreted narrowly we think this is more likely to be an overestimate than an underestimate.

But we think that there are other benefits from EA Melbourne that can justify this estimate, and which will tend to be on the same scale as the direct effect. For example, while the broader EA community in Melbourne clearly deserves much of the credit for the effect so far, EA Melbourne looks like it will have a big impact in building and growing that community in the future. And to the extent that other organized communities and the online presence of EA played a big role, EA Melbourne seems to have made similar contributions back to the broader EA community.

Our estimates concern the impact of EA Melbourne during its first 6 months. We assumed that Brayden and Ryan were almost entirely responsible for the founding of EA Melbourne. People other than Brayden and Ryan were clearly involved with the operation of EA Melbourne over this period. Conversely, EA Melbourne continued to exist after this initial period and it seems likely that its founding will continue to have positive impacts going forward beyond those already mentioned. In the end we called this a wash.

Overall, we evaluated EA Melbourne at $70k in stimulated EA donations. In this round, we purchased impact at a rate of one dollar per dollar of stimulated EA donations, implying a total price of $70k for the project, of which we bought a 1/70th share for $1000. This rate was about half of that paid in the previous round.
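
A rough sketch of how these figures fit together (the post does not spell out exactly how the adjustments combine, so treat the intermediate numbers as illustrative rather than as our exact reasoning):

    # Narrow estimate of the direct effect attributed to the formal group:
    direct = 100_000 * 5 * 0.10         # $100k/year * 5 years * 10% responsibility = $50k
    # After crediting the additional benefits discussed above, we valued the
    # project at $70k of stimulated donations, priced it at $1 per stimulated
    # dollar, and bought a 1/70th share.
    total_price = 70_000 * 1.0          # $70,000 for the whole project
    paid_this_round = total_price / 70  # $1,000 for a 1/70th certificate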

Some notes on our evaluations:

We are more concerned with future generations than with current suffering, and have valued “dollars moved to GiveWell top charities” several times lower than “EA donations stimulated.” We welcome submissions with both kinds of benefits, but wanted to explain why they have not been purchased so far. We do believe that donations to GiveWell’s top charities have positive direct and indirect effects on the future, and we also value reductions in current suffering. But these benefits seem substantially smaller than the direct and indirect benefits of more focused interventions, or of organic growth of the EA movement that generates additional donations.

Some submissions involved impacts that have not yet materialized, for example positive effects on people who haven’t yet done anything differently as a result, but who may do something differently in the future. For the most part, we have held off on purchasing these.

We continue to evaluate influencing people at a discount based on an uncertain allocation of responsibility. But many of our submissions have been for EA outreach, and they have been the cheapest submissions despite this discount.

First round results

The first round of the 2015 impact purchase had eight submissions, including research, translation, party planning, mentoring, teaching and money to GiveDirectly.

We expected the evaluations would have to be rough, and would like to emphasize that they really were rough: we had to consider lots of things very quickly to get through them in a reasonable time for the scale of the funding. Please forgive us for our inaccuracies, and don’t read too much into our choices!

This round, we are buying certificates of impact for:

  • 50% of Ben Kuhn’s post on donation matching, for $600
  • 3% of Oliver Habryka’s organization of HPMOR wrap parties, for $400

What does this mean? If everything is working correctly, it suggests that for about $1,200 you can buy an investigation as good as Ben’s. And if you can make an investigation as good as Ben’s, it suggests you can get $1,200 for it.

It does not mean that we think 50% of the blog post is worth 1.5x as much as 3% of the wrap party organization—the price mechanism is more complicated than that (read about it here).

(Note that these prices should include more costs of the labor than are usually accounted for when paying for altruistic projects. Usually if someone pays me to write an EA blog post, say, I am willing to do it for less than what I consider the value of my time, because I also want the blog post to be written. These prices are designed to be the full price without this discounting.)

The submissions

Here are all of the submissions so far. Everything not bought in this round can still be bought in the next rounds:

  1. Teaching at SPARC in 2014 (50%), Ben Kuhn
  2. Post “Does Donation Matching Work?” (50%), Ben Kuhn
  3. Inducing the translation of many papers and posts by Bostrom, Yudkowsky and Hanson into Portuguese, as part of IERFH (40%), Diego
  4. A donation of $100 to GiveDirectly, Telofy
  5. Research comparing modafinil and caffeine as cognitive enhancers, including these blog posts (50%), Joao Fabiano
  6. A chapter of a doctoral thesis defending a spin-off version of Eliezer’s complexity of value thesis (20%), Joao Fabiano
  7. Organization of Harry Potter and the Methods of Rationality wrap parties, including organization of the Berkeley party and central organization of other parties (50%), Oliver Habryka
  8. Mentoring promising effective altruists (50%), Oliver Habryka

The evaluations

Too hard to evaluate

We decided not to evaluate teaching at SPARC, inducing the translation of papers, or mentoring. Paul’s involvement in SPARC made buying teaching there complicated, and it would already have been difficult to separate the teaching from others’ work on SPARC. Inducing the translation of papers also seemed too hard to separate from actually translating the papers, without much more access to exactly what happened between the participants. The value of mentoring EAs seemed too hard to assess.

Purchased projects

We evaluated the other five projects; after a first pass it looked as though we would buy the two that we ultimately did, so we then evaluated those two somewhat more thoroughly. Here are summaries of our evaluations of them.

Ben Kuhn’s blog post on donation matching
  1. We estimate that EAs and other related groups move around $500k annually through donation matching. We are thinking of drives run by MIRI, CFAR, GiveDirectly, Charity Science, Good Ventures, among others.
  2. We think a full and clear understanding of donation matching would improve this by around $6k, through such drives being better optimized. We lowered this figure to account for the data being less relevant to some matching drives, and costs and inefficiencies in the information being spread.
  3. We think this work constitutes around 1/30th of a full and clear understanding of donation matching.
  4. We used a time horizon of three years, though in retrospect it probably should have been longer. This implicitly included some general concerns about the fraction of people who have seen it being smaller in the future, and information accruing from other sources and conditions changing, and so on.
  5. We get $6,000 * 3 years / 30 = $600 of stimulated EA donations.
Oliver Habryka’s organization of HPMOR wrap parties
  1. We estimate that around 1300 people went to wrap parties (adjusted somewhat for how long they were there for). This was based on examining the list of events and their purported attendances, and a few quick checks for verification.
  2. We estimated Oliver’s impact was 1/4 of the impact of the wrap parties. We estimated that the existence of central organization doubled the scale of the event, and we attributed half of that credit to the central organization and half to other local organizers and non-organizational inputs (which also had to scale up).
  3. We estimated that the attendance of an additional person was worth around $15 of stimulated EA donations. This was a guess based on a few different lines of reasoning. We estimated the value of the EA/LW community in stimulated donations, the value of its annual growth, the fraction of that growth that comes from outreach (as opposed to improving the EA product or natural social contact), and the fraction of outreach that came from the wrap parties. We also guessed what fraction of people were new, would become more involved in the EA/LW community as a result, and would end up doing more useful things on our values as a result of that. We sanity checked these numbers against the kind of value participants probably got from the celebration individually.
  4. Thus we have 1300 * $15 / 4 = $4,875 of stimulated EA donations, which we rounded up to $5,000.
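
Putting both estimates in one place (this just reproduces the arithmetic above):

    # Ben's post: annual improvement to matching drives * time horizon * share of a full understanding
    bens_post = 6_000 * 3 / 30           # = $600 of stimulated EA donations
    # Wrap parties: attendees * value per attendee * Oliver's share of the credit
    wrap_parties = 1_300 * 15 * (1 / 4)  # = $4,875, rounded up to $5,000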

Note that while we evaluated both items in terms of dollars of stimulated EA donations, these numbers don’t have much to do with real dollars in the auction—their only relevance is in deciding the ratio of value between different projects. So systematic errors one way or the other won’t much matter.

For both projects, we paid more than a dollar for each estimated stimulated dollar of EA funding. This suggests we are getting a bad deal. However, we are actually pretty happy with our purchases: our sense is that our evaluations were fairly pessimistic, and that the projects were worth at least what we paid. We invite you to judge for yourself what the projects were worth.

Notes on our experience

Quick estimates

It was tough to evaluate things quickly enough to be worthwhile given how little we were spending, while still being meaningfully accurate. To some extent this is just a problem with funding small, inhomogeneous projects. But we think it will get better in the future for a few reasons, if we or others do more of this kind of thing:

  1. Having a reference class of similar things that are already evaluated makes it much easier to evaluate a new project. You can tell how much to spend on a bottle of ketchup because you have many similar ketchup options which you have already judged to be basically worth buying, and so you mostly just have to judge whether it is worth an extra $0.10 for less sugar or more food dye or whatever. If you had never bought food before and had to figure out from first principles how much a bottle of ketchup would improve your long term goals, you would have more trouble. Similarly, if we had established going prices for different kinds of research blogging, it would be easier to evaluate Ben’s post relative to nearby alternatives.
  2. We will cache many parts of the analysis that come up often (e.g. how much is it worth to attract a new person to the EA movement?), and we will only make comparisons between similar activities.
  3. We will get better with practice.

Shared responsibility

We said we would not buy certificates for collaborative projects unless the subset of people applying had been explicitly allocated a share of responsibility for the project. The distinction between collaborative and individual projects turned out to be fairly unclear. No project was creating objects of ultimate value directly; all of these projects are instrumental steps, to be combined with other people’s instrumental steps, to make further, bigger instrumental steps. Is a donation to GiveDirectly its own project, or is it part of a collaboration with GiveDirectly and their other donors? Happily, we don’t care. We just want to be able to evaluate the thing we are buying. So we were willing to purchase a donation to GiveDirectly from the donor, but not to purchase the output of a cash transfer from a GiveDirectly donor. In some cases it is hard to assess the value of one intermediate step in isolation, and in those cases we will be less likely to purchase it (or will purchase it only at a discount).

Call for more proposals

The next deadline will be April 25. If you have any finished work you’d like to partially sell, please consider applying!

First impact purchase deadline

We are approaching the deadline for the first round of funding (March 25th).

If you have not yet applied, consider doing so! The application should take less than twenty minutes, and you will probably make a substantial profit as well as helping to make the world better.

I’m told it would be good to have more examples of things to sell. I’m not selling things, because I’m one of the ones buying them. But if I were selling things, here are some examples of things I might try to sell:

  • This blog post about the altruistic value of being vegetarian
  • These investigations into improving personal effectiveness via typing faster and taking drugs
  • Work I did on AI Impacts before it was at MIRI, e.g. an interview with an AI researcher that I haven’t put up yet
  • Organization of the High Impact Philosophy discussion group

Don’t misunderstand: I like all of these things. But there is some amount of money I would prefer to each of them, and I think it looks plausible that I could get it (were I on the selling side).

If you have questions, please ask them in the comments—we will endeavor to answer.

This blog will have updates on the process and results of each round. It will also have reminders about future deadlines. If you’d like to hear about any of these things, we also now have an RSS feed (see left).