Rounds 1–5

A summary of what has happened so far, our lessons, and our plans for the second half of the 2015 impact purchase.

The story so far

We have finished 5 rounds of the impact purchase.

The net transfers that have occurred so far:

  • We paid $200 for 17% of Ben Kuhn’s post on donation matching.
  • We paid $4,700 for 4.46% of Ryan Carey and Brayden McLean’s work organizing EA Melbourne.
  • We paid $100 for 1% of Oliver’s organization of HPMOR wrap parties.
  • Owen Cotton-Barratt paid $400 for 33% of Ben’s post on donation matching.
  • Owen paid $200 for 0.4% of Ryan and Brayden’s work on EA Melbourne.
  • Larks paid $2,000 for 19.8% of Oliver’s organization of HPMOR wrap parties.

These purchases imply prices of:

  • $1200 for Ben’s post on donation matching
  • About $10,000 for Oliver’s work organizing HPMOR wrap parties
  • About $50,000 to $100,000 for Brayden and Ryan’s work organizing EA Melbourne

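Each implied total follows from dividing the amount paid by the fraction purchased. A minimal sketch of that arithmetic, using the figures from the purchase list above (the labels are ours, for illustration):

```python
# Implied total price of a project = amount paid / fraction purchased.
# Figures are taken from the purchase list above; labels are illustrative.
purchases = [
    ("Ben's donation-matching post (us)", 200, 0.17),
    ("HPMOR wrap parties (Larks)", 2000, 0.198),
    ("EA Melbourne (us)", 4700, 0.0446),
    ("EA Melbourne (Owen)", 200, 0.004),
]

implied = {name: paid / fraction for name, paid, fraction in purchases}
for name, price in implied.items():
    print(f"{name}: ${price:,.0f}")
```

The two EA Melbourne purchases imply roughly $105,000 and $50,000 respectively, which is where the $50,000 to $100,000 range comes from.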
These prices are likely to change, and in some cases have already changed, as the projects age and more information is available about their long-term impacts. Overall our evaluations are intended to be conservative, such that we expect them to increase over time as information becomes available (and such that older projects will tend to have higher prices).

Going forward

We intend to take a break for a few months, and make three more $2,000 rounds of funding at the end of October, November, and December. We’ll make a post with our revised plans at the beginning of October.

We’d love any feedback on the process: what do you think we should do differently in the upcoming rounds? (See “lessons” below for some of our thoughts.)

What to make of these prices

We estimate that these prices imply values for sellers’ time on the order of $50-$200/hour.

We made crude estimates of each intervention’s impact (here, here), finding that these activities cost about $2 per dollar of EA donations stimulated. Our estimates were very noisy and tended to be pessimistic.

These prices are not comparable to estimates of “marginal good done per dollar” for most charities, which price labor at non-profit salaries rather than at its opportunity cost. A more conservative approach would use foregone wages instead, which can easily be several times larger. Given that an EA working at a non-profit decided it wasn’t worth earning to give, even this much higher estimate presumably understates the real opportunity cost of labor at EA organizations.

By contrast, if recipients reported honestly (it’s not clear whether they did), then the prices in the last section are at least the all-things-considered costs of the activities. In most cases they are substantially above those costs, since we used a second-price auction and had only a few submissions.

We expect that these prices are higher than the average all-things-considered costs of accomplishing similar good in existing EA organizations, and are probably higher than the marginal costs. If you believe that, then the only reason to fund certificates at this stage is to explore an alternative funding mechanism. But we don’t think the comparison is a slam dunk, and we are optimistic that costs will fall further, at which point purchases could be competitive with marginal spending at other organizations.

Lessons

Guaranteeing incentive compatibility in dynamic two-sided markets is essentially impossible. We shouldn’t have tried, and going forward we will dismantle this guarantee as quickly as we can without undermining our existing commitments. We will continue to make our own decisions altruistically, we will encourage other funders to do the same, and we will avoid using our information to extract the lowest possible price. But we won’t try to make formal guarantees.

We were very grateful to Larks and Owen for participating in the experiment. We’re optimistic that a thicker market for certificates would be able to attract additional funding, so we see the largest bottleneck as finding additional sellers. Our guess is that the current sellers represent a small fraction of in-principle-available projects.

We felt guilty publishing sloppy evaluations of interventions that people care about. That said, we don’t think it did much damage, and we don’t feel too guilty (which was a concern in advance), so overall this was a positive update in favor of the current level of transparency. However, participants seemed at least a bit unhappy about the prospect of transparency for unfunded projects, so we will give people a clearer opportunity to submit without committing to publication if unfunded.

The majority of our submissions were for outreach activities, and we think it would have been helpful if more people had been willing to submit research. We don’t have any clear ideas for improving this, though some of the other changes may help.

At least a handful of people were deterred by the cost of applying coupled with uncertainty about our evaluations. We will probably alter the process to involve lower-effort “expressions of interest” by would-be sellers, followed by a more extensive discussion if we are interested. We may also be more proactive in asking people if they are interested in selling. We will probably also start to talk with people prospectively about our likely willingness to buy work that they are considering doing.

As expected, we didn’t have enough time to make careful evaluations and so had to make very sloppy ones. But we felt happy enough distributing money on the basis of those rough evaluations that, overall, we updated in favor of the plausibility of funding work based on crude evaluations.