Delivery Rates On Kickstarter
University of Pennsylvania – Wharton School
December 4, 2015
Using a large survey with 47,188 backers of Kickstarter projects, I examined the factors that led to projects failing to deliver their promised rewards. Among funded projects, a failure to deliver seems relatively rare, accounting for around 9% of all projects, with a possible range of 5% to 14%. There are few indicators at the time of project funding as to which projects might ultimately fail to deliver rewards, though small projects (and to a lesser extent very large projects) are more likely to fail to deliver rewards, as are some project categories. The demographics of project creators (including gender, education level, and family status) did not significantly affect the chance of a project succeeding.
Introduction
Kickstarter, the largest reward-based crowdfunding site, has facilitated the raising of over $2 billion from 9.5 million people, funding over 93,000 projects. Though many projects on Kickstarter have gone on to be artistic or financial successes for project creators, to date there has been no clear evidence about how often projects actually deliver on their promises to backers. What evidence we have suggests that creators are generally honest, if overconfident – many projects take longer to deliver than creators estimate [1] and overall fraud rates are low [2]. However, while Kickstarter warns potential backers about the risk of non-delivery in supporting projects, the actual share of projects that fail – that is, either do not deliver a promised reward, or deliver a reward that is very far from expectations – has been unknown, and a subject of considerable speculation. This paper provides a first attempt to systematically understand delivery rates on Kickstarter.
In order to discover delivery rates, I conducted a survey with the help of Kickstarter. In total, 456,751 backers were surveyed, representing 65,326 projects. All projects from 2009 through May 2015 that raised over $1,000 were included in the sample, as well as half the projects that raised less than $1,000 but more than $250, and a quarter of projects raising less than $250. Backers were selected randomly, without replacement, to maximize the number of backers per project. A mean of 7.2 backers were surveyed per project, with 7 backers surveyed in 89% of projects and 10 backers surveyed in 7.8% of projects.
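The stratified sampling design described above can be sketched as follows. This is an illustrative reconstruction, not the study's actual code; the function name, data layout, and per-project cap of 7 backers (the most common cap reported in the text) are assumptions.

```python
import random

def sample_backers(projects, seed=42):
    """Sketch of the survey's stratified sampling design (illustrative).

    `projects` maps a project id to a tuple (amount_raised, backer_ids).
    Returns a dict mapping sampled project ids to surveyed backer ids.
    """
    rng = random.Random(seed)
    surveyed = {}
    for pid, (raised, backers) in projects.items():
        # Stratified inclusion by amount raised: all projects over $1,000,
        # half of those between $250 and $1,000, a quarter of those under $250.
        if raised > 1000:
            p_include = 1.0
        elif raised > 250:
            p_include = 0.5
        else:
            p_include = 0.25
        if rng.random() > p_include:
            continue
        # Backers drawn randomly without replacement, up to a small cap
        # per project (7 backers in most projects in the study).
        k = min(7, len(backers))
        surveyed[pid] = rng.sample(backers, k)
    return surveyed
```

Drawing without replacement up to a fixed cap spreads responses across as many projects as possible rather than concentrating them in large projects.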
A total of 47,188 backers (10.3%) responded. In total, there is at least one response for 30,323 projects (46.4% of all projects), with 1.56 backer responses per project on average. The mean backer in the sample contributed $76.43 to the project they backed.
Response rates were higher for projects in categories that traditionally produce consumer products, such as games (at least one response for 83% of projects), technology (72% of projects), design (70% of projects) and comics (72% of projects). They were lower for categories focused more on traditionally artistic pursuits, such as theater (35% of projects), dance (31%), music (36%) and film (37%). Larger projects and more recent projects also had higher response rates. Across all categories, however, response rates were acceptable, and are unlikely to bias the findings.
Measuring Failure Rates
One challenge in analyzing the results was to determine what a “failed” project might be. Backers might consider a project failed if it did not deliver on its promises, if it delivered something different than expected, or for any one of a number of reasons. For the purpose of this study, I focus specifically on the rewards promised to project backers in return for backing projects (rewards on Kickstarter include a mix of physical, digital, and intangible rewards). The delivery of rewards seems to be the major way in which project backers evaluate the success of a project. At the same time, it is important to note that rewards are but one potential outcome of a project as there are many ways by which a project could “succeed” but still fail to deliver rewards – for example, an art exhibit may have been successfully staged, but not deliver a promised t-shirt or sticker to backers. Given this caveat, the degree to which backers believe they receive the expected outcome is a reasonable measure of one kind of success or failure.
There are many potential ways to classify projects as failed, based on our data. Respondents were asked to select one of five reward status options (see Table 1 below).
For this paper, I consider failures to be those projects where backers answer that they “never expect to get the promised reward” (5.2% of all responses) or that they “received the reward but it was not what was promised” (2% of all responses).
This issue becomes more complicated at the level of projects rather than individual backer opinions, because multiple backers may answer the survey about a single project, and they may disagree about whether promised rewards were delivered. There is therefore a need to decide how to classify a project as a failure. The broadest definition counts a project as failed if any respondent reported it as a failure; this classifies 9.95% of all projects as failures. However, given that individual complaints are not uncommon, this is likely too harsh a definition. If we instead classify as failures those projects where at least half of the responding backers considered the project a failure (which I will refer to as the "middle definition" of failure), the rate drops to 8.6%; under the strict definition that all responding backers must consider the project a failure, the failure rate is 5.6%. Figure 1 shows the failure rates by category under all three definitions.
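The three project-level definitions above amount to a simple threshold on the share of responding backers who report a failure. The sketch below makes that explicit; the function name, signature, and the non-failure answer strings used in the usage example are hypothetical, while the two failure answers are quoted from the survey options described in the text.

```python
# Survey answers treated as failure reports (quoted from the text above).
FAILURE_ANSWERS = {
    "never expect to get the promised reward",
    "received the reward but it was not what was promised",
}

def project_failed(responses, definition="middle"):
    """Classify a project as failed from its backers' survey responses.

    `responses` is the list of reward-status answers for one project.
    The three definitions follow the text: "broad" (any backer reports
    failure), "middle" (at least half do), "strict" (all do).
    """
    n_fail = sum(1 for r in responses if r in FAILURE_ANSWERS)
    if definition == "broad":
        return n_fail >= 1
    if definition == "middle":
        return n_fail >= len(responses) / 2
    if definition == "strict":
        return n_fail == len(responses)
    raise ValueError(f"unknown definition: {definition}")
```

For example, a project with one failure report out of three responses counts as failed under the broad definition but not under the middle or strict definitions.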