All great companies collect and analyze user data to refine their products and stay competitive. From large-scale analytics (e.g., the big data approaches of Predictable Data, BloomReach, and Infochimps) down to lightweight test-and-iterate methods like Lean Startup, there is always room for companies to learn and improve how they use data. During 2013, 3 Day Startup grew quickly, and our somewhat ad-hoc methods for data collection became harder to scale.

3 Day Startup has focused on measuring its impact as a nonprofit from day one, and this growth warranted more scalable ways to track customer experience. In 2014, we began working with Dr. Andrew Zimbroff, an Assistant Professor at the University of Nebraska-Lincoln, to develop an easily distributable Program Assessment Tool that takes 3DS’ data collection and analysis to the next level. Dr. Zimbroff gives an overview of the tool and a summary of the results below:

This past year I worked with 3 Day Startup to launch a pilot of the Program Assessment Tool. We surveyed over 200 students from programs in the US and abroad with a short and simple two-part survey. The first part used the Net Promoter Score (a metric originally developed to quickly measure customer loyalty to a company) to gauge participant opinions about their experience. The second part asked students whether they felt 3DS programs helped advance their educational and career goals. The survey results provided many insights about the programs we administered this fall. In addition to written feedback, we collected a sizable demographic and program-response data set, which is furthering our knowledge about entrepreneurship education, delivery, and impact.
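For readers less familiar with the metric: respondents answer a single 0-10 "how likely are you to recommend" question, 9s and 10s count as promoters, 0-6 as detractors, and the score is the percentage of promoters minus the percentage of detractors. Here is a minimal sketch of that calculation; the sample scores are made up for illustration and are not actual 3DS responses:

```python
def net_promoter_score(scores):
    """Return NPS (-100 to 100) from a list of 0-10 'likely to recommend' scores.

    Standard NPS buckets: 9-10 = promoter, 7-8 = passive, 0-6 = detractor.
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

# Made-up sample responses, purely for illustration.
sample = [10, 9, 9, 8, 7, 10, 6, 9, 10, 5]
print(f"NPS: {net_promoter_score(sample):.0f}")  # -> NPS: 40
```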

One of the most useful characteristics of this data set is that it can be easily sorted into different segments. Using pivot tables (yes, like startups, data tables can also pivot) for every question asked, we sorted the data by program, demographic, and more. For example, the first table below shows the Net Promoter Score broken down by international and domestic programs. We can also look at the Net Promoter Score by participant degree status (second table below). While we had great results across all demographics (in fact, according to Satmetrix benchmarks, 3DS’ NPS ranks alongside customer-loved companies like Southwest, as shown in the graphs below), we also found out which specific participant groups were getting the most out of our programs.
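To make the segmentation concrete, here is a rough sketch of how a segmented NPS breakdown could be computed with pandas; the column names and rows are hypothetical stand-ins, not the actual 3DS data set:

```python
import pandas as pd

# Hypothetical survey rows; the real 3DS columns and values may differ.
df = pd.DataFrame({
    "program_type":  ["domestic", "domestic", "international", "international", "domestic"],
    "degree_status": ["undergraduate", "graduate", "undergraduate", "graduate", "graduate"],
    "recommend":     [10, 9, 8, 10, 6],
})

def nps(scores):
    """Percent promoters (9-10) minus percent detractors (0-6)."""
    return 100 * ((scores >= 9).mean() - (scores <= 6).mean())

# Pivot the responses so each segment gets its own Net Promoter Score.
print(pd.pivot_table(df, values="recommend", index="program_type", aggfunc=nps))
print(pd.pivot_table(df, values="recommend", index="degree_status", aggfunc=nps))
```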


Experiments like this allow 3 Day Startup to look at each program individually and provide tailored feedback for improving repeat programs. For example, we asked all students whether the 3 Day Startup program they attended would help them in a future job or internship. Results from this question were broken down by program and are shown in the graph below (not all programs are included).
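Continuing the same hypothetical example, a per-program breakdown of a yes/no question like this is a simple group-by; again, the program names and answers below are invented for illustration:

```python
import pandas as pd

# Invented responses to "Would this 3DS program help you in a future job or internship?"
df = pd.DataFrame({
    "program":      ["Program A", "Program A", "Program B", "Program B", "Program C"],
    "helps_career": [True, True, True, False, True],
})

# Share of participants answering "yes", by program.
pct_yes = df.groupby("program")["helps_career"].mean().mul(100).round(1)
print(pct_yes)
```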


In the spirit of practicing what we preach regarding feedback and input from our users, we are opening the survey data up for comments and suggestions. What other factors should 3DS look into to gain insights about our programs? Are there any specific data sets you would like to see? Dr. Zimbroff will be checking comments frequently to find new experiments to run.