cover image for The Lean Startup

The Lean Startup

Eric Ries

6/10
Fun and solid ideas, with lots of cool terminology and context, as well as agile-like ideas to bolster the attitude
  • *** Build product, measure customer response, learn what and why. Repeat. This is the loop
  • How to measure progress, how to setup milestones, how to prioritize work
  • Idea to measure progress as learning rather than specific to field/job output
  • Startups should use a different management model due to their extreme uncertainty
  • *** Rapid testing: lots of ideas and lots of tests, 500 instead of 1. One idea creates politicians fighting for their ideas; many ideas create entrepreneurs. Managers should create these systems
  • *** Site A/B testing: half of users are redirected to a different webpage. (Many engineering changes have no impact on customer behavior.) In a split-test experiment, a feature is sent to only half of users, to have a measure / baseline.
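The split-test idea above can be sketched as a tiny bucketing function (a hypothetical example, not from the book; the experiment name and user IDs are made up). Hashing the user ID keeps each user's assignment stable across visits, which is what makes the before/after comparison meaningful:

```python
import hashlib

def split_test_bucket(user_id: str, experiment: str = "new-landing-page") -> str:
    """Deterministically assign a user to the control or variant group.

    Hashing user_id + experiment keeps each user's bucket stable across
    visits, so the two halves can be compared against a clean baseline.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "variant" if int(digest, 16) % 2 == 0 else "control"

# Assignment is stable per user, and roughly half land in each bucket:
assert split_test_bucket("user-42") == split_test_bucket("user-42")
```

Deterministic hashing (rather than random assignment stored per session) means no extra state is needed to keep a returning user in the same bucket.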
  • *** Talk to real users often and early, Learn from real people.
  • Your mental model of users can and will be wrong (assumptions will likely be wrong; what you assume will be barriers differs from what actually ends up being barriers to use)
  • Value vs waste. Only do value based things where value is providing benefit to the customer.
  • But what the customer finds valuable is initially unknown, so you must find this.
  • Don't go for vanity metrics and short term advertising stuff.
  • The question should not be "can this product be built"; nowadays it typically can. But should this product be built, and can you build a sustainable business around its products / services
  • Experiments, scientific method: hypothesis, then results relative to it. The goal is to build a sustainable business.
  • Zappos, online shoes: took pictures of shoes in local stores to post online, then went back IRL to buy the shoes whenever someone bought online.
  • Question: is there already sufficient demand to make a business for online shoes
  • Better customer interactions, better data; surprised when customers act in unexpected ways (returns?)
  • *** Value hypothesis: does it really provide value? Proxy for the value of the thing: do they come back for seconds?
  • *** Growth hypothesis: how new customers discover a product. Behavior: do people actually spread the thing? The point is to find early adopters (better for feedback)
  • Measure what they actually did, not what they think.
  • If 10/10 people aren't repeat customers, that's bad. Then we ask the next group again.
  • *** Questions: do consumers recognize that they have a problem? If there were a solution, would they be willing to buy it? Would they buy it from us? Can we build it
  • *** MVP: even if hard to use, it can prove the value given and the need, if people want to use it at all.
  • Also, complaints about the lack of a certain feature validate the need for that feature (and prioritize it over all other features)
  • For a service-based business, can experiment with side services: ironing, or extra pay for speed (ex: laundry)
  • When experimenting, target a very small geographic area, with localized ads or marketing
  • Planning only works with a long and stable operating history. Else agile and experiments are better given uncertainty
  • Hypothesis, design w/ feedback metrics, Build, measure, learn, pivot?, Repeat
  • *** Big assumptions, like "we assume supermarkets would be willing to carry our product" or "we assume customers will find our thing valuable," etc. Break down the assumptions that are required for your product to succeed and test them early
  • There are value-destroying kinds of profit and growth, like Ponzi schemes. This is bad
  • Customer archetype - their personality and problems
  • Early adopters are willing to use an 80% solution, and you should target them
  • How many features do we really need to appeal to early adopters. Anything extra is waste
  • Examples -
  • Dropbox: seamless synchronization and great UI. Others could not imagine how to use it, because it was too out there.
  • Could not build an MVP because it was too technically difficult to do without knowing there was defined interest.
  • *** Made a video to show how it would work, as the MVP. Targeted at early adopters. The video is the MVP, and the validation is beta-list signups
  • *** Concierge MVP: tried to make a sale. The customer got a personal version of the service without the tech, done IRL. The idea was a personalized meal plan based on your preferences and what's on sale at your grocery store.
  • Learning more and more about the system and customers.
  • Only when they had too many people to serve did they start automating (send lists via email, take payment via credit card online)
  • Virtual personal assistant for subjective questions. MVP: 4-week projects to test with beta users; the question was what it would take for users to engage and tell their friends. Hired 8 people to be the backend.
  • Worked via instant messaging, by sending questions to friends in a personal network.
  • *** Wizard of Oz testing: customers believe they're interacting with a product, but really people are making it work behind the scenes.
  • Customers don't care how much time something took the builders, only that it meets their needs. Sims-style walking and obstacle avoidance vs cheap and easy tap-to-teleport: same outcome, but one is far, far easier to build, and both serve the need
  • MVP speed bumps
  • Legal issues, fear over competitors, branding risks, morale
  • Patent risks: in some places the window begins when released to the public. Seek legal counsel
  • Competitors: it's hard to get competitors, or anyone, to notice your idea. The issue with most established companies is that they have too many good ideas already.
  • Only way to win is to learn faster.
  • Branding is hard, so use a different brand name than your parent.
  • Cannot give up hope on the MVP; commit to iterations. A crisis of faith is bad for traditional management, but okay for us, because expectations are tempered and the focus is different. Must measure learning.
  • Rigorously measure current state and devise experiments to figure out how to get where we want.
  • Persevering too much is baaaad.
  • Innovation accounting: get a current baseline with the MVP. Tune the engine from the baseline towards the ideal. Pivot or persevere (is the strategy working, or fundamentally flawed? A pivot worked if further experiments are more successful at getting results)
  • One main MVP, or individual MVPs per assumption. Smoke test: will people buy, even if you don't have it yet
  • *** Test the riskiest assumptions first, as things hinge on them the most.
  • *** Must know what a design or engineering change is trying to accomplish (increase conversions, signups, retention, etc.) and measure before and after to judge success
  • Cohort analysis: each day's customers are independent of the days before and are categorized as such. Don't look at cumulative numbers but at numbers for small chunks of users grouped by time.
  • A company can be executing a plan well; it's just the wrong plan. Do current actions actually have the desired impacts
  • Vanity metrics (top-line overall numbers). These aren't true cohort-based metrics; on a cohort-by-cohort basis, your % conversion or whatever isn't getting better
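A minimal sketch of the cohort view vs the vanity view, with made-up numbers: the cumulative total grows every week, but per-cohort conversion is flat, so the engine isn't actually improving.

```python
# Hypothetical weekly cohorts (numbers are illustrative, not from the book).
cohorts = {
    "week-1": {"signups": 200, "purchases": 10},
    "week-2": {"signups": 400, "purchases": 20},
    "week-3": {"signups": 800, "purchases": 40},
}

# Vanity view: the cumulative total only ever goes up.
total_purchases = sum(c["purchases"] for c in cohorts.values())  # 70

# Cohort view: conversion is flat at 5% week after week,
# i.e. growth comes from more signups, not a better product.
conversion = {name: c["purchases"] / c["signups"] for name, c in cohorts.items()}
```

The top-line number climbing while every cohort converts identically is exactly the trap the vanity-metrics bullet describes.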
  • Kanban: only so many stories can be in any state at a time (backlog, in development, done, seeking split-test validation); buckets fill up and must finish.
  • Needs hypothesis.
  • Metric values, the three A's:
  • Actionable - clear cause and effect for metric change, replicable.
  • Accessible - needs to be understandable metrics, simple and concrete units. People and their actions. Wide spread access to reports for all employees.
  • Auditable - tempted to blame others, when something goes bad. Must be able to spot check with real customers.
  • Runway: the number of pivots you can still make. Usually just thought of as the amount of money and the burn rate instead. Potentially you can get to each learning, and pivot, faster
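That reframing of runway fits a back-of-the-envelope formula (with hypothetical numbers): runway in pivots = (cash / burn) / (months per pivot), so learning faster buys more attempts from the same bank balance.

```python
def runway_in_pivots(cash: float, monthly_burn: float, months_per_pivot: float) -> float:
    """Runway measured in pivots remaining, not just months of cash."""
    months_of_cash = cash / monthly_burn
    return months_of_cash / months_per_pivot

# Same $600k and $50k/month burn (12 months of cash), but halving the
# time per learning cycle doubles the number of pivots you can afford:
assert runway_in_pivots(600_000, 50_000, 4) == 3.0
assert runway_in_pivots(600_000, 50_000, 2) == 6.0
```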
  • Pivoting is scary because
  • Acknowledging failure is scary for morale
  • Measuring is scary and people are afraid it isn't representative of their full idea in final form
  • Schedule a regular pivot-or-persevere meeting in advance
  • Should look into other concepts, validating those concepts and preconceptions with various customers.
  • *** Pivot, given data do you need to change fundamentally what you're doing.
  • Zoom-in pivot: refocus the product identity around one of its core features, to nullify a negative and enhance positives.
  • Customer segment pivot: solving a real problem, but not for the initially intended audience. Hard, because early adopters and the mainstream have very different wants and needs.
  • Zoom-out pivot: sometimes a single feature is insufficient and needs a suite of support.
  • Customer needs pivot, our problem isn't important but new problem is worth solving
  • Platform pivot, application to platform or vice versa
  • Business architecture, low margin high volume or high margin low volume
  • Value capture pivot.
  • Engine of growth pivot. Viral, sticky, paid
  • Channel pivot. Distribution channel.
  • Technology pivot, use different tech to do same thing.
  • What creates value vs waste, where value is validated learning.
  • Batch size is how many of a thing move from each stage to another in the process. A batch size of 1 is proven better, unintuitively. The reason is that our intuition doesn't account for the time it takes to sort and move giant stacks. We also assume we'll get better at the minor repetitive task the more we do it, which may be true but isn't as important as the other thing
  • Also, any errors in the final steps, or any step, cause all the other steps to be invalidated and potentially their work undone (unpacking boxes if you find out your stuff won't fit or is slightly flawed)
  • Smaller batch size allows for more customizability; the potential is higher.
  • Also smaller batch size means QA issues can be identified much sooner
  • If they don't want what we're selling, finding out sooner is better than later. Small batches help
  • Design, develop and ship one feature at a time. Work together on 1 feature that goes out only to a smaller number of users. Test suite that makes sure main functionality is still working.
  • Value of most hardware is determined by software and software can be changed quickly, for faster feedback. 3D printing and fast prototyping.
  • A designer will be interrupted with engineering questions while trying to design the next batch
  • Large inventory is expensive because it has to be stored, bought, and tracked, especially since you might not need all that inventory.
  • Better to keep one spare on hand, and then order when used.
  • Try to use common materials and infrastructure, to make small batches more feasible
  • Sustainable growth: old customers help attain new customers. Ways:
  • Word of mouth, as a side effect of common use; through funded advertising (must be revenue-based; the cost of acquiring is less than the customer purchase); repeat purchase or reuse
  • *** Sticky engine of growth, attract and retain customers.
  • Track the attrition rate, or churn rate: the fraction who fail to remain engaged with the product. Net growth = growth rate - churn rate. If churn is high, you need to focus on making customers happier.
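A quick sketch of the sticky engine's arithmetic (illustrative rates, not from the book): the customer base compounds at the acquisition rate minus the churn rate.

```python
def project_customers(customers: float, acquisition_rate: float,
                      churn_rate: float, months: int) -> float:
    """Compound the sticky engine: each month the base grows by the
    acquisition rate and shrinks by the churn rate."""
    for _ in range(months):
        customers *= 1 + acquisition_rate - churn_rate
    return customers

# 5% monthly acquisition against 3% churn is 2% net compounding growth;
# if churn rose to match acquisition, growth would stall completely.
growing = project_customers(1000, 0.05, 0.03, 12)   # ~1268 customers
stalled = project_customers(1000, 0.05, 0.05, 12)   # 1000 customers
```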
  • *** Viral engine of growth. Depend on person to person transmission as a side effect of use.
  • Viral loop, viral coefficient: how many new customers will a single customer bring with them by using the product. The coefficient needs to be above 1, meaning each new person brings in, on average, more than 1 other person with them. Viral products tend to be free and have indirect revenue (like ads), to help their engine.
  • Facebook
  • The value to advertisers is eyeballs; the customer is getting a network site
  • *** Paid engine of growth: you pay a certain amount to acquire a new customer, and make a margin on the acquisition. 2 ways to speed up: decrease the cost of acquisition or decrease the cost of servicing
  • LTV, lifetime value
  • CPA, cost per acquisition
  • Marginal profit, the difference between them
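A sketch of the paid engine's arithmetic with made-up numbers: the margin (LTV - CPA) earned on existing customers is what funds acquiring the next batch.

```python
def paid_engine_step(customers: int, ltv: float, cpa: float) -> int:
    """One turn of the paid engine: reinvest the margin earned on the
    current customers into acquiring new ones at cost-per-acquisition."""
    margin = customers * (ltv - cpa)
    return int(margin // cpa)

# 100 customers at a $40 lifetime value and $15 cost per acquisition
# leave a $2,500 margin, which buys 166 more customers.
assert paid_engine_step(100, 40.0, 15.0) == 166
```

When LTV only just covers CPA, the margin is zero and the engine cannot turn at all.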
  • Product/market fit: the product finds widespread customers and serves them well.
  • Each engine has a metric for product/market fit. 0.9 for viral
  • Paid: marketing and sales
  • Every engine runs out eventually; thus you must make new products and growth ideas
  • *** 5 whys: help uncover the main true cause. Usually moves towards a human issue: playbook or training issues.
  • Proportional investment in training to pain, minor glitch - minor incremental training.
  • 5 blames. Not this
  • Bad process, not bad people. Everyone who interacted should be in the room: customer service, the people who tried to fix it, the people who made it, the people who made the process. Whoever is left out tends to be blamed
  • Mutual trust needed
  • Tolerant of all mistakes the first time, never allow mistakes twice.
  • Executive leadership needs to support and get behind this.
  • *** Innovation requires 3 things: scarce but secure resources, independent authority to develop their product, and a personal stake in the outcome.