Some things need to go wrong first before there can be improvement
I'm currently reading the book "Black Box Thinking: The Surprising Truth About Success."
The author looks at different professional fields (healthcare, aviation, the justice system) and examines how their cultures deal with mistakes and failure. The key (and maybe slightly trite) takeaway so far: the better a system is at dealing with failure, i.e. learning from its mistakes, the faster it can improve.
Sounds obvious, right? That's what Lean Startup is all about: trying to "fail fast" so that you can learn and build your product toward success. Nothing very new.
However, something struck me while reading the book. The problem with the systems that didn't work as well as others wasn't that people thought this process was bullshit; rather, people just didn't believe there was any failure at all. For example, there was a cult that simply altered its beliefs after the world didn't end as its leader had prophesied: afterwards, they just went with "we saved the world."
It's not that we in the startup world don't believe in ideas like Lean Startup. It's just very hard to admit that something didn't work as expected. The "always be pitching" mentality makes us stay positive and find excuses and upsides for anything. But doesn't that get in the way of learning?
We've done this too much at HashtagNow. There was too much "ok, but if we build this, it'll get better" without clear hypotheses to test. This is why I want to focus on testable experiments rather than tasks in the coming months. Let's see how this works.