Have you ever had a simulation game that bombed?
Later this week, I have a podcast with Jamie Flinchbaugh, author of The Hitchhiker’s Guide to Lean, discussing simulation games: how the Lean Learning Center designs them, uses them, and what outcomes it expects from them. After reviewing the podcast, I have to agree with Jamie’s description of applying PDCA to them. He and his company practice what they preach. Below is an excerpt from the upcoming podcast.
Have you ever had a game that just bombed? In the middle of it, did you wonder what you were doing there, or why you were doing it?
That’s a good question. I did go through that experience once. Not with one that we had designed, but with one that somebody else had designed. I was sitting through it to give feedback, and there was an execution flaw based on how it was designed. Because it was executed wrong, you couldn’t get from here to there. I was a participant in it; I literally sat down, looked at the rest of my team members, and said, “I don’t know what to do next.” We had to wait for someone to come and save us, because they had done it wrong and we were stuck. We were dead in the water. We really try to make sure, through our facilitator guides and in training the folks who run it themselves, that they know all the things that could go wrong and how to avoid them.
Executing it poorly can be a big problem. In the end, if people feel cheated or wronged, that experience can carry over into the rest of their learning, not just the simulation itself.
We also had a simulation design that bombed. The simulation itself actually worked pretty well, but unfortunately it relied on a supply of material that was highly unreliable, so our ability to replicate it was very, very challenging. I think we probably put more effort into buying materials than we ever put into running the thing.
That was a design flaw, but it was still a fun experience and it met the learning objective. We still believe in PDCA in everything we do. So we design, and we set very clear objectives that we evaluate ourselves against; that is the check we do.
We do dry runs; actually, not dry runs, we do pilot runs with a safe audience where we can test things out. We then do full runs, which are still part of the PDCA cycle, and we continue from there. Even simulations that are 10 years old, we are still continuing to improve.
We really try to make sure we have tested the conditions and the process well enough before we release anything for real use.
This is part of a series of blog posts outlined in A Lean Service Design Approach to Gaming your Training.