Saturday, September 29, 2012

Focus and Leverage Part 143

For the past year I've been working as a consultant in a Maintenance, Repair and Overhaul (MRO) environment for a contractor serving both the Army and the Navy.  The first engagement was focused on improving the throughput of jet engines: when an engine was due for service, the contractor had to replace it with a rental engine while the original engine was being maintained, repaired or overhauled.  The contractor had to pay the rental expense whenever it exceeded the contract limit, which it did by a wide margin.  I used my customary approach: we mapped the process in an attempt to identify the system constraint.  We actually found two constraints within this process, and both of them involved lengthy approval processes with lots of wait time embedded in them.  And while the paperwork was being approved, the cost of the rental engine kept accumulating.  It wasn't the engine repair time that caused the extended cycle time problem at all; it was a classic example of policy constraints causing the excessive rental costs.  Once these constraints were identified, it was clear what had to be done.  So the question became, "Why were the approval process cycle times taking so long?"

I looked at a lot of data, including the email trails, which were the vehicle used to transmit the approval paperwork.  What I found was that because the manager responsible for the approvals had so many other functions to perform, he usually sent out his paperwork on only one day of the week.  And when he did so, he sent it out in "batches."  The effect of his "batching" the approval paperwork was exactly the same as a python trying to swallow a pig!  The python can do it, but the process is slow and the pig moves through the python's body at a snail's pace.  Yes, it eventually is digested, but it takes much longer than if the python had eaten the pig one bite at a time.  Batching encumbers a process by extending its overall cycle time, and the approval paperwork process was no exception.  So what did we do to "fix" this problem?
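The batching effect is easy to quantify with a toy model.  The sketch below is purely illustrative (the arrival pattern and four-week horizon are invented, not from the engagement): each approval request arrives on some day and then sits until the manager's next "release" day.  Releasing once a week versus forwarding every day changes nothing about the work content, only the waiting.

```python
# Hypothetical illustration of batching delay -- the numbers are invented,
# not taken from the actual engagement described in the post.

def avg_wait_days(arrival_days, release_interval):
    """Average days an item waits for the next release.

    Releases happen every `release_interval` days (day 0, then interval,
    2 * interval, ...).  Each item waits from its arrival day until the
    first release day on or after it.
    """
    waits = []
    for day in arrival_days:
        # Round the arrival day UP to the next release day.
        next_release = ((day + release_interval - 1) // release_interval) * release_interval
        waits.append(next_release - day)
    return sum(waits) / len(waits)

arrivals = list(range(1, 29))        # one approval request per day for 4 weeks
weekly = avg_wait_days(arrivals, 7)  # manager forwards a batch once a week
daily = avg_wait_days(arrivals, 1)   # paperwork forwarded the day it arrives

print(f"average added wait, weekly batch: {weekly:.1f} days")  # 3.0 days
print(f"average added wait, daily release: {daily:.1f} days")  # 0.0 days
```

Even this crude model shows the pig-in-the-python effect: weekly batching adds three days of pure queue time to the average approval, before a single minute of real processing work is counted.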

I watched the engine manager enter the data into a database, and it was clear that it was a lengthy process for him.  It was also clear that by hiring a data entry person, the engine manager could be freed up to perform other important functions.  The cost accountants told us that we were not permitted to hire anyone, but that we could run a 3-week study, which we did.  During that 3-week trial, the approval paperwork flew through the process and the rental engine time and expense decreased significantly.  Problem solved....right?  I wish it had been.

The accountants would not approve a permanent slot for data entry because it was too much of an expense!  Think about this decision for a minute.  If the cost of the rental engine was $75/hour and we were able to reduce the rental engine time from 60 hours to 20 hours.....well, you can do the math.  In the cost accounting world the focus is on cost savings, because the belief is that the key to profitability is saving money.  In the TOC world, the key to profitability is making money!  And the key to making money is increasing throughput.  And the key to increasing throughput is focusing the improvement effort on the system constraint.  But then again, not everyone sees it this way...............
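For readers who want the math done: using the figures from the post ($75/hour rental, cut from 60 hours to 20 hours), the saving per engine event works out as below.  The annual engine volume and clerk's salary are invented assumptions for illustration only.

```python
# Figures from the post: $75/hour rental rate, 60 hours reduced to 20 hours.
rental_rate = 75    # dollars per hour of rental engine time
hours_before = 60
hours_after = 20

savings_per_engine = (hours_before - hours_after) * rental_rate
print(f"rental savings per engine event: ${savings_per_engine:,}")  # $3,000

# Weigh that recurring saving against the fixed cost of a data entry clerk.
# Both numbers below are hypothetical assumptions, not from the post.
engines_per_year = 100      # assumed annual engine volume
clerk_salary = 40_000       # assumed annual salary for a data entry clerk

annual_savings = savings_per_engine * engines_per_year
print(f"annual rental savings: ${annual_savings:,}")                # $300,000
print(f"net benefit after clerk: ${annual_savings - clerk_salary:,}")
```

Under these assumptions the clerk pays for himself many times over, which is exactly why a throughput-focused view reaches the opposite conclusion from a pure cost-savings view.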

Bob Sproull

3 comments:

Jim Bowles said...

People often overlook the fact that Throughput is affected by two elements: the rate of flow through the pipeline (governed by the bottleneck) and the length of time in the pipeline, MCT (governed by delays). Not only is it important to communicate how physical processes become constraints, it is equally important to show how administrative processes (the flow of information) are part of that flow too. During my first ever assignment we showed that the lead times could be reduced by two weeks (out of a total of six) merely by scheduling a typist every day to produce the works documentation - another case of batching because it was considered easier and more convenient. (LOL) Dataflow analysis is a valuable tool for doing this, especially when combined with a checklist that examines delays in the flow of information.

Bob Sproull said...

Thanks Jim for your comments which are right one the money as usual! Happy you are following my blog.

Bob

Bob Sproull said...

Sorry, that should be "right on the money"....not "right one the money."