How to achieve the financial performance of a successful technology company: Think like 3M

As I mentioned in my previous blog entry, I’ve identified three themes companies employ to compete successfully in a digital economy. This is the first: relentlessly innovating and executing on those ideas.

Everyone loves a founder’s story. It often begins with an everyday problem and, through some serendipitous event, an imagined solution. That solution, along with an unbelievable amount of hard work, forms the genesis of a now highly successful company.

Sara Blakely, founder of Spanx, tells her story as a one-woman innovation wonder, as do the founders of Uber, Airbnb and countless other start-ups. These journeys are driven by the sheer will of individuals with the passion to see their ideas become successful.

Digital giants continue this tradition even as they have grown. Companies such as Amazon, Google and Tesla generate and experiment with new ideas that rethink their customers’ experience by using the advantages of technology to deliver scale, scope and speed.

For entrenched incumbents who have invested their treasure in optimizing processes and raising barriers according to the rules of an outdated supply chain playbook, the future can be worrisome. They are now held captive by the fort that previously protected them.

Fortunately for incumbents, this scenario is not permanent. Just as Amazon and Google have remained innovative and adaptive as they grew, a few companies such as 3M have maintained their spirit of innovation (and outsized financial returns) for decades. By studying 3M’s method of innovation, it becomes clear that it has figured out how to create its own serendipity to generate new ideas as well as emulate at scale the same passion to execute as found in a start-up.

3M’s ability to innovate and execute is well documented. It often ranks highly as an innovative company. I won’t duplicate others’ good work, so you can read what I found to be some of the more interesting articles for yourself. There were two key findings for me. The first is that 3M is organized to create and then execute on new products, beginning with customer interactions all the way through to revenue conversion. It’s not a single department’s responsibility to innovate; it’s the way the entire company operates. The second is that 3M has also put mechanisms in place to stop funding projects and put people back to work on productive projects just as effectively as it begins new projects. Clogging the system with poorly performing ideas hurts performance as much as never funding good ideas.

With the barriers to all types of technology continually falling, a company’s biggest differentiator becomes orchestrating these technologies through its ecosystem to redefine what it delivers to its ultimate consumer. Doing this successfully requires the ability to forget the current supply chain rules, a deep understanding of the processes available to deliver products and services, and an intimate understanding of the needs of its customers. That knowledge lies somewhere within your workforce. A leader’s job is to unlock that knowledge and act on it. How close is your company to executing innovation at the level of 3M or Google? How exposed is your company to a new competitor figuring it out first?


Some reading on 3M innovation

What is the Starbucks experience worth?

I unwittingly walked into the middle of a spontaneous retail experiment yesterday. As I entered the New Orleans convention center on my way to the SHRM conference I saw a long line. At first I figured it was for registration. As I followed the line with my eyes to the beginning I realized it was a snaking line of about 60 people waiting for a Starbucks coffee.

While that is noticeably longer than most lines I see waiting for food, it wasn’t shocking. This is a large conference with about 15,000 attendees, so you can expect peak-time waits. What caught my attention, however, was an unbranded coffee and refreshment concession right next to the end of the Starbucks line. The contrast was stark. In a time when everyone talks about how busy they are, people were willing to wait 20-30 minutes for a Starbucks coffee when a presumably reasonable and less expensive cup of coffee was available in less than five minutes. It was worth a picture, so I took this panoramic image. On the right is the Starbucks store, in the middle (back) is the line of people waiting for Starbucks. On the left near the back of the line is the unbranded concession with a small line.


When the topic of valuing the brand and a great customer experience comes up, this little experiment shows that, at least in the retail coffee market, the value is worth about 30 minutes of a customer’s time and probably a 20% price premium, even when the switching cost is zero. (Yes, I know, cut me a little slack on sample size.)

The lesson can be translated to other markets: bare-bones service and low-cost offerings have their place in the market, but a larger percentage of consumers place a premium on the overall experience that includes both service and product. Is your company doing what it can to deliver this kind of experience to your customers?


Do you track regular time, absences and overtime? Are you getting this insight from your data?

Let’s start with some background. We were working with a client to help them understand how effectively they were using labor hours in their Distribution Center. This organization’s employees are paid hourly and considered full time.

Like most of our clients, they were tracking Regular, OT and Absence hours by employee. Looking at the data summarized by department across time in the chart below showed nothing unusual. In fact, it looks like they have increasing amounts of overtime when they use more overall hours. This is a good method for flexing labor to meet demand. (For a great article on the effective use of OT, check out The Overtime Lie by John Frehse at Core Practice Partners.)

hours by week of year

It appears that they are scheduling effectively too. The chart below shows that in general they are working the hours that are scheduled by department.

scheduled and worked hours

At the summary level many of the details are hidden. It’s only when we get down to the individual employee level that it becomes apparent that there is room for improvement. This is the hardest part of deriving insights from data: organizing the data to start at a summary level and then progressively drilling down to the right amount of detail to discover something of value.

employee hours

Let me explain the chart above. The x axis shows the number of hours worked in one week. The y axis shows how many of those hours are overtime hours. Each dot represents the hours worked by an individual employee. The color of the dot represents the range of hours that employee was absent during the week. For example, a green dot means all the hours were worked and there was no absence. A yellow dot means that a person scheduled for, say, 30 hours was absent between five and ten hours and worked the rest. The legend on the right of the chart shows the complete gradient.
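For readers who want to reproduce this kind of view, here is a minimal sketch of the data preparation behind such a chart. The record layout, absence bands and colors are my own assumptions for illustration, not the client's actual schema; with matplotlib, the resulting tuples would feed directly into a scatter plot.

```python
def absence_band(absence_hours):
    """Map weekly absence hours to the color bands shown in the legend."""
    if absence_hours == 0:
        return "green"          # all hours worked, no absence
    if absence_hours < 5:
        return "light green"
    if absence_hours <= 10:
        return "yellow"         # e.g. scheduled 30, absent 5-10, worked the rest
    return "red"

# Illustrative weekly records: hours worked, of which overtime, and absence.
week = [
    {"employee": "E01", "worked": 46, "overtime": 6,  "absence": 0},
    {"employee": "E02", "worked": 31, "overtime": 0,  "absence": 9},
    {"employee": "E03", "worked": 52, "overtime": 12, "absence": 0},
]

# One tuple per dot: x = hours worked, y = overtime hours, color = absence band.
# With matplotlib this becomes plt.scatter(xs, ys, c=colors).
points = [(r["worked"], r["overtime"], absence_band(r["absence"])) for r in week]
print(points)
```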

We saw two distinct situations in this data. The first is that the particular week charted has a higher rate of absenteeism than other weeks. It wasn’t a holiday week or anything else that would drive absenteeism. By looking at this chart week by week, it became clear that busy weeks had higher absenteeism than slower weeks.

Secondly, we immediately noticed that while everyone is considered full time, many people were not working 40 hours, and yet many others were working significant overtime. So while the OT percentage looked reasonable at a summary level, the reality was that there was plenty of regular-hours capacity to satisfy the demand without the use of overtime. This of course drives costs up; additionally, as we hear in the news frequently, many people are not getting as many hours as they would like, and to them this situation is likely frustrating.

Creative visualization can turn data that companies must collect into valuable insights for improved productivity and increased employee satisfaction. In this case eliminating half the overtime and replacing it with regular hours would have resulted in a ~$35k a week savings. Don’t let your data rest!
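To show how a figure of that magnitude might be estimated, here is a back-of-the-envelope sketch. The wage rate and overtime volume below are invented for illustration, not the client's actual numbers; they are simply one combination that produces savings on that scale.

```python
# All numbers are assumptions for illustration, not the client's actuals.
base_rate = 20.00            # assumed average hourly wage, $
ot_premium = 0.5             # time-and-a-half: OT costs an extra 0.5x base
weekly_ot_hours = 7000       # assumed total overtime hours per week

# Shifting half the OT onto under-utilized full-timers' regular hours
# avoids the 0.5x premium on those hours.
shifted_hours = weekly_ot_hours / 2
weekly_savings = shifted_hours * base_rate * ot_premium
print(f"~${weekly_savings:,.0f} per week")
```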

Visualizing Schedule “Tradesies”

We recently met with an operations executive who was confronting one of the more vexing issues a manager faces. An employee told her there was a rumor going around that employees were swapping shifts with each other to increase their pay. In this organization, anyone who is not scheduled to work but then works is entitled to premium pay.

One of the well known ways to exploit this type of policy (I explained this as “tradesies” in my book Lean Labor) is to find a couple of buddies and regularly swap shifts with each other. It may not be every shift, but you have to trust your partner enough to know that you can be up or down a shift and when the time comes, they will agree to swap out a shift with you.

This of course drives up costs without increasing output. Not a good outcome for any organization. The executive was pondering what to do. The challenge is that there are many legitimate reasons to swap shifts and the policy is intended to provide flexibility for the workforce but ensure coverage for the workload. A premium may be paid to encourage employees to work hours that they might not otherwise want, thereby providing liquidity to the system.

Addressing this would be tough because it’s difficult to discriminate between legitimate shift swaps and ones that were done purely to increase pay. But the rumor was expanding and if this practice spread it could ultimately lead to lower profits, poor morale and even layoffs of uninvolved people.

Before she acted, she asked my team to take a look at her organization’s scheduling data and see what we could find. This is a fairly challenging exercise because first you have to figure out who actually swapped shifts with whom. There is no “marker” other than a premium paycode for one person. After that was resolved, we had a long list of shift swaps. Next we had to figure out a way to visualize that list to help interpret the data.
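To make that first step concrete, here is one way the pairing problem could be sketched. The schema and matching rule (same date and shift) are assumptions for illustration; real data would also need matching on department, skills and so on.

```python
from collections import namedtuple

# Assumed schema: the premium paycode only marks who worked unscheduled,
# so the swap partner must be inferred from absences on the same shift.
Shift = namedtuple("Shift", "date shift employee")

premium_worked = [Shift("2015-03-02", "day", "Ann"),
                  Shift("2015-03-09", "day", "Bob")]
absences       = [Shift("2015-03-02", "day", "Bob"),
                  Shift("2015-03-09", "day", "Ann"),
                  Shift("2015-03-09", "night", "Cat")]

# Pair each premium shift with an absence on the same date and shift.
swaps = [(p.employee, a.employee)
         for p in premium_worked
         for a in absences
         if (p.date, p.shift) == (a.date, a.shift) and p.employee != a.employee]

print(swaps)   # inferred (worked-for, was-absent) pairs
```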

After a couple of different approaches, the team was excited as they realized this would call for a different type of visual approach. The reason they were excited is that the vast majority of visualizations required are bar, line and scatter charts. These charts do a great job, but we all like some variety!

In this case the team realized they were looking at a network of relationships between the people swapping shifts.

Using a network diagram, they plotted the employees and whom they swapped shifts with. What we wanted to know, though, was not only who, but how many times shifts were swapped, since gaming typically occurs within a small group of people. For that we colored each arrow differently based on the number of swaps made over the time period analyzed.
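The counting step can be sketched in a few lines. The swap list below is invented; in practice the pairs would come from the inference step described earlier, and a library such as networkx could draw the diagram with edge colors driven by these weights.

```python
from collections import Counter

# Invented swap list; each tuple is one swap between two employees.
swaps = [("Ann", "Bob"), ("Bob", "Ann"), ("Cat", "Dee"),
         ("Ann", "Bob"), ("Eve", "Fay"), ("Bob", "Ann")]

# Treat each swap as an undirected edge; frozenset ignores who initiated.
edge_weight = Counter(frozenset(pair) for pair in swaps)

# The weight drives the arrow color in the diagram; filtering keeps only
# pairs that swap often enough to be worth a closer look.
frequent = {tuple(sorted(edge)): n for edge, n in edge_weight.items() if n >= 3}
print(frequent)
```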

schedule swap diagram

Above is the result of the effort. As you can see, the majority of the swaps are occasional and with a variety of people. Good news! Most people are swapping shifts as the organization intended. But after applying a filter to remove the occasional swaps, there are two clusters of three people who are swapping significantly more often and with the same people. This doesn’t necessarily mean they are gaming the system. It’s possible that they have very specific skills and there is a limited pool of people they can swap with.

This information was illuminating for the executive. Out of thousands of people, she could now focus on six and get to the bottom of it quickly. She could also respond to the rumor with hard facts. Finally, it was peace of mind for her to know that the vast majority of her employees were using the policy as it was intended.

The data scientists behind the scenes and how they put a spotlight on dark data

Over the past 18 months I have learned a lot about analytics and big data, especially as applied to the workforce. I spend a fair amount of my time speaking to customers, analysts and the media, and besides the most common question of “do you have any examples you can share,” I get asked about the people involved in and the process of developing these applications. In the spirit of “people are the most important resource a company has,” I want to showcase that side of big data.

For timekeeping and scheduling, dark data (to save you a quick trip to your favorite search engine, I mean data that is collected but not typically used, yet is still required) takes the form of the audit trails of any change made to a timecard or schedule.

In Kronos there are sixteen different types of edits you can make to a timecard or schedule. Each of these edits represents a tiny trade of time and money between an employee and the company. Individually most are inconsequential, but in aggregate they represent tens or hundreds of millions of dollars. Most of these changes are transactions that everyone agrees to and are necessary: the employee forgot to clock in, so the supervisor adds an “in punch.” Or an employee calls in sick and the supervisor changes a paycode from regular to sick in the schedule.

Occasionally however there are situations where the changes are indicative of an issue. For example, a supervisor changes a couple of minutes around during the week on an employee’s time card and eliminates premium pay. Or a supervisor changes a schedule after the fact to represent that an employee only worked the hours they were scheduled.

These small changes are usually lost in the millions of transactions that occur throughout the year. And because they are so small, they are usually missed by most reports and audit teams. Only when the employee affected has the courage to speak up does a company become aware of them. By this time the consequences for all involved are significant: from degraded morale on the part of the employee to unnecessary cost in terms of productivity, turnover and financial impact for the company.

As the economy improves and companies feel the pain of turnover and lost performance when employee engagement sags, we have been engaged by companies to understand how they can identify these situations. The companies know the answer is in the data because when someone files a grievance and points out the specific situation and dates, the HR department can immediately see what happened in the transactions.

The challenge is seeing these changes sooner; especially before someone is so frustrated they file a grievance or the behavior becomes obvious to all. This is where one of our data scientists who has a PhD in computer science realized that this is a very similar challenge to what retailers face when they are trying to understand what the millions of customer clicks represent on a website. The customers aren’t telling them why they are clicking the way they do and only a fraction of the clicks result in an order.

So the data scientist applied the same machine learning techniques on timekeeping data that retailers use when they analyze their web server logs. The result of his work however was very difficult to interpret unless you understood machine learning and clustering techniques. To simplify this we had one of our visualization experts re-imagine the output in a way that a lay person could understand. Her interpretation was amazing in its simplicity!
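To give a flavor of the approach (not the actual Workforce Auditor algorithm, which isn't disclosed here), the sketch below clusters supervisors by an assumed two-feature edit profile using a hand-rolled k-means. The features, data and starting centroids are all illustrative.

```python
def kmeans(points, centroids, iters=10):
    """Bare-bones k-means: assign each point to its nearest centroid,
    move each centroid to the mean of its cluster, repeat."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)),
                          key=lambda i: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[i])))
            clusters[nearest].append(p)
        centroids = [tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl
                     else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids, clusters

# Assumed features per supervisor: (edits per timecard, minutes of pay
# removed per edit). Most supervisors edit rarely; one stands out.
supervisors = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.3), (0.1, 0.1), (2.5, 6.0)]
centroids, clusters = kmeans(supervisors, centroids=[(0.0, 0.0), (3.0, 6.0)])
print(clusters[1])   # the outlying cluster worth investigating
```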

Secondly, the data scientist had created a very flexible tool. The first prototype had a number of tuning parameters requiring the user to take output from past results and enter it in to help weight certain parameters for future analysis. We recognized that, aside from a data scientist, we couldn’t expect a typical business user to be able to perform this tuning. So we narrowed what the tool could do and eliminated the tuning parameters.

clustering dashboard

An example of the machine learning dashboard in Workforce Auditor

We were very nervous and excited about analyzing our first data set (we went in without knowing anything about the customer or their practices to ensure we didn’t bias the analysis). When we researched the results, there it was: we had found an issue that was previously unknown to that company. We tried it a second, third and fourth time. Each time we found something important to the customer that they suspected but couldn’t prove, or that they were completely unaware of. These small changes were indicative of million-dollar-plus issues that were looming for these companies but had now been avoided. Very exciting stuff.

We found supervisors gaming schedules to improve their own bonuses (the company since tweaked the rules of the bonus). We found a store manager working extremely hard to rebuild her schedule each week because the forecast and automated schedule she received was off (the company immediately re-tuned the forecast for her store). There were many more examples and we realized that we had developed quite a versatile tool. Its power is that it can evaluate the actions of thousands of employees and narrow it down to just a handful of situations that require further investigation in a matter of minutes.

With so many positive results we fast tracked the technology. It’s now available as Workforce Auditor and is included with our Workforce Analytics platform.

Takeaways from this experience?

1) Skills and experience really count in developing big data applications. No one is going from “Excel guru” to building a machine learning application overnight, and it takes multiple people to get it right.

2) Involving (internal or external) customers and their data is essential; no one could ever build this without deep domain knowledge and many different data sets to trial.

3) By focusing on the business problem rather than the technology, we created something streamlined and easy to use rather than a feature-laden product showcasing the power of machine learning.

When I have a little more time to write, I want to share how the newest member of our team used scheduling data and a network map to uncover undisclosed relationships in a company and what it was costing them….stay tuned!

American Payroll Association Launches its Lean Labor Course

If you are reading this, you are probably aware that I authored a book a couple of years ago by the name of Lean Labor. Its purpose was to apply the philosophy and techniques of Lean specifically to paying and managing the workforce.

I’m pleased to announce that the American Payroll Association (APA) has taken the content from this book and added its own knowledge to create the first Lean Labor educational course. It’s targeted at Payroll Professionals who are looking to run their own operations in a Lean fashion as well as support other areas in their business by providing improved processes and information.

In addition, the APA recruited Dr. Martin Armstrong, Vice President of Payroll Shared Services at Time Warner Cable, to teach the course with me. Martin, who is also a friend, has implemented many of the techniques found in Lean Labor within his own organization and was recognized for this achievement by the APA with a Prism award. Martin recently wrote an article for Training (a leadership development resource) titled The Perfect Paycheck.

We’ve taken the fundamentals of the book and added our experiences over the last three years to provide a practical course in using Lean to improve the performance of an organization driven from a Payroll perspective.

We’ll be teaching the inaugural course July 13-14 in Las Vegas. For more details on the course see the description at the APA site.

Forget 1 + 1 = 3, I’ll show you how 0 + 0 = 1 million

Everyone knows that labor costs show up on the income statement, but the same workforce can’t be found on the balance sheet. It’s considered an intangible asset and does not earn a line on the balance sheet. Did you know that there is another intangible asset that has similar qualities?

It’s the data your company collects every day. Each time an employee fills out or edits a form on a mobile device, laptop, desktop or kiosk, it’s adding to the quantity of data your company owns. This collection of data is relatively expensive and is represented as either a Cost of Goods Sold or an Operating Expense on the income statement chewing away at your profits. Similar to employees it too has no quantified asset value.

CFOs understand both are extraordinarily valuable but also have a difficult time articulating that value as anything more specific than goodwill.

What I have seen in the last year, however, is that an increasing number of companies are putting these two intangible assets together and finding million-dollar insights about their businesses. I am specifically differentiating between companies that have business analysts hard at work tearing apart their data and companies that are empowering a broader population of employees with increasing amounts of information.

The reason that this is important is that business analysts are relatively few and far between. When the broader workforce can begin making data-driven decisions on a daily basis and view situations with perspective and context based on facts rather than relying on their instincts and week old reports, the value creation is exponential.

Why are some doing better than others in capitalizing on this opportunity? Analytics and Big Data are a frequent topic of discussion with every company I visit. Everyone recognizes the potential. But as with most emerging opportunities, most are talking about it and formulating plans. The successful ones are diving in, learning and profiting. Companies that have accelerated their labor analytics journey are finding they can…

  • Identify who is not regularly following the corporate policies put in place, from clocking to editing time sheets to aligning a schedule to demand.
  • Identify employees who have figured out how to manipulate the data they enter so they can game controls and metrics in ways that current reports can’t identify. This results in either fraudulent behavior or boosted performance metrics at someone else’s expense.
  • Identify employees who are not following policy yet are outperforming others, resulting in new best practices.
  • Identify employees who are overwhelmed with manual production and employee scheduling edits due to smaller batches and increases in product mix. This change in production speeds and patterns, with no accompanying improvements in scheduling techniques, is resulting in more unplanned Saturday shifts and overtime to accommodate less-than-optimal schedules.
  • Locate where cost accountants have misallocated labor costs due to a lack of visibility into where and when the costs actually occurred.
  • Rank employees who are receiving volatile schedules from week to week, which can result in increased fatigue and turnover.
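As one concrete illustration of the last item, schedule volatility can be ranked with a measure as simple as the standard deviation of weekly hours. The data and the choice of measure below are my own assumptions; start-time churn or day-of-week changes would be equally valid measures.

```python
from statistics import pstdev

# Invented weekly scheduled hours per employee over four weeks.
weekly_hours = {
    "E01": [40, 40, 38, 40],
    "E02": [16, 44, 8, 40],
    "E03": [32, 30, 34, 31],
}

# Volatility here is the population standard deviation across weeks.
volatility = {emp: pstdev(hours) for emp, hours in weekly_hours.items()}
ranked = sorted(volatility, key=volatility.get, reverse=True)
print(ranked)   # most volatile schedule first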

I didn’t recognize how big this was becoming until I noticed a trend in my conversations with companies that provide strategic consulting. Some of these consultants are telling me their jobs are getting harder and, in some cases, their revenue is falling. Why? One of their bread-and-butter techniques for identifying opportunities is interviewing multiple employees from different departments and then aggregating data from different systems. What they are experiencing is that the “low-hanging fruit” they could always depend on is gone. Customers have already figured it out. The Aha! moments consultants are famous for are getting harder to find with their traditional techniques. They must evolve too.

Where is your company in maximizing its ROIA (Return On Intangible Assets)? The following list highlights some of what the companies that I visit with were experiencing before they changed tactics. To see how you relate, take a survey of your spreadsheets and business analysts to see if you can identify these situations:

  • Is there so much data available that the reports are now being heavily summarized, with detail and history not carried along?
  • Is there significant manual massaging, data wrangling to use a fancy term, to reconcile data from different systems…is this causing latency in delivering the information?
  • Are different functional areas frustrated with the lack of support in getting at data and beginning to start reporting processes of their own, especially as you move to middle management ranks?
  • Have intrepid employees begun adding thousands of lines of macros to spreadsheets to make them automated and interactive to the point that the spreadsheets themselves are unstable?
  • Do spreadsheets going to front line managers have hundreds of rows or more and multiple tabs to make sure they have all the information they need?
  • Do your spreadsheets analyze mainly the data that users enter on forms or through hardware but ignore the data that the hardware and software generates itself in logs and audit trails?
  • Are people frustrated by the fact that they know the data they need is resting on servers within your organization but even your best report writers are not able to put it together to answer the questions and hypotheses posed?

If you see signs of this in your organization, then a review of your reporting processes, skills and tools is in order. You have an opportunity to change the way you think about math.

The case for (and against) predictive analytics

It’s been a busy couple of months, and I’ve been learning quite a bit about business intelligence, big data and the opportunities and challenges in this space.

One area that has been a frequent topic is predictive analytics. As a lean guy, anything that promises improved business results by predicting the future immediately makes me suspicious. I’ve been indoctrinated by the Lean philosophy to depend less on forecasts and more on the ability to observe and react to current demand and disruptions in a process.

That being said, I really depend on weather forecasts to get my kids dressed in the morning, so maybe I need to keep an open mind.

Predictive analytics is the next evolution in a long history of forecasting solutions that technology providers offer. For example, Kronos provides a labor forecast for retailers to help them create a labor schedule for the next week. This can be really helpful to store managers because it is difficult to aggregate all the different patterns and unique events that are significant in scheduling a store. For example, day of the week is a fairly repeatable pattern and can be predicted fairly easily. Black Friday is also consistent. So rather than force the store manager to figure it out on her own, why not automate that in a forecast?
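A toy version of the repeatable-pattern piece of such a forecast simply averages historical labor hours by day of week. This only illustrates why day-of-week patterns are easy to predict; Kronos’ actual forecasting models are far richer, and the data below is invented.

```python
from collections import defaultdict

# Invented history: (weekday, labor hours needed) for past weeks.
history = [("Mon", 100), ("Mon", 110), ("Sat", 200),
           ("Sat", 190), ("Mon", 105)]

totals, counts = defaultdict(float), defaultdict(int)
for day, hours in history:
    totals[day] += hours
    counts[day] += 1

# Forecast each weekday as its historical average -- the repeatable
# pattern a scheduling forecast automates away from the store manager.
forecast = {day: totals[day] / counts[day] for day in totals}
print(forecast)
```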

But ask a store manager if they completely rely on that forecast (or any other vendor’s forecast, for that matter) and they will tell you that they use it for guidance. The reason is that many factors that affect a local store aren’t used as drivers in creating the forecast. For example, if there is construction in the area that makes it more difficult for customers to get to the store, or if a product is out of stock that week, the local manager will know that sales will not meet the forecast. This is why it’s so important to have someone knowledgeable about the local situation with the ability to react quickly to changing conditions.

Predicting levels of absence at a store or plant level is significantly easier than predicting individual absence. Make sure you understand the probability of success in a prediction. If you are providing guidance at the individual level and the probability of a correct prediction is 60%, that means you are wrong 40% of the time. What actions are you asking managers to take based on this prediction, and what’s the impact, financially and in terms of trust in the system, if it’s wrong?
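The arithmetic is worth making explicit. With assumed volumes and costs (all invented for illustration), a 60%-accurate individual-level prediction looks like this:

```python
# Assumed volumes and costs, for illustration only.
accuracy_pct = 60            # individual predictions correct 60% of the time
predictions_per_week = 500   # assumed number of flagged individuals per week
cost_per_wrong_action = 50   # assumed cost, $, of acting on a wrong call

wrong_calls = predictions_per_week * (100 - accuracy_pct) // 100
weekly_error_cost = wrong_calls * cost_per_wrong_action
print(wrong_calls, weekly_error_cost)   # 200 wrong calls, $10,000 per week
```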

A slight improvement over the status quo is good enough if a manager is already making the same decision frequently at an individual level. Hiring guidance for a similar job that has high turnover is a great example of an area where this works. Infrequent decisions, or cases where you are asking someone to take a new action based on a predictive result, are a different story. If the prediction is only a couple of percent better than the current method, internal customers are not going to understand the nuance of the improvement and the project will be difficult to sustain.

When someone is having difficulty at home, it is likely their work will suffer. How is this behavior measured? While there are some outcomes that are measured like increases in absenteeism, these can also be attributed to difficulty with a supervisor or co-worker or a health issue.

Behavior is an area where correlation and causation can easily be confused. While we sense through data that something is wrong, it’s still going to take personal discussion to find out what the cause is. If you try and let software predict what the cause is and the manager takes a wrong action, it won’t be too long before the software isn’t trusted.

It’s very tempting to look for patterns in the data we already have. And no doubt there is lots of value hidden away in there that we have yet to mine. But we need to be careful not to try to solve every problem with existing data. In many cases, new data will be required to capture the true drivers of an event or behavior. This is a much more difficult endeavor.

So is there a future for predictive analytics? Absolutely. We just need to treat it like any other tool in our bag and not go around thinking that every employee problem we face is now a nail for predictive analytics to hammer.

In which areas does predictive analytics excel?

Are there significant consequences for missing something?

Safety is something that comes to mind. If we can improve our ability to predict an increase in safety risk by even a few percent, the savings can be significant in life, limb and dollars. This is a great area for exploring. Part of the solution needs to include understanding the drivers required to reduce the risk once an increase is predicted.

Will improvements in processing power or improved algorithms provide better insight than before?

This is the case for weather prediction. The data and algorithms were overwhelming the processing capabilities. As the capacity to process improved, outcomes improved. This is also the case for customer behavior analysis. With lots of new data, increased granularity and lower processing costs, the game has changed. Do you see similar opportunities with respect to labor analytics in your company? This could be a rich area to explore.

Predictive analytics is an interesting area, but let’s balance these efforts with the basics of making information more available and easier to use. Only then will we truly empower our employees’ decision making capabilities.

India poised for a new age

I just returned from Kronos’ first customer conference in Mumbai. It’s an exciting time for the Kronos India team as they now have over 100 customers locally.

During the week Narendra Modi took office as the new Prime Minister. The media and citizens have an optimistic vibe. Modi’s messages include increased transparency, eliminating nepotism and other forms of corruption, and improved economic conditions. He has a history of welcoming foreign investment and a take-charge attitude. Already there are examples of families of government officials rejecting long-time government perks, telling the media they want to be treated the same as everyone else. Indians are ready and hopeful for some good news.

A traditional and common view of retail in India

I’ve been visiting India for the last seven years. Economically, there have been significant ups and downs over that time. Many of the companies I’ve been visiting have made tremendous progress in how they operate, including how they manage their workforce. Among retailers, there is a growing appreciation for larger chain stores compared to the small stalls that line the streets, each specializing in just a handful of products. While retail chains and larger grocers are still in the minority, they are leaping forward in their thinking.

Last week I spoke with Hemant, Head of Operations for a retailer of electronic and electrical consumer products with over 100 locations across India. It has just completed a transformation from a push to a pull strategy for moving inventory from DCs to stores. As a result, revenue is up because stock-outs are down, and margins are improving because less excess inventory has to be discounted. The savings from fewer return trips of unsold inventory to the DC have more than paid for the incremental expense of smaller, more frequent shipments.

Hemant is now turning his sights to the workforce. He recognizes that pursuing a low-cost labor strategy won’t work. “How do we differentiate our stores when our competitors have the same products with the same types of employees? We need to pay more for highly skilled employees to guide our customers to the product that is right for them.” To pay for this increase in skill, Hemant wants to make sure staff are scheduled when customers are in the store. This is more difficult than for most U.S. retailers because his employees are all full-time. Increased utilization isn’t even the main priority. Hemant continues: “During slow periods, when employees have completed some training and refreshed inventory and still have time on their hands, they become bored and sluggish. It’s tough for them to get their energy back when customers begin entering the store again. It’s important to make sure they stay busy and energized throughout the day.”

A modern retail chain in India

I visited a number of manufacturers, and there is a widening gap in their approaches to labor. The head of HR at one large textile exporter felt very strongly that there is no place for technology in managing people. According to this executive, supervisors manage the 17,000 people at one plant just fine; if there is a problem, adding a couple of extra people is no issue because wages are low. He did, however, acknowledge a machine utilization problem, which is being addressed by adding sensors that let management know when a machine goes down. I’m looking forward to visiting him again to see if his perspective changes.

Diametrically opposed to that perspective is a manufacturer of cellular phones that uses technology to analyze supervisor behavior: are supervisors favoring one gender over the other when scheduling overtime, or showing favoritism when granting leave requests? This company has also identified the 250 employees (out of 20,000) who are critical to keeping the lines moving, and it knows instantly if any of them is late for work so management can begin reacting right away.
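As a rough sketch of how the overtime-fairness analysis might work (the record format, names, and numbers here are hypothetical, not from the company described above), overtime hours can be aggregated per supervisor and per gender, and heavily skewed allocations flagged for review:

```python
from collections import defaultdict

# Hypothetical overtime records: (supervisor, employee_gender, overtime_hours)
records = [
    ("supervisor_a", "F", 2.0), ("supervisor_a", "M", 8.0),
    ("supervisor_a", "M", 6.0), ("supervisor_b", "F", 5.0),
    ("supervisor_b", "M", 4.0),
]

# Sum overtime hours by supervisor and gender
totals = defaultdict(lambda: defaultdict(float))
for supervisor, gender, hours in records:
    totals[supervisor][gender] += hours

# Report each supervisor's share of overtime by gender; a consistently
# lopsided split is a prompt for a closer look, not proof of bias.
for supervisor, by_gender in sorted(totals.items()):
    total = sum(by_gender.values())
    shares = {g: h / total for g, h in sorted(by_gender.items())}
    print(supervisor, shares)
```

In practice the raw records would come from the workforce management system rather than a hand-typed list, and the comparison would control for shift patterns and headcount.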

There is no shortage of talent in India, let’s hope Prime Minister Modi is successful in his efforts so that India’s talent can be converted to economic success.

Those office workers have it made

How frustrating to walk off the production floor after a grueling variance meeting and see office workers surfing the web or standing around having a cup of coffee. Even worse, urgent engineering change or variance requests disappear into the ether unless they are constantly expedited. How about customers waiting for weeks to obtain an answer to why a product failed in the field? Should they change their installation practices? Was it a material problem?

Increasingly, I’m being asked about back-office processes. If it were just a few people causing inefficiency in a department, that would be an easy fix. But it is never quite that simple. There seem to be waves of busy times followed by periods of slow time, and depending on who does the work (we all have our go-to people at corporate), the outcomes and response times can be very different.

These are typical scenarios:

Recently I was taking a plant tour with a large Auto OEM and the managing director felt that their production process was in pretty good shape. He then opened a door to a large room full of QC engineers and stated that he wasn’t sure if he had too many, too few or just the right number of engineers working.

A couple of weeks ago I was speaking at a workforce management seminar in Belgium and a manager from a large pharma manufacturer asked about improving the productivity of their QC department whose employees work in a lab environment.

I always ask the same questions. How volatile is the work? Is it slow at times and busy at others? Do both small jobs and big jobs flow through the department? Does work get re-prioritized based on production or customer demand? Are the people working in the department reasonably happy and skilled?

The answer is generally yes to all four. What I then draw is the curve described by the Kingman formula, which shows the relationship between utilization and wait time. The drivers of the function are the variability in arrival times and the variability in cycle times.

A good introduction to the formula is available on Wikipedia.

An example of the curve is shown below. What is immediately obvious is that wait times increase dramatically as utilization approaches 100%. The result of this is that departments with little control over the variability of demand and cycle times must run at lower utilization rates in order to maintain acceptable service levels.

Kingman’s formula: wait time vs. utilization
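To make the shape of that curve concrete, here is a minimal sketch of Kingman’s G/G/1 approximation in Python. The parameter values are illustrative only; with the coefficients of variation and mean cycle time set to 1, the waiting time simplifies to utilization divided by idle fraction:

```python
# Kingman's G/G/1 approximation for mean time spent waiting in queue.
#   rho: utilization (0 < rho < 1)
#   ca, cs: coefficients of variation of inter-arrival and service times
#   tau: mean cycle (service) time
def kingman_wait(rho, ca, cs, tau):
    return (rho / (1.0 - rho)) * ((ca**2 + cs**2) / 2.0) * tau

# With ca = cs = 1 and tau = 1, the wait grows as rho / (1 - rho):
for rho in (0.50, 0.80, 0.90, 0.95, 0.99):
    wait = kingman_wait(rho, 1.0, 1.0, 1.0)
    print(f"{rho:.0%} utilization -> wait ~ {wait:.1f}x cycle time")
# 50% -> 1.0x, 80% -> 4.0x, 90% -> 9.0x, 95% -> 19.0x, 99% -> 99.0x
```

Notice the non-linearity: moving a department from 90% to 99% utilization multiplies the expected wait roughly elevenfold, while halving the variability (the `ca`/`cs` terms) cuts the wait in half at any utilization.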

And just because a process is documented and looks efficient doesn’t mean that the variability has been driven out of it. Just like the routing on a production floor, an office process is generally documented assuming perfect conditions: open capacity, consistent workload, fully skilled employees, and no interruptions. In other words, it assumes perfect standardization.

We are all familiar with situations where high process variability, combined with financial pressure for high utilization, causes long wait times. The doctor’s office is a familiar example. Patients with appointments at the end of the day can end up waiting 30 minutes or more past their scheduled time. It’s tough for the office to schedule the right amount of time for each patient because it’s difficult to know what care each patient will require, and the office also has to deal with patients arriving late. But if it doesn’t book the schedule pretty full, the office can’t be run profitably. A doctor’s office is a relatively simple example, and many have implemented fixes that maintain service levels at high utilization: canceling appointments for patients who arrive late, adding flexible capacity with Nurse Practitioners, and scheduling different amounts of time based on the predicted effort for each patient.

Knowing that standardizing the process and shaping the arrival times of work will help maintain service levels while allowing utilization to increase, the next challenge is identifying where the variability is greatest. The first place to look is with the employees themselves. That’s a good start; take whatever improvements can be identified there. The next level gets harder. Workflows often cross departments, where priorities change and individuals don’t have knowledge of the entire process. This is where some data is required, and it is often where improvement efforts slow down. Collecting data around office processes can be challenging; between employee resistance and complex flows, it becomes difficult to know what to track.

One change in the environment that seems to be underutilized is the move to electronic records. From engineering changes to electronic lab notebooks to CAD systems to document management, employees are logging on to applications, doing their work, and logging off or checking in another piece of work. As compliance around every aspect of our lives continues to increase, technology producers are tracking our every move to produce a record of what we have done and when. Often this information goes unused unless it is needed for some type of inquiry.

This information is an untapped goldmine for understanding workflow in the office. The electronic trail ties together the work, the employee, and the time. With it you can generate metrics to understand when it is busy, when it is slow, and how long different types of work take to traverse a process. Connect this information to a workforce management system and you go from understanding what happened to predicting issues and addressing them immediately, shifting capacity to where it’s needed and prioritizing work before it is late.
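A minimal sketch of what mining that electronic trail could look like (the log format, item IDs, and timestamps here are hypothetical): pair each check-out event with its matching check-in to recover how long each piece of work spent in process.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical application log: (employee, work item, event, timestamp)
events = [
    ("alice", "ECO-101", "check_out", "2014-06-02 09:00"),
    ("bob",   "ECO-102", "check_out", "2014-06-02 09:15"),
    ("alice", "ECO-101", "check_in",  "2014-06-02 11:30"),
    ("bob",   "ECO-102", "check_in",  "2014-06-02 14:45"),
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

# Pair each check_out with its matching check_in to get a cycle time
open_items = {}
cycle_hours = defaultdict(float)
for employee, item, event, ts in events:
    if event == "check_out":
        open_items[item] = parse(ts)
    elif event == "check_in" and item in open_items:
        delta = parse(ts) - open_items.pop(item)
        cycle_hours[item] = delta.total_seconds() / 3600.0

for item, hours in sorted(cycle_hours.items()):
    print(f"{item}: {hours:.1f} hours in process")
```

Aggregated over weeks of real log data, the same pairing yields arrival-time and cycle-time distributions per work type — exactly the variability inputs the Kingman curve needs.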

As with production, the idea is not to work the office staff harder. It’s to improve and standardize the workflow through the office so service levels improve without adding capacity. Imagine lab results returning faster, engineering requests approved on time without follow-up, and customer inquiries and complaints answered more quickly. How would this impact your production lead times and competitive stance in the market? All with no increase in labor cost. Kingman has done the hard part by showing the relationship between utilization and service times, and therefore what the ROI of reducing variability will be. Now it’s your turn to drive variability out of the office.