Businesses that want happy customers in 2018 need to commit to delivering near-immediate access to products and personalized services, and a reliable, efficient supply chain is the best way to consistently meet that demand. KFC recently learned this lesson the hard way; the global fast-food giant had to close 750 UK stores due to a shortage of chicken. The popular fried-chicken chain had recently overhauled its logistics and distribution model, moving away from smaller, localized distributors in favor of a centralized warehouse intended to cover the whole of the United Kingdom. The model works for similar businesses, but KFC ran into issues with the new approach, and its inability (or unwillingness) to vet the new supply chain model cost it dearly.
KFC could have avoided a crisis like this by leveraging big data analytics. Companies are learning to rely on digital technologies to streamline, optimize and improve their internal processes. Digital transformation is an imperative for almost every line of business and, in some cases, it has completely disrupted industries. The “Amazon effect” has taught consumers to expect fast, reliable access to goods, and it has influenced customer expectations across nearly every sector, including fast food. Same-day delivery, groceries on demand and ridesharing at the touch of a button have become everyday expectations. From the moment they place an order, people want more product choice, cost effectiveness, convenience and full visibility.

If there’s a problem, businesses should resolve it as quickly and painlessly as possible, ideally without the end consumer ever knowing there was one. Customers don’t have the patience to be passed from department to department, re-explaining the problem at every step. In this culture of immediacy, businesses facing increased scrutiny have to treat their internal processes as the key to meeting, and exceeding, customer expectations. While a single deviation from a standard process may seem to cause only a minute change, the aggregate impact of many small deviations almost always makes a significant dent in a company’s efficiency and, in turn, its bottom line.
Identifying and Addressing Inefficiencies
Of course, process optimization is nothing new for companies. Every business wants to save time and money by working more efficiently. With the increase in digitization, more and more business processes are being driven by IT systems. These systems generate enormous volumes of data and, as a result, many businesses are sitting on goldmines of information — but without the right tools in place, they don’t know how to make sense of it.
To analyze and improve core operations such as purchasing, logistics and production, companies have traditionally turned to management consultants. This approach can uncover patterns of inefficiency and noncompliance, ranging from heavy process rework to duplicate payments, but consulting can also be slow and very costly. In addition, external consultants typically rely heavily on workshops and interviews with the existing operations teams to collect information and provide context, which often causes significant disruption to the organization while the processes are being analyzed. Manual process discovery also carries an inherent risk of bias and subjectivity; when a member of staff suspects their performance is coming under a microscope, their answers to important questions have a tendency to shift.
Creating Process Transparency
Finding a solution to a clear-cut problem can be difficult, but finding a solution without a clear-cut problem is next to impossible. Identifying issues within core processes can be like finding a needle in a haystack. Fortunately, today’s analytics solutions are powerful enough to help even the world’s largest enterprises sift through massive amounts of data, identify unseen patterns and correlations, and make prescriptive recommendations to improve processes. Armed with these previously hidden insights, companies can make more informed decisions and better satisfy customer demand. One category that delivers such insights is Process Mining, a new form of big data analytics software that helps businesses highlight process deviations and pinpoint inefficiencies within their core operations.
Process Mining technology uses the digital traces left behind by every IT-driven operation in a company and provides complete transparency into how business processes are operating in reality. With a powerful analytics engine under the hood and an intuitive user interface, Process Mining technology takes the event-log data that exists within every IT system and reconstructs business processes in real time, allowing users to quickly and easily identify opportunities for process improvement. Some advanced Process Mining solutions take things a step further: using the latest machine learning algorithms, the system acts as an automated business consultant, scanning processes for improvement opportunities and making proactive recommendations to end users. Using this insight, process owners, CFOs, logistics managers and heads of purchasing can accurately analyze their core operations and quickly identify the root causes of delays or bottlenecks. This new capability enables business and IT leaders to address inefficiencies and noncompliance issues at their core, and to make more informed decisions about which areas need the most improvement.
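To make the idea concrete, here is a minimal sketch of the core mechanism described above: reconstructing a process from event-log data and surfacing bottlenecks. The event log, case IDs, activity names and timings below are all invented for illustration, and real Process Mining products are far more sophisticated, but the principle is the same: group events by case, count which activity directly follows which, and measure how long each hand-off takes.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical purchase-order event log. Each row is
# (case_id, activity, timestamp) — the minimal schema process
# mining works from. "Change Price" in PO-2 is a rework deviation.
event_log = [
    ("PO-1", "Create Order",  "2018-03-01 09:00"),
    ("PO-1", "Approve Order", "2018-03-01 10:30"),
    ("PO-1", "Ship Goods",    "2018-03-02 08:00"),
    ("PO-2", "Create Order",  "2018-03-01 11:00"),
    ("PO-2", "Change Price",  "2018-03-03 09:00"),
    ("PO-2", "Approve Order", "2018-03-03 14:00"),
    ("PO-2", "Ship Goods",    "2018-03-04 10:00"),
]

def discover_process(log):
    """Build a directly-follows model: for every pair of consecutive
    activities within a case, count occurrences and accumulate the
    waiting time between them."""
    cases = defaultdict(list)
    for case_id, activity, ts in log:
        cases[case_id].append((datetime.strptime(ts, "%Y-%m-%d %H:%M"), activity))

    counts = defaultdict(int)
    total_hours = defaultdict(float)
    for events in cases.values():
        events.sort()  # order each case chronologically
        for (t1, a1), (t2, a2) in zip(events, events[1:]):
            counts[(a1, a2)] += 1
            total_hours[(a1, a2)] += (t2 - t1).total_seconds() / 3600

    # Map each transition to (frequency, average hours); the slowest
    # transitions are the bottleneck candidates.
    return {edge: (n, total_hours[edge] / n) for edge, n in counts.items()}

model = discover_process(event_log)
for (src, dst), (n, avg_h) in sorted(model.items(), key=lambda kv: -kv[1][1]):
    print(f"{src} -> {dst}: {n}x, avg {avg_h:.1f}h")
```

On this toy log, the rework path (Create Order to Change Price) shows the longest average wait, which is exactly the kind of deviation a process owner would want flagged.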
In the age of digitalization, staying ahead of the competition means analyzing internal data to determine the most efficient business processes possible. With an improved understanding of how their organizations are operating in reality, businesses will be able to avoid previously unforeseeable issues and be better positioned to meet or surpass customer expectations — because what’s a KFC without chicken?
Alexander Rinke is Co-Founder and Co-CEO of Celonis.