For the first time I attended Process Mining Camp, where I also hosted an exploratory session on Social Mining, which I will cover in a separate post.
Process Mining Camp is organized by Fluxicon, Process Sphere’s partner, and is dedicated to bringing a mix of something old (the concepts, important for those putting their hands on process mining for the first time), case studies (evidence that process mining can be applied to multiple challenges) and something new (future trends that may be just around the corner in the next couple of years).
If there is one event you should attend to actually learn about process mining and how it is making a difference, helping organizations change and adapt faster, this is the one to save the date for.
This is my take on the event.
Anne Rozinat – Opening Keynote
Anne briefly presented some numbers about the process mining community, which includes end users and researchers. What is evident is that the figures are growing and that the discipline should be taken more seriously by industry analysts (even though Gartner has already put it on the radar).
Then she jumped to an important concept: taking advantage of maps to visualize and interpret data. She introduced the work of Denis Wood, which was brand new to me; I’m more familiar with the concepts of Edward Tufte and my countryman Manuel Lima. Denis made a very important contribution to the way data is visualized, making it meaningful enough for anyone to interpret. In his early days, as a geographer, he designed maps around traffic, power grids, social ties, mail delivery, energy consumption and more. The point was how we can extract information from data to make decisions, considering how difficult data is to analyze, particularly in an era where processes are much harder to analyze because of factors like socialization, exception handling and real-time adaptation.

Those who grimly continue to defend that there is only one way to analyze a process map, say using BPMN notation, are still in denial that this or any other process notation sometimes lacks other important dimensions, for example distance or time. What I take from Anne’s presentation is that there is no single best process model notation, because it depends on the analysis perspective. This is important because it’s clear we need to deal with variety. Sometimes we need to create other ways to discover and understand the process: if we are exploring a social network, how can such a process be modeled with BPMN-like models, or how can we identify the dogs that bark and probably bite (a hilarious Denis Wood concept) when the mailman goes to deliver the mail?
Tijn van der Heijden – Rabobank’s case study
The case study presented was about supplier invoice payment. There was real value presented here: the actual process map showed that when suppliers send invoices for the first time, the process gets fuzzy and does not work as designed. This is an important conclusion, because most people think it is easy in theory to analyze structured processes, especially when they are mostly automated and the work being executed mirrors the automation. Well, it is not.
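The gap between the designed process and the fuzzy reality can be made visible by grouping an event log into variants (distinct activity sequences) and comparing them against the intended path. Below is a minimal sketch of that idea; the activity names and event log are illustrative, not Rabobank’s actual process or data.

```python
from collections import Counter

# Hypothetical event log: (case id, activity) pairs, already ordered by time.
event_log = [
    ("c1", "Receive invoice"), ("c1", "Match PO"), ("c1", "Approve"), ("c1", "Pay"),
    ("c2", "Receive invoice"), ("c2", "Register supplier"), ("c2", "Match PO"),
    ("c2", "Clarify details"), ("c2", "Approve"), ("c2", "Pay"),
    ("c3", "Receive invoice"), ("c3", "Match PO"), ("c3", "Approve"), ("c3", "Pay"),
]

# Group activities per case to obtain each case's variant (its activity sequence).
traces = {}
for case, activity in event_log:
    traces.setdefault(case, []).append(activity)

variants = Counter(tuple(t) for t in traces.values())

# Compare each observed variant against the designed "happy path".
designed = ("Receive invoice", "Match PO", "Approve", "Pay")
for variant, count in variants.most_common():
    status = "as designed" if variant == designed else "deviation"
    print(count, status, " -> ".join(variant))
```

Real process mining tools do far more than this, but even such a crude variant count already shows which share of cases (here, the first-time suppliers) leaves the designed flow.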
Lalit Wangikar – Process Mining for operational performance improvement case study
Lalit travelled to the other extreme, around ITIL process types, but in this case the customer stood up and said: our processes are unique. We have a job-shop approach. This is not an assembly line, so these processes are difficult to analyze, he said. The approach was to collect a huge amount of data from e-mail and phone logs, which is easier to implement in the USA than in Europe, where legislation forbids the employer from accessing information stored on a computer, even if the computer is the employer’s property.
Philipp Horn – Purchasing Process in Volkswagen case study
I liked the point about the importance of procurement at the Volkswagen group. 60% of the money flows are related to supplier operations, with lots of differentiation among car brands, but also with a bunch of shared components where synergies occur in the process.
One important practice presented was running a workshop for buy-in and acquaintance with the process mining approach before the project starts (where have I heard this story before…).
Another point was related to how to get to the root causes of the problems or bottlenecks in the process. In their case the approach was based more on serendipity, which proved to work best, because sometimes people disperse and present very different ideas, making them difficult to structure.
He also pointed out that, as far as I understood, in some cases it was possible to identify that the wrong participant was doing something in the process. This is critical in these days when we are overwhelmed with information to process, some of which makes us waste time and be unproductive. If this kind of challenge exists in big companies, where it is virtually impossible to assess everybody and managers want to make the most out of their human resources, the realignment of a workforce can be done using process mining.
Michael Cunningham – Suncorp insurance company case study
Michael talked about a project at Suncorp in Australia. The process that was improved was related to managing incident claims: 600,000 claims a year, or roughly one claim per minute.
I considered that the most important part of the presentation, besides the results, was the project’s outcome regarding people vs. culture (like an ad-hoc maturity model). He pointed out three different stages:
- Understanding – show, don’t tell, but it’s real.
- Competent – reality vs. preconceptions:
  - That’s not right;
  - I don’t trust it.
- Transformed – process models are just an idealized high-level view.
I find this approach very practical and workable, because to make the case for process mining you need to reach the transformed state. As someone who has experienced these kinds of reactions from customers, I assure you that you don’t need to buy a maturity model from an analyst firm to figure out whether your practice has reached the right maturity level. I would only add a first stage: Unbeliever – we don’t have data.
Walter Vanherle – Case study on Security services
Process mining is really beautiful, but from a manager’s buy-in perspective, do not present what they already know; show them what they don’t know and probably won’t like looking at, because that is where the improvement opportunities come from.
Walter pointed out an important technical aspect about data quality: the timestamps. The stamps from the system used to support the service differed from those of the mobile phones used by the security personnel where data was being recorded, making it difficult to be sure that the SLAs were being met, with consequences for everything related to contract management.
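When the same event is recorded by two sources with unsynchronized clocks, a simple first diagnostic is to compute the offset between the paired timestamps: a stable median offset suggests systematic clock skew that can be corrected before checking SLAs, while widely scattered offsets point to a deeper quality problem. A minimal sketch, with invented timestamps standing in for the back-office system and the guards’ phones:

```python
from datetime import datetime
from statistics import median

# Hypothetical pairs of timestamps for the *same* event, recorded once by the
# back-office system and once by a guard's mobile phone (illustrative data).
pairs = [
    ("2019-06-20 08:00:05", "2019-06-20 08:03:10"),
    ("2019-06-20 09:15:00", "2019-06-20 09:18:02"),
    ("2019-06-20 10:30:30", "2019-06-20 10:33:35"),
]

fmt = "%Y-%m-%d %H:%M:%S"
offsets = [
    (datetime.strptime(mobile, fmt) - datetime.strptime(system, fmt)).total_seconds()
    for system, mobile in pairs
]

# The median is robust to the occasional wildly wrong record.
skew = median(offsets)
spread = max(offsets) - min(offsets)
print(f"estimated clock skew: {skew:.0f}s, spread: {spread:.0f}s")
```

If the spread is small relative to the SLA granularity, subtracting the median skew from one source may be enough to make the two logs comparable.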
Youri Soons – Case study Auditdienst Rijk
Youri presented the application of process mining from an auditing perspective. The Dutch National Auditing Service monitors the annual reports of all Dutch ministries and provides assurance on the financial statements included in them.
Apart from the case study, he made a point through a live demo of showing how it is possible to be 100% sure that segregation of duties was in place and that the controls existed. I’m very keen on compliance: as someone who in the past was responsible for audits, I always had the impression that something could be missing because of the sampling principle. Youri clearly showed how it is possible to quickly discover which instances (out of thousands) did not comply with the segregation of duties.
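The reason this beats sampling is that a segregation-of-duties rule can be checked exhaustively against the event log: every case is tested, not a sample. A minimal sketch of the idea, with a made-up rule ("the same person must not both register and approve an invoice in one case") and invented log rows:

```python
# Hypothetical event log rows: (case id, activity, resource).
events = [
    ("inv-1", "Register invoice", "alice"),
    ("inv-1", "Approve invoice", "bob"),
    ("inv-2", "Register invoice", "carol"),
    ("inv-2", "Approve invoice", "carol"),   # same person on both steps
    ("inv-3", "Register invoice", "dave"),
    ("inv-3", "Approve invoice", "erin"),
]

# For each case, collect who performed each restricted activity.
performed = {}
for case, activity, resource in events:
    performed.setdefault(case, {}).setdefault(activity, set()).add(resource)

# A case violates the rule if any resource appears on both restricted steps.
violations = [
    case
    for case, acts in performed.items()
    if acts.get("Register invoice", set()) & acts.get("Approve invoice", set())
]
print("non-compliant cases:", violations)
```

The auditor then only needs to inspect the (hopefully short) list of violating cases instead of trusting a sample.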
Wil van der Aalst – Closing Keynote
Wil went through a historical journey around process mining. Since I’m addicted to the theme, I will fast-forward through most of the presentation (Wil actually joked at the beginning that giving a presentation on the history of process mining could suggest that his work in the discipline was over, when he still has much more to give).
One interesting point was that data, models and systems coexist and are difficult to separate (for this purpose, go study the viable system concept), which is the reason there is so much confusion in BPM when trying to break apart the pieces that make up the BPM philosophy.
Hence, for me, the key points were mostly about the future challenges around the big data effect: every two years everything doubles, meaning the challenge is turning event data into real value, since the question is how we know whether the data we are using covers only a fraction of all possible behavior (think for a minute about data related to social interaction among end users dispersed across a multitude of systems, some or most of them out of the control of the agent running the process). One of the dangers is to generalize manually.
Again, on the big data effect, and before jumping to the criticism that this is the reason process mining could lose its edge, since it cannot sit on the entire flood of data where the action occurs, I would conclude that it is better to be exposed to part of the reality than to live totally in ignorance and never have a chance to transform your organization.