The worn-out effect
Time and time again, once a year, the choir of BPM unbelievers rises and starts promoting the idea that BPM is dead (much like those who promote disconnecting from Facebook). The communication plan is typically aligned with the spring/summer conference season. Those who like to announce that BPM has passed away misinterpret what BPM is all about.
For historical reasons, BPM is often seen as a choice between ways to improve operations using quantitative methods (based on process data) or qualitative improvement methods (coming mostly from quality management), without using any kind of technology. Practitioners of this school argue that, before laying down technology, every available human-centered approach must be used to make the most of business processes and achieve the desired results. Typically the path is: discover the processes and build a value chain, model the core business processes, define key performance indicators, start measuring, and, once there is data to look at, identify where the pain points are and start implementing improvement programs. I would like to point out that, with the increasing pace of change, following such an approach can lead to business stagnation, while you watch the competition take your place in the market as they find other, innovative value propositions.

On the other hand, BPM is also seen as a software package (the BPMS, as it is called) for workflow automation, capable of automating every piece of work performed inside companies. Because workflow systems were built to standardize business processes and ensure that the very same result is attained every time the process is executed, independently of who executes it, where, and how, they failed to retain an important implementation aspect: maximizing the use of human knowledge instead of projecting those capacities onto technology. Despite the fact that in the last couple of years the nature of BPM systems has changed, allowing them to support knowledge-intensive processes, managers still like to enforce a single way of performing. Believe it or not, most people interested in the subject define BPM as something covered by only one of the sub-domains presented above: purely operations or purely automation technology.
Professionals and others involved in improvement and change initiatives have experienced that when a non-systems-thinking approach is implemented, that is, when only one of the paths is pursued (operational excellence or automation), improvement becomes much more difficult in the middle of the journey. If in the beginning the gains were big and visible, eventually it becomes harder and harder to stay ahead of the curve. Companies that are more operations-focused try to invest more in the professional qualification of their human resources (more Six Sigma training and more black belts); the ones that are more technically oriented keep updating their systems (continuing to pave the cow path with fresh technology), if there is budget for it. Still, the big gains and big results no longer come. This is when the worn-out effect starts to appear.
The worn-out effect is a function that can be depicted just like a hysteresis curve. When a magnetic field (an improvement project) is applied to a ferromagnet such as iron (the enterprise), the atomic dipoles (the processes) align themselves with it. Even when the field is removed, part of the alignment is retained: the enterprise has become bpmtized (magnetized). Even if you do not understand material properties and physics, you may notice that the flux density increases in proportion to the field strength until it reaches a certain value where it cannot increase any more, becoming almost level and constant as the field strength continues to increase (the worn-out effect). This is because there is a limit to the amount of flux density the core can generate once all the domains in the iron (the enterprise) are perfectly aligned. Any further increase will have no effect.
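The leveling-off described above can be sketched with a simple saturation function. The choice of tanh and the specific numbers below are illustrative assumptions, not a model of any real enterprise; the point is only that each extra unit of improvement effort yields a smaller gain as saturation approaches:

```python
import math

def flux_density(field_strength, saturation=1.0, scale=2.0):
    """Saturation curve: the "flux density" (realized improvement) rises
    with "field strength" (improvement effort), then levels off once the
    material (the enterprise) is saturated."""
    return saturation * math.tanh(field_strength / scale)

# Marginal gain of each additional unit of improvement effort:
# large at first, then shrinking toward zero (the worn-out effect).
gains = [flux_density(h + 1) - flux_density(h) for h in range(6)]
```

Plotting `flux_density` over a range of effort values produces the familiar S-shaped approach to a ceiling: early projects deliver visible jumps, later ones barely move the curve.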
Source: adapted from Wikipedia
Re-engineering is driven by technology
Not long ago, maybe five years ago, there was a lot of discussion about the lack of coverage of wireless networks and its impact on information access (to the point that some argued wireless could remain something proprietary to business premises). In those days, magazine editors, opinion makers, and mavens discussed important points like coffee shops' strategy of adopting, or not, wireless networks as an asset to communicate with consumers: a kind of short-sighted vision, since consumers at coffee shops are there to rent a table for a few minutes or a few hours, depending on how they want to spend their time, whether to drink a cup of coffee and eat a donut, read a book, do some work, or socialize with others. In those days there was also a belief that mobile internet was compromised because it was expensive and, as such, consumers were adopting it very slowly. Borrowing Nassim Taleb's words from his latest book, Antifragile, these kinds of assumptions are typical of the Fat Tony fragilista. Marketers never understood that consumers don't want corporate information, they want product discounts, and that the human being, as the animal it is, likes to socialize as a condition of its own existence. As someone who remembers the past importance of keeping in touch through the mail, I see nothing surprising in the fact that, in rural areas, grandmothers learn how to use social networks to stay in touch with their loved ones. Regarding the price theory of products and services, everything has been studied; again, I do not see why something in early adoption can't have a high cost that is skimmed in layers as it becomes mainstream.
Fat Tony fragilistas have been discussing the propeller components that are sparking new ways of structuring operations and performing work: the so-called perfect storm of social, mobile, cloud, and information. But as Michael Hammer pointed out in his seminal article "Reengineering Work: Don't Automate, Obliterate": "The usual methods for boosting performance—process rationalization and automation—haven't yielded the dramatic improvements companies need. In particular, heavy investments in information technology have delivered disappointing results—largely because companies tend to use technology to mechanize old ways of doing business. They leave the existing processes intact and use computers simply to speed them up." Curiously, the Cloud Standards Customer Council document Social Business in the Cloud: Achieving Measurable Business Results confirms, right at the beginning, this tendency to cook up and design operations on top of the propeller: "Some organizations, usually with strong business and IT alignment, are realizing that social technologies initiated for one business activity can spark innovative uses in other areas. This insight creates motivation to widen the reach and adoption of these investments. Unfortunately, this is not always the case; many organizations are still struggling with how to justify social business investments in a way that reflects their actual business value. This inability to quantify the business impact of social technology has become a key inhibitor to adoption. Further exacerbating the challenge is the perceived complexity of combining social technologies with investments in adjacent technologies like cloud, big data, and mobile computing."
As the engineer I am, I have always found it dangerous to keep evolving a product based on the same design principles, because a product, as the system it is (like a process), has a predefined function constrained by a set of components that are connected and behave so as to ensure the function is attained. Those components have their own life cycle, meaning that at some point in time they will become outdated and will underperform compared with new components built on top of new technology developments. Also, a system is designed taking into consideration a set of guidelines that are a function of the context in which it operates (for example, some watches are designed for diving and others are not, despite having the same function: to tell you the time). If you look at how today's new mobile phone models lack the wow factor introduced by Apple when it released the iPhone, and all the copies that followed, you understand where I'm coming from. Mobile phones, like processes, also suffer from the hysteresis effect at some point in time, and, contrary to some past assumptions and values, I no longer believe you can do enterprise architecture from a straight top-to-bottom approach. Sometimes you should look at the application and infrastructure layers, delete all those ArchiMate objects and the like, and simply deploy new technology, because from the very first minute you are winning. I am not saying that the good practice of making impact analyses (for example, if we change an application's web services, what is the impact on the overall application landscape?) is useless. That work is important for application maintenance, evolution, coherence, and daily management, but not for disruption, because you get tied down by all the relationships your enterprise architecture was built on top of. In other words, those models limit the capacity to think radically out of the box and innovate.
Paul Harmon, known for having a balanced, classic, even conservative vision, points out:
“IT is no longer a service – it has become the essence of the company’s strategy. Companies no longer worry about reengineering major processes and are more likely to consider getting out of an entire line of business and jumping into an entirely new line of business to take advantage of an emerging development in information or communication technology.”
Just think about what happened with Zipcar. Zipcar was bought by Avis at the beginning of 2013. Zipcar was considered a non-business, a non-threat, a nothing, until it started to eat the market share of classic rent-a-car companies. Zipcar had a very different business model. Charging was based on pay-as-you-go, and it is even able, somehow, to compete with taxi services if the areas where you plan to travel are covered by Zipcar stations. But more than a simple business model for the user, it relied on technology for car location and tracking, car access (locking/unlocking the car), filling the fuel tank, and ultimately all the customer relationship interaction. It designed its business processes in such a way that most of the activities were transferred to the customer (who performs 99.99% of the tasks), along with the controls (the customer checks for dents and vehicle malfunctions). Technology is making it possible not only to reinvent business models, but also to start new ventures without large capital investments.
The rise of bottom-up technology
In shelving humans beyond automating repetitive tasks
Gartner points out that by 2016 financial institutions will run their business mostly on the cloud. Gartner believes the driver is cost: banks' margins are decreasing and it is necessary to find ways to cut costs. Cloud technology erases the high upfront costs of licensing technology and maintaining an IT infrastructure, and the IT infrastructure is the ultimate backbone of this industry sector. Yesterday, today, and tomorrow we are experiencing exponential growth in technology delivery, not only because of Moore's Law, but most likely because banks (as well as military and defense) demanded fast processing speeds, global communications, and smarter technology to support their operations as an effect of globalization. Enterprise technology is just a derivative of the needs and wants of financial institutions. Hence it is true that banks and insurance companies can operate at a fraction of the cost if they run on top of cloud deployments, but that is a shortsighted vision of what the customer relationship will be in the near future.
A technological revolution is under way in banking, with systems lined up to slowly replace humans in branches. This is an interesting point of view, considering that the relationship we build with account managers is based on trust (like the one with a medical doctor). Our savings are the result of our working effort; they are an extension of ourselves, a reward for the ability to execute, given the environment you chose to live in and how you chose to live (you can decide to pursue an affluent way of living or a more modest one, but either way your savings are a reflection of the decisions you made in the past and that you count on). For this reason we believe that, when dealing with an account manager, we are putting our fate in his hands. What is interesting about this shift is that the hands and the heart of the account manager are going to be replaced by systems. We will probably still be able to talk with him by video conference, but it is very unlikely we will have the face-to-face interaction we had in the past, as we allow technology to perversely substitute for our good account manager. This revolution, heading toward the virtualization of banking operations, will change forever the way we interact with banks. Instead of narrowing in on cost cutting, managers should start thinking that this operational revolution is heading toward a dehumanized, self-service type of relationship.
In putting customers at the center
I never quite understood why there is a market that creates clutter in the way we interact with information. Why do we want a smart watch or bracelet to perform the very same actions we can do on a smartphone, which is a much more convenient way to do them? Regarding glasses, like the Google Glass project, and what is becoming real-time life experience, we are facing a paradigm change that can ultimately gain traction on the consumer side, if this new kind of life is what humans want to embrace. Nevertheless, let the market decide. The rise of wearable tech is opening new data sources of signal emission and reception, widening the big data pipeline and the internet-of-things class of devices.
Machine-to-machine communication is becoming widespread, backed by the introduction of new embedded sensors and communication ports, or by the overhaul of existing systems so they can now communicate, something that was not possible when they were put into operation years ago. Utility companies are making an effort to deploy devices on energy transmission lines to monitor equipment reliability and prevent breakdowns and disruptions. At the same time, the emergence of data interoperability standards is finally allowing the internet-of-things philosophy to be deployed. This makes it possible, for example, to put predictive maintenance into practice. The maintenance cost of aging assets is a major cost item, which can be significantly reduced by a more accurate prediction of their remaining lifetime. Most of the time, assets and asset components are replaced, repaired, or overhauled either through corrective maintenance or through scheduled maintenance. Corrective maintenance creates losses in service quality and revenue assurance, whereas scheduled maintenance does not optimize the replacement of future faulty parts. By combining human expertise with the data available from the transmitters spread across the asset network, it is possible to use that expertise to review and make recommendations based on data values and trends. When, for example, it is necessary to intervene in power lines, machine-to-machine communication makes it possible to reduce service breakdown to a minimum and, in some circumstances, if the network has redundancy, to keep delivering the service.
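To make the predictive maintenance idea concrete, here is a minimal sketch of the trend-based reasoning behind estimating remaining lifetime. The sensor values, threshold, and linear-degradation assumption are all hypothetical; real deployments use far richer statistical models:

```python
def remaining_lifetime(readings, failure_threshold):
    """Estimate how many more periods until a sensor reading crosses the
    failure threshold, assuming a roughly linear degradation trend.
    Returns None when no trend can be established."""
    if len(readings) < 2:
        return None
    # Average degradation per period over the observed window.
    trend = (readings[-1] - readings[0]) / (len(readings) - 1)
    if trend <= 0:
        return None  # no measurable degradation yet
    return (failure_threshold - readings[-1]) / trend

# Vibration readings from a hypothetical transformer sensor, one per week.
vibration = [2.0, 2.3, 2.5, 2.9, 3.1]
periods_left = remaining_lifetime(vibration, failure_threshold=5.0)
```

A scheduler could then plan the replacement a safe margin before `periods_left` runs out, instead of waiting for a breakdown (corrective maintenance) or swapping parts on a fixed calendar (scheduled maintenance).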
In airport management it is possible to deploy strategies toward a unified traveling experience. Based on mobile devices, the airport infrastructure, other intermodal transports like trains, weather conditions, and disruptions like strikes provide a new way to communicate with the passenger and organize the trip from the moment the flight is booked. This includes self-service check-in and boarding, baggage tracking, information about flight status and related travel modes (like commuting by car or train), terminal geolocation and guidance (like informing the passenger that it is time to move to security screening), cross-selling services like retail and car hire, and disruption management with the provision of ancillary services, like an unplanned hotel stay if the airplane has a technical problem and cannot take off.
In helping humans to search information and make better decisions
I have reflected in the past on the shift toward knowledge management (ACM is under the BPM umbrella) and on the need to adopt semantic web approaches when interpreting information, so I am not going to repeat myself. Looking back at when those articles were published (three years ago), in such a short period of time the machines have kept producing data like a bottle-filling line in exponential growth. IDC, which likes to measure how much data is produced, stated (also in 2011) in its report Extracting Value from Chaos that the data created had grown by a factor of 9 in five years. According to the study, 1 zettabyte was produced in 2010 and 1.8 zettabytes in 2011 (almost reaching the famous doubling factor of Moore's Law). Beyond the numbers, most of the data is unstructured, and organizations struggle to deal with such large quantities of data, in formats that are no longer transactional (video, photos, documents) and that do not reside only in organizational systems; part of that data is in the cloud, outside physical frontiers, contrary to what happened in the past, when, for example, retailers stored and processed all consumer transactions to estimate behavior, promote products, and cross-sell.
As IDC puts it:
“unlike our physical universe where matter is neither created nor destroyed, our digital universe is replete with bits of data that exist but for a moment — enough time for our eyes or ears to ingest the information before the bits evaporate into a nonexistent digital dump”.
Hence the challenge here is how to get access to the right data, transform it into information, and create knowledge (link to the post). As I also pointed out in the article I wrote for the book Empowering Knowledge Workers, current business dynamics impose other ways of performing and organizing operations. We are entering the era of the real-world enterprise, where it is virtually impossible to analyze a process manually, such is the diversity present. Hence other approaches are necessary, as execution is largely driven by the capacity to transform data into information and then into knowledge. Like self-service reporting in business intelligence and analytics tools, systems will start automating the finding, selection, and exploration of data relevant to the worker's job, taking the execution context into consideration.
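A toy sketch of that context-driven selection might look like the following. The documents, context terms, and bag-of-words scoring are made up for illustration; a real system would use semantic models rather than raw word overlap:

```python
def rank_by_context(documents, context_terms):
    """Rank documents by word overlap with the worker's execution context:
    a crude stand-in for automated, context-aware data selection."""
    def score(doc):
        words = set(doc.lower().split())
        return len(words & set(context_terms))
    return sorted(documents, key=score, reverse=True)

# Hypothetical corpus and execution context for a worker handling a complaint.
docs = [
    "quarterly sales report for retail segment",
    "maintenance log of transformer station 7",
    "customer complaint about delayed delivery",
]
context = {"customer", "complaint", "delivery"}
ranked = rank_by_context(docs, context)
```

The point of the sketch is the shift in responsibility: instead of the worker hunting for data, the system surfaces what matches the task at hand.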
Do we need granular Enterprise Architecture?
Yes, we do, if we want to transform the way bank operations are structured, as in the examples presented above: it is necessary to understand the impact when we invoke the web service that checks signatures while a customer is making formal investment decisions. But we do not need it if we want to create a change agenda, a re-engineering effort, or a paradigm shift.
The idea is: as a manager, how do you create the future? How do you look at the palette of available technologies and set up a vision of what you can do with it to change your business? Forget the boxes, the arrows, the dependencies, because none of that interacts with the environment outside your enterprise; it is already a map of how your processes, applications, and IT infrastructure are structured (in the past, for the present). It is not a reflection of what your enterprise's future can be. Worse than that, it blinds you to how to shape the future, because your current enterprise architecture map constrains the way you think about it; in other words, it filters the creativity you need to create the future.
Applying the Viable System Model approach, think in terms of the horizontal boundaries. Which stakeholders does your company serve? How are they served? Are they served at all?
- With your customers: what kind of technology are you providing to let your customers be at the center of operations? How do you let your customers go digital with a smartphone if they are not used to that kind of interface or cannot afford one? How do you kill paper and e-mail?
- With your employees: how do you assure interoperability of data between systems? How do you help your teams collaborate on their own? How do you help your workers find the right information and reason about it?
- With your assets: how do you allow your assets to communicate, and how do you use those communications to integrate them into your operations, inside and toward the outside world, serving customers and helping them make the best decisions? How do you break the silos between the outsourcing providers that in the past supported cost-cutting decisions but produced a hike in information clutter?
- How do you deploy algedonic channels? How do you avoid being forced out of a market because your medicines or your soda water posed health risks and you were the last to know, since your organization filtered the information across its organizational units?
Just do it.