How Adaptive Case Management can be deceiving

One year ago, the first edition of BPM Conference Portugal was up and running. Today we are one month away from the second edition. I was revisiting some of the key facts of the conference and asking whether anything dramatically changed since 2013. Cybernetics and Adaptation were two key themes then.

Last year, during a side discussion, there were arguments for and against the motion that Adaptive Case Management and knowledge workers would make the difference in achieving goals and competitive advantage, working from guidelines rather than a prescriptive way of working. In theory, I agree with the approach if the nature of operations is dynamic rather than highly structured (as pointed out in earlier blog posts). At the time, those against the motion argued that human nature can disrupt the guidelines (because people tend to think for themselves) and that, instead of a company of knowledge workers, companies would end up with a company of heroes.

At the beginning of this year, Patrick Lujan, after reading BPM a Year in Review 2013, made the very same argument on Twitter: if you have bad knowledge workers (and most of them are really bad), decentralized, goal-oriented management and smart, advanced technology will not make any difference.

 

Knowledge Workers Twitter stream with Patrick Lujan

 

During today's reflection, I remembered a part of Stafford Beer's book The Heart of Enterprise about the loss of human autonomy and how it hurts organizations. Beer wrote brilliantly about interviewing fictional managers who wanted to change their management style (towards an Adaptive Case Management orientation):

 

We hope that we are a modern and progressive management team. We have put ourselves through business schools. We have studied, and tried to understand, behavioral theories of management. We had much discussion of Theory X and Theory Y, and we have used consultants in personality testing, managerial grids, and so on.

As a result, we have abandoned autocratic methods. We have made it clear that we expect our operational elements to work autonomously. We just hope that they can do it … However, we have embarked on a very elaborate management development program for our people, and spent a lot on sophisticated recruitment techniques, so we have some confidence that all will be well.

As far as we board members are concerned, however, and to be perfectly blunt, there is something of an "identity crisis". What are we ourselves supposed to do? If we were to give rulings about things, that would be autocratic. So we have reduced ourselves to the role of advisors – benevolent, avuncular holders-of-hands.

That would be all right if anyone took the advice. They don’t seem to do that. They ask: is that an instruction? We say: no, of course not. So they promptly do something different.

 

Fiat Lux – the rise of the real time enterprise

Extreme connectivity is coming to the enterprise

This year at Process Mining Camp 2013, during a workshop I led, the attendees discussed the access, usage, transfer and reuse of knowledge. Part of the discussion was about IT development and implementation. Some said that not every bit of information flowing among the development team deserved attention. The question was not where the information was stored or how it was transmitted; what determined the importance of information was its relevance to the duties in the context of the project. This meant the development team was always filtering and analysing pieces of code, patterns and working solutions, seeking parallel developments, and retrieving information from linked projects autonomously to make impact analyses (something I also learned in a previous challenge, when the Master Project Officer told me that things like linked work packages and project dependencies simply do not exist as we conceive of them).

Real time enterprise

Two months ago I had a chat with a person who will be one of the speakers at the forthcoming edition of BPM Conference Portugal. He is responsible for everything related to customer support. That means his team is responsible for updating and changing IT to support constantly evolving business requirements, and particularly for monitoring multiple process instances such as complaints and connection requests. They rely on a divided brain to make it happen: enterprise systems to enter the addictive loop, and out-of-the-box tools like GitHub to share information among team members and make changes happen. Change in this context (Telecom) means you need to deliver new services every month and prevent customer churn in real time, without waiting for next month's outdated business intelligence reports. The challenge here is twofold: monitoring and intervening in operations, and supporting business change.

The externalization of knowledge contributes to knowledge diffusion

One month ago I was talking with a manager of a utilities company who also confirmed the need for operational online addiction. Regulation is tightening and the company does not want to throw money out of the window because they missed SLAs: providing an answer to a customer complaint, starting billing earlier, finding hidden bottlenecks in operations. At this company, some of the real-time inspiration came from Lean discussion forums and from the people responsible for monitoring energy distribution and the status of the infrastructure in real time using SCADA systems. These people are used to controlling energy flow, transmission line disruptions and maintenance operations in continuous mode. They "do not understand" why their colleagues do not embrace a similar attitude, and as such they played an important role in translating the abstract knowledge necessary for the always-on journey into a proper codification that could be used by their peers.

The role of Enterprise Architecture

Enterprise Architecture under a systems thinking approach can make a difference when designing the transformational step of entering real-time mode:

  • What are the horizontal barriers to be monitored? For each process domain being monitored, it is necessary to identify the stakeholders that touch the process, as they will be one of the main sources of variety. Note that the idea is not to figure out in the process design whether a particular stakeholder does something and point the measurement channels at those points; it is to identify the stakeholders and absorb the information flowing in the context of process execution (a huge difference).
  • What are the vertical barriers to be monitored? Which processes in the value chain are related, so that a truly end-to-end vision must be set up? For example, in a utilities environment, shouldn't a Complaint Handling process be connected with Meter Reading, with Billing and with Churn Management?
  • The algedonic channel – defined in Cybernetics as the channel whose objective is to transmit alert signals concerning any event or circumstance that could put the organization in a critical situation: failure to deliver services, a hike in customer churn, a flop in revenue or sales, and so on. This aspect is very much neglected by managers, because they rely on the organizational structure to communicate alerts and supervening facts, and sometimes it is too late to intervene.
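As a minimal sketch of that last idea, an algedonic channel can be implemented as a monitor that checks every metric reading against a critical threshold on arrival and raises the alert immediately, instead of letting it travel up the hierarchy. The metric names and limits below are hypothetical:

```python
# Minimal sketch of an algedonic channel: readings are checked against
# critical thresholds as they arrive; breaches are pushed straight to
# top management instead of travelling up the reporting hierarchy.
from dataclasses import dataclass, field
from typing import List

@dataclass
class AlgedonicChannel:
    # metric name -> predicate returning True when the reading is critical
    thresholds: dict
    alerts: List[str] = field(default_factory=list)

    def observe(self, metric: str, value: float) -> None:
        check = self.thresholds.get(metric)
        if check and check(value):
            # bypass the organizational structure: alert immediately
            self.alerts.append(f"ALERT {metric}={value}")

channel = AlgedonicChannel(thresholds={
    "customer_churn_rate": lambda v: v > 0.05,   # hypothetical limits
    "sla_breaches_per_day": lambda v: v > 10,
})
channel.observe("customer_churn_rate", 0.03)  # normal, no alert
channel.observe("customer_churn_rate", 0.08)  # critical, alert raised
print(channel.alerts)
```

The point of the design is that the alert does not depend on anyone in the middle layers deciding to forward it.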

We are on the brink of another major change

There has been a lot of discussion since last year about the concept of the "intelligent BPMS". For me, intelligent means the system is able to "think" and "reason" by itself, without falling into the cognitive illusion abyss (which leads to the question: what should the system "think" about?). Hence, there is no question that enterprises that want to be customer-centric, anticipate errors and predict churn must equip themselves with technology to monitor the business continuously and in real time (like in the movie The Matrix, when people watched the code trickle down the screens); whether that is "intelligent" or not, I leave to the analysts. More important than that, companies need highly skilled people, team workers used to working under agile principles, to make it happen. Without that, it is difficult to make the change. Are you, as a manager, up to the job?

Does the combined Case Management Model & Notation fit its purpose?

I participated in an OMG meeting two weeks ago in Berlin, where during a side conversation I talked with some peers about the new CMMN, designed to model and execute non-prescriptive, standard, "bpm", whatever-you-want-to-call-them process types.

I skimmed through the beta release and did not find anything extraordinary that BPMN could not do to model a Case Management approach; by the look and feel, I would say that CMMN is a subset of BPMN.

People from the CMMN committee said the difference is in how the language is executed, since it is based on stage transitions.

For me that was a surprise, because stage transition is what "BPM" (structured) processes are all about: the process moves, or changes stage, when activities reach an end state. ACM, or even Case Management, is not about stages; it is about the availability of data. These are user- and data-driven processes, and as such the approach is much more object oriented, with data defining the path the process takes.

So is CMMN missing the target, actually being a BPMN subset with a different name, or is it something different?

CMMN – FTF Beta 1 – Expanded Stage with Expanded Planning Table and Expanded Human Task Planning Table

My argument is that X Management (Case Management, Adaptive Case Management, Purpose Case Management, Production Case Management …) is object oriented, not task oriented. Actually, we can handle a case with no tasks at all; it is possible to combine multiple approaches to do it: activity streams, documents, etc. The difference is that it is data, and data availability, transformed into information, that drives the case. Most of the data comes from sources outside the form of the case (if a form even exists).

For example, if you are analyzing a complaint and you get the contract to understand the conditions that were set up and the penalty clauses, or an opinion from the Legal Department about how the complaint should be handled given the context and the contract that was signed, this is what helps the people involved in the case to steer its direction.

The big difference, as I see it, is that X Management is an object-aware approach. The overall process (case) is structured around the object types involved and the outcomes of their manipulation (goals, tasks, documents, attributes, etc.), and may refer to other object types or be referenced by them.
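A small sketch can make the contrast concrete. In the data-driven view, what can happen next is derived from which objects are attached to the case, not from a predefined task sequence; the object types and actions below are illustrative, not part of any standard:

```python
# Sketch of a data-driven case: the case advances when the data it needs
# becomes available, not when a predefined task sequence completes.
# Object and action names here are illustrative.
class Case:
    def __init__(self):
        self.objects = {}  # object type -> payload (contract, legal opinion, ...)

    def attach(self, object_type, payload):
        self.objects[object_type] = payload

    def available_actions(self):
        # What can be done is derived from data availability.
        actions = []
        if "contract" in self.objects:
            actions.append("review penalty clauses")
        if "contract" in self.objects and "legal_opinion" in self.objects:
            actions.append("decide on complaint")
        return actions

case = Case()
print(case.available_actions())            # [] -> nothing to steer yet
case.attach("contract", {"penalty": "2%"})
case.attach("legal_opinion", "settle")
print(case.available_actions())            # both actions now available
```

There is no stage transition anywhere: attaching the legal opinion is what unlocks the decision, which is exactly the object-aware behaviour described above.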

It is not my intention to start a "holy war" against CMMN; I am just saying that it looks stuck in the middle of the bridge. That does not mean it cannot evolve and cross the bridge in the future.

I am arguing that, as a design principle, basing the language on stage transitions, like prescribed/structured processes, does not look right to me, because X Management is all about processing the data that helps decide the path.

Process Mining Camp 2013 – Expedition on Social Mining

I hosted a workshop at Process Mining Camp 2013 about Social Mining. Here are the results of the discussion with my peers and fellow miners.

Kick-start to the workshop

We have been basing our way of working on the increased processing capacity of information systems, which has created the illusion that the world was more stable, predictable and standardized.

However, the pace of change in the economy has been accelerating, fuelled by a nexus of converging forces (social, mobile, cloud and information) that is building upon and transforming user behaviour while creating new business opportunities, letting people do extraordinary things and automating repetitive tasks and decision making at large. This implies that our vision of the future has to change.

Any system, any process, must be able to handle the complexity of its elements and be active and adaptive to survive. This implies that any attempt to limit the existing variety will cause the system, the process, the organization to lose the ability to adapt. This is why business processes are no longer normalized and standardized, and are getting more difficult to analyze.

For sure there are research methods to tackle this kind of challenge, for example Simplifying Discovered Process Models by Dirk Fahland and Wil M.P. van der Aalst, but the thing is that variation and complexity cannot be predicted, and such methods work only in predefined or controlled settings, whereas organizations live in a world where interdependence, self-organization and emergence drive agility, adaptability and flexibility.

It is a networked world, built on the design of collaborative, networked organizations.

These networked configurations underlie the composition of complex systems, from cells to society and enterprises (associations of individuals, technology and products). In those complex systems, the characteristics of emergence, order and self-organization develop a set of interdependent network actions not visible in the individual parts. This is why methods defined to analyze a domain fail if the domain and its parts change, which is what happens most of the time, since we live in a world of variety.

The facts that are changing everything

There is a handful of facts changing the way we work; basically, there are two domains putting huge pressure on enterprises.

The technology factor

As communication costs drop and speeds increase, cost will no longer be a consideration in many parts of the world. As the cost of communication drops, the shift will be towards applications. Combined with increased computing capacity and speed, we will be able to engage with, and have access to, information in real time. Cloud will free organisations from fixed and limited availability and processing power. The way we are used to working will dramatically change.

The social factor

On the social factor: in leading GDP countries we are facing a displacement of "assembly line" people by aspiring ones, because work can be transferred to those who can do the same thing for less than half the cost. This shift occurs in industry sectors from manufacturing to services. But in the near future tiny tasks will be fully automated, and unfortunately those brave workers will be obliterated unless there are new work opportunities, or chances to execute more complex work. People will have to adapt and start pushing their capabilities to new boundaries.

This shift also has a profound implication for the type of people companies source in the labor market. As leading companies expand and operations are outsourced or transferred to low-wage economies, the future worker profile will be aimed at highly skilled persons capable of embracing business dynamics.

The convergence of three important process dimensions

The complexity we are living with implies that we have to look at, and align, dimensions we were not used to looking at before in order to tackle the factors that are transforming the way processes are executed. The control-flow perspective does not provide much insight, because no two instances are alike and because, under social collaboration paradigms, the process is the conversation or the interaction, and there are infinite ways to do that. The time perspective is important and will continue to be, but it is definitely not the best way to understand behaviour.

In fact, today we have immense analytical capabilities, but how do we understand a fundamental challenge for organizations: how people socialize? How do they work? Does the configuration make sense? Is it too centralized, always depending on the same person and the same organizational units, or is it open, so anyone can be invited to join? Is the type of knowledge applied abstract, i.e. can people apply recurrent solutions to daily problems in a multitude of situations, or do they only apply customized solutions (concrete knowledge)? Is knowledge reused? Does information flow naturally, or are processes so structured and best-practice oriented that they are turning organizations into fragile systems, unable to change, react to unpredictable facts and adapt?

This was the background of the workshop.

The quest

Our society is constructed around flows. This construction also applies inside organizations and among their stakeholders. This is what we are made of.

Flows are the sequences of interaction between physically disjointed positions held by social actors who belong to a particular social network.

Dominant social structures are those arrangements of organizations whose internal logic plays a strategic role in shaping social practices.

Hence the trick is to be able to align the network structure to the process type being executed, and to evolve the network type according to circumstances. In other words, you need to introduce and maintain an adaptive social approach. But that is not enough. You can have the best social network configuration while knowledge is poorly used, or you let people run free when they were supposed to reuse solutions over and over again.

Social dimension – social networks configuration

Once the process is transformed into something that is the conversation, we need to understand how people engage. In other words, what is the network configuration? It is somewhat accepted that network patterns can indicate the way people work and share information.

As a reference on social network patterns and social network discovery techniques, you can learn about them in this post.

Challenge #1
Centrality is used to measure degree distribution. But all measures are wrong and some are useful.
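As an illustration of how such a measure is computed (the actors and handover edges below are invented, and the network is treated as undirected), normalized degree centrality can be derived from an interaction log in a few lines:

```python
# Sketch: normalized degree centrality from a who-interacts-with-whom log.
# The actors and handover edges below are invented for illustration.
interactions = [  # (sender, receiver) pairs mined from messages/handovers
    ("ana", "bob"), ("ana", "carl"), ("bob", "ana"),
    ("carl", "ana"), ("dave", "ana"),
]

# Treat the network as undirected: collect each actor's distinct neighbours.
neighbours = {}
for a, b in interactions:
    neighbours.setdefault(a, set()).add(b)
    neighbours.setdefault(b, set()).add(a)

n = len(neighbours)  # number of actors in the network
# Normalized degree centrality: share of the n-1 possible links an actor holds.
centrality = {actor: len(nbrs) / (n - 1) for actor, nbrs in neighbours.items()}
print(max(centrality, key=centrality.get))  # "ana" is the hub here
```

A score close to 1.0 for one actor and low scores for everyone else is exactly the over-centralized, "same person every time" configuration discussed above.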

The discussion resulted in the following:

Information (logs) about social interaction, spread across e-mail and social tools like activity streams, messaging and video chat, that could help discover how socialization occurs is difficult to obtain because:

  • The effort to obtain this information can be infinite, because it is recorded across multiple platforms and most of the records do not share a common key;
  • Some information is inaccessible if it is recorded inside systems whose administration the interested company is not responsible for (even if it administrates them indirectly);

Privacy concerns. There is a clear division in how information is considered private across different parts of the globe. For example, in most European countries, at large, data like e-mail stored on employer devices is still personal, even if it is corporate e-mail. This challenge is amplified if data from a corporate source is stored in personal e-mail or on personal devices.

Building the complete log can be overwhelming if social interaction is spread across multiple systems. Without entering into technical details, it is much more difficult than joining different database tables.
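A toy sketch shows the easy half of the problem: records from several systems can at least be normalized and merged into one timeline (the sources and field names below are invented). The hard half, which the sketch deliberately does not solve, is correlating records that share no common case key:

```python
# Sketch: building one social-interaction timeline out of several systems.
# Sources and field names are illustrative; note there is no shared case id,
# which is precisely what makes real log construction hard.
from datetime import datetime

email_log = [
    {"ts": "2013-06-01T09:00", "actor": "ana", "event": "mail to bob"},
]
chat_log = [
    {"ts": "2013-06-01T09:05", "actor": "bob", "event": "reply in chat"},
]
stream_log = [
    {"ts": "2013-06-01T08:55", "actor": "ana", "event": "posted update"},
]

def merge(*logs):
    # Normalize timestamps and sort everything into a single timeline.
    merged = [dict(r, ts=datetime.fromisoformat(r["ts"]))
              for log in logs for r in log]
    return sorted(merged, key=lambda r: r["ts"])

timeline = merge(email_log, chat_log, stream_log)
print([r["event"] for r in timeline])
```

Even with this merged timeline, deciding that the chat reply belongs to the same conversation as the e-mail still requires a heuristic (actor plus time proximity, for instance), which is where the real effort goes.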

It would be more valuable if the social dimension could be embedded in the control flow, rather than analyzed separately. If the process is the flow and the process is social, the visualization should be integrated. I consider this point key for developers.

Knowledge types – what types of knowledge exist and how they are applied

The healthcare industry has always been characterized by the involvement of multiple professionals in the diagnosis and treatment of patients, where information sharing plays a key role. Health professionals (as well as professionals from other industries) tend to work around problems, addressing the immediate needs of patients rather than resolving ambiguities. As a result, people face "the same issue, every day, indefinitely," which results in inefficiency. In other words, people like to design the same solutions over and over. How can you overcome this challenge, and what can be done so that knowledge use can become more abstract and knowledge itself can evolve within the organization?

Knowledge consumption should be aligned with the type of process design. For example, a repetitive task is usually automated, turning into explicit knowledge use, documented and understood by all. There is often a temptation to simplify the existing complexity, automating and standardizing how to proceed to the point of "crystallizing" only a small part of the information that people have to process, making it difficult to cope with changing conditions of execution and thus leaving no room for the tacit dimension.

Knowledge, then, does not come in just two flavors (explicit or tacit); it is more than that.

Challenge #2
How to discover and measure knowledge type?

There can be different knowledge types across parts of the process, and measuring them is not automatic.

The discussion resulted in the following:

People would like to spot the indispensables, the ones that make the difference when a solution is built. That could be measured by how many people in the company "like", use or apply the knowledge that was created.

Many think that the problem with knowledge discovery and usage is related to the tools used to store and share it (portals, wikis and the like). Some examples were provided in an IT context: a patch or a pattern sent around the development team was considered handy, because everyone was involved in working on the solution and as such the knowledge got codified; big knowledge repositories, however, are not considered useful.

The lynchpins, the indispensables, don't like to codify their knowledge, because it makes them … dispensable (I tend to agree, but there are generations that live under the sharing paradigm and make others contribute to the company's success).

An interesting side comment was made:

Automation of knowledge finding is highly requested. Even with a Swiss army of systems to manage knowledge, it is hard to find.

Discovering process types

Processes no longer come in a single flavor. Today it is possible to find a very predefined type, but also a blend of every type available across multiple process instances.

Today processes are blended. You can handle a claim with a customer in a loose manner, and in the end pay a compensation using a by-the-book, best-practice, "every day the same thing" procedure.

Challenge #3

How to understand what process type we are looking at?

The structured ones are easy to find, but ad-hoc and adaptive ones pose extra challenges, particularly when parts are blended with structured ones.

From the discussion resulted that:

More important than having super algorithms to spot patterns and discover process types is, at this point in time, having access to the recorded data to actually let people think.

BPM Conference Portugal 2013

The first edition of the conference was held last week, on April 18th, and brought a blend of different viewpoints on the most advanced and innovative themes in BPM, with a practical rather than purely conceptual approach.

For the first time I was involved as chair of the event. The main difference between being invited to participate and managing the conference (the sessions, the agenda, the themes) is that when you deliver your talk you strive to do your best and inspire others, while when you are the chair you are responsible for all the speakers, which ultimately constitutes a very different kind of challenge. Again, I would like to thank everyone who made this possible: the event organizers, the speakers, the attendees. I retain the idea that the event stands up to others around the world (taking into consideration the size of the market) and is much more forward-thinking (I fight for that), and I hope that future editions will have more sessions around the HOW TO DO IT, something that attendees expressed in a handful of informal talks I had. They are not looking for workshops, but for sessions that explain how the result was achieved.

If you think you can make a difference in the 2014 edition, send me an e-mail and I will be glad to enroll your presentation proposal.

The themes of the conference:
The topics of BPM Conference Portugal were: Cybernetics, the ability to deal with diversity; Adaptation, how companies sense, innovate and change the way operations are performed; and Socialization, how managers can change the way people engage outside the organization charts and use other approaches to achieve the intended results.

The goal of the event was to provide new perspectives on the challenges companies face, new methods to overcome them, and to show in practice, in real life, how to achieve competitive advantage.

I opened the conference with a very condensed pitch around the conference themes, summarized below:

The baseline of the conference is the fact that the company environment has not changed: it continues to evolve, but faster.

  • The pace of change in the economy has been accelerating, fueled by ubiquitous access to information and by enterprise systems that allow the way work is done to change. Predicting what will happen next is exponentially more difficult. Uncertainty has become an enduring variable, as companies have noticed lately. This implies constant change, in other words adaptation.

To perceive is to understand patterns.

  • It is a fact that today companies have immense analytical capabilities, but for managers to understand a fundamental challenge for organizations, which is to deal with all this interaction variety, it is necessary to understand patterns. Understanding patterns is not predicting behavior, but inferring trends, so people can think, act and adapt.
  • Organizations that manage to better align three perspectives (social network, knowledge type and process design) are those that will be ahead in terms of execution capabilities, flexibility and adaptation to change.

The role of human resources development.

  • Without retaining and nurturing highly skilled workers, knowledge cannot be applied effectively.
  • In the current context, organizations need all kinds of knowledge coming from all organizational and business units. Organizations need to use all styles, because they never know in advance which people they will need to solve a problem, taking into account the uncertain times we are facing.
  • People are deeply knowledgeable about the organization's rules and apply them in the work they do because systems are imbued with the logic and interoperability required for execution. Not only does the type of technology have to be different, which often involves changes in the technology architecture, but people must be enabled directly in the design and execution of business processes.

The conference sessions:

José Tribolet: Adding value to BPM by enforcing the fundamental principles of Enterprise Engineering

Professor Tribolet is a disciple of Dietz's Enterprise Ontology method, and he and his team are applying it in government agencies. The case presented was around handling judicial procedures, where it was possible to identify failures occurring in the acts related to process execution, with an impact on delays, complaints and superseded judicial decisions.

DEMO (Dietz's method) is somewhat misunderstood in the community because it is difficult to understand (heavily based on computational science and three axioms: social agreements, content of communication, means of communication) and difficult to apply (many conditions must hold, such as being able to trace process actions recorded by enterprise systems), but it is effective if you want to evaluate the consistency and completeness of your process models at run time.

Business transactions specify the pattern-based behavior that describes how actors collaborate in order to achieve business results. The method takes as input a process model that is converted to a transactional model. The transactional model is then analyzed and revised so that all transactions comply with the Ψ-theory axioms. Finally, the original BPMN process model is revised to become consistent with the transactional model and complete in the sense it expresses all transactional steps.

As a result it is possible to:

Identify consistency issues:

  • Activity sequencing (control flow) violates the transaction pattern.
  • Data flow violates the transaction pattern.

Identify completeness issues:

  • Behavior of an activity cannot be classified as a coordination or production act.
  • Coordination or production acts cannot be mapped to any activity (i.e. the act is either implicit or missing on the process model).
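The completeness check, in particular, is mechanical once the mapping exists. The sketch below (all activity names, act types and data structures are invented for illustration, not taken from DEMO tooling) flags both kinds of issue: activities that could not be classified as coordination or production acts, and transaction acts with no corresponding activity:

```python
# Sketch of the completeness checks: every activity must be classifiable
# as a coordination or production act, and every act of the transaction
# pattern must map to some activity. All names below are illustrative.
activities_to_acts = {        # activity -> act type, as classified by an analyst
    "request review": "coordination",
    "perform review": "production",
    "file result": None,      # could not be classified -> completeness issue
}

# The standard transaction pattern steps and the acts found in the model.
transaction_acts = {"request", "promise", "execute", "state", "accept"}
acts_covered = {"request", "execute", "state"}

unclassified = [a for a, kind in activities_to_acts.items() if kind is None]
missing_acts = transaction_acts - acts_covered  # implicit or missing acts

print("unclassified activities:", unclassified)
print("acts without an activity:", sorted(missing_acts))
```

Each reported item corresponds to one of the completeness issues listed above, which is what would then drive the revision of the original BPMN model.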

Keith Swenson: Planning and Supporting Innovative Work Patterns

Keith split his presentation into two parts: the concept of anti-fragile systems, and adaptive case management.

Most of the talk was about anti-fragility, a concept raised in Nassim Taleb's book Antifragile: Things That Gain From Disorder. Without being repetitive, you can find most of Keith's key points in his own words in this post. I would add, from a different perspective and revisiting Ashby's law, that any system, any process, must be able to handle the complexity of the elements that constitute it in an active and adaptive way in order to survive and thrive. This implies that any attempt to limit the existing variety will cause the system, the process, the organization to lose the ability to adapt, leading to implosion (in Keith's words, turning fragile). This idea was also presented by Vitor Santos when he argued in his talk around the concept of enterprise elasticity (which I will come back to at the end of this post).

Without changing the objectives of Keith's presentation, there is a concept that I think (but I might be wrong) managers do not yet understand when dealing with enterprise systems deployment. Most are worried about the function, the features and the business support, and forget system engineering concepts: how the system was conceived (engineered) to evolve and adapt to changing conditions (probably something to revisit in next edition's sessions). I do not mean that the system itself will have such a character (by the way, those who say IT can behave like a complex adaptive system are in science fiction mode, because one thing is the system behaving like that, another is the patterns that emerge as humans act on systems), but systems should be engineered with that objective, so that they can be evolved taking the enterprise ecosystem into consideration rather than substituted.

Regarding adaptive case management, there were some key ideas I would like to stress: the future is more about providing guidelines that show people where to go but do not prevent deviations when they are necessary, rather than enforcement, where people fight against the process design. Still, the idea that knowledge workers know what to do, because they understand the business model and the business rules and apply their knowledge to build solutions to business problems, was refuted by Tribolet. In his words, sometimes knowledge workers do whatever the hell they want and enter into contradiction with company objectives. Hence the idea that knowledge workers know best how to achieve the goals sometimes does not apply, and the business suffers. It is a matter of human behavior. This is something we should reflect on.

Denis Gagné: Business Process Simulation: How to get value out of it

Those familiar with Denis's style already know that his sessions are very practically oriented. Denis talked about the reappearance (some argue it never disappeared) and the importance of simulation.

In the past, simulation was seen as an evil tool that did not deliver value because process models were incomplete and the data used in simulation was inappropriate (mostly because it was not even close to reality). There are some seminal reflections on simulation by Process Mining godfather Wil van der Aalst, where he argues that any attempt to simulate will be an incomplete exercise and will lead managers to make wrong decisions; but, as I envisioned before (something that for sure Gartner and Forrester will bring into their intelligent BPM assessment reports), Process Mining and Simulation are poised to merge. This is because today most of what we do is recorded by enterprise systems, and it is possible to construct real-world models and use real-world data to help build scenarios and make decisions about future directions.
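The merge can be sketched in miniature: instead of guessing activity durations, a simulation resamples durations mined from the event log. The three-activity process and the toy log below are invented for illustration:

```python
# Sketch: feeding simulation with mined data. Per-activity durations are
# taken from a (toy) event log and resampled to estimate end-to-end cycle
# time. The log and the three-step process are illustrative.
import random
import statistics

# Durations in hours, grouped by activity, as a process-mining tool
# could extract them from real event data.
mined_durations = {
    "register": [1.0, 1.5, 0.8, 1.2],
    "assess":   [4.0, 6.5, 5.0, 7.2],
    "pay":      [0.5, 0.4, 0.6, 0.5],
}

def simulate_cycle_time(runs=10_000, seed=42):
    rng = random.Random(seed)
    totals = []
    for _ in range(runs):
        # One simulated instance: sample each activity's duration
        # from the observed (mined) values instead of guessing.
        totals.append(sum(rng.choice(mined_durations[a])
                          for a in ("register", "assess", "pay")))
    return statistics.mean(totals)

print(f"expected cycle time: {simulate_cycle_time():.1f} hours")
```

The same idea scales up: replace the toy lists with distributions fitted to a real log, and replace the fixed sequence with the control flow discovered by mining, and the simulation inputs stop being guesses.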

Back again to Denis's presentation, one of the key points was raising awareness of the difference between process improvement, which can be done using a myriad of approaches, and business process management, the management philosophy (not project-based; a continuous-improvement culture and process-based management).

Regarding simulation, he stressed the capacity aspects of a process model, usually dynamic analysis (using discrete simulation methods). Finally he talked about BPSim, the standard that allows simulation data to be embedded in process models in an interoperable way, providing for pre-execution and post-execution optimization. As I said in the conference closing session, simulation is sexy again, and it's a way to explore how to improve process redesign in an era where all the data you need is already available inside your enterprise systems, unlike some years ago, when you undertook cumbersome studies that drove your decisions in the wrong direction.

Ivo Velitchkov – Reasoning with Taskless BPMN

For me this was the most innovative presentation of all, because it challenged the current state of BPMN process modeling. BPMN is difficult to learn (but once learned, believe me, it can produce rich process models), it has an endless symbol palette, and modeling by itself can lead either to excessively fine detail or to high-level approaches that do not tell the complete story; in other words, it produces incomplete models. Hence Ivo presented a new approach, based on removing the tasks from the process map (taskless).

He defended the idea that taskless model diagrams, based only on process state transitions, conditional events and process rules, can produce easily understandable process models. In his words, tasks try to restrict what should be done at run time to what is known at design time. I see great potential in his ideas for translating business models into high-level IT requirements, substituting for state-transition diagrams.
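A taskless model of that kind can be sketched in a few lines; the states and rules below are my own illustrative example, not Ivo's. The model records only which state transitions are legal; how a knowledge worker gets a case from one state to the next is deliberately left open at run time.

```python
# Illustrative taskless process model: allowed state transitions only.
# No tasks are prescribed; the model constrains outcomes, not activities.
TRANSITIONS = {
    "Submitted":    {"Under Review"},
    "Under Review": {"Approved", "Rejected", "Submitted"},  # may be sent back
    "Approved":     set(),   # terminal state
    "Rejected":     set(),   # terminal state
}

def can_move(current: str, target: str) -> bool:
    """True if the process rules allow moving the case to `target`."""
    return target in TRANSITIONS.get(current, set())

print(can_move("Submitted", "Under Review"))  # True
print(can_move("Submitted", "Approved"))      # False: review cannot be skipped
```

This captures the spirit of the idea: design time fixes the boundaries, run time keeps the freedom.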

Tom Graves – Serving the story: how BPM and EA work together in the enterprise

Enterprise Architecture is finally being understood as something valuable that goes beyond creating boxes like collecting trading cards or tokens. Today, when you carry out a process improvement initiative, you realize you can no longer "automate" something in isolation: the pace of process change touches other processes, systems and people, and if you don't have the broader picture, aligning business model, value chain, organization and IT all together, the risk that your transformation project will fail is high.

Tom brought a different perspective on how to do EA right: getting people to talk to each other, so that each of the enterprise architecture layers (business model, value chain, organization, IT) contributes its perspective to the project.

He stressed that enterprise architecture is not only about IT, as in the TOGAF framework, which leaves no room for the people who are part of the enterprise and who, unlike the silicon servers, know what the enterprise is all about. In his words, include the people-story, otherwise EA will be incomplete.

Michael Poulin – Business Processes in a Service-Oriented Enterprise

Michael walked mostly through a set of principles of the Service-Oriented Enterprise, but I will highlight the concept Michael created around Purpose Case Management. Conceptually, Purpose Case Management blends ACM and BPM (in this context BPM means structured process, not the management philosophy), and it can drive smooth transitions between unstructured and structured actions across ACM/BPM, independently of the approach.

Robert M. Shapiro – Visual Analytics and Smart Tools

Robert's talk was also on simulation, focused on using data from executing processes to get an understanding of what is happening, what the problems are, and where you should look to make improvements.

He walked through, from a very practical perspective, how to combine the process model with simulation: adding data to the model to capture the behavior of the process, analyzing the different dimensions of the simulation result (time, cost, resources), and optimizing by comparing different improvement scenarios. He also presented a method for calculating the Return on Investment of things like spending money on training the people performing a task in the process versus automating the task in IT, producing benefit figures that can be used in process deployment and helping managers decide before the rubber hits the road. This was very new to me.
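The training-versus-automation comparison boils down to a simple ROI calculation. The sketch below is my own back-of-envelope version, in the spirit of Robert's method; all figures are assumptions invented for illustration, not numbers from his presentation.

```python
def roi(benefit: float, cost: float) -> float:
    """Simple ROI: net gain divided by cost."""
    return (benefit - cost) / cost

# Assumed figures for illustration only: annual benefit of faster
# handling vs the one-off cost of each option.
training   = roi(benefit=60_000, cost=20_000)  # train people performing the task
automation = roi(benefit=90_000, cost=50_000)  # automate the task in IT

print(f"training ROI:   {training:.0%}")    # 200%
print(f"automation ROI: {automation:.0%}")  # 80%
```

With these made-up numbers, automation yields the larger absolute benefit but training the better return per unit spent; that kind of comparison is exactly the decision support Robert was pointing at.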

Vitor Santos: Organizational elasticity with BPM

Vitor tried to demystify the approaches to building IT systems. He talked about the engineering approach, which tries to take a holistic view of the enterprise, and pointed out the concept of IT adaptability (elasticity, in his words), built on the concept of viable systems: it avoids climbing maintenance costs and replacing IT from time to time, in favor of a bigger upfront investment that will serve the business for a longer period.

 

In the next couple of weeks, videos from the sessions will be available. If you are interested, take a peek at the conference website.

Interested in a different view about the conference? Here is Tom’s view.