Process Mining Camp 2013 – Expedition on Social Mining

I hosted a workshop at the Process Mining Camp 2013 about Social Mining. Here are the results of the discussion with my peers and fellow miners.

Kick-start to the workshop

Our way of working has been supported by the increased processing capacity of information systems, which created the illusion that the world was stable, predictable and standardized.

However, the pace of change in the economy keeps accelerating, fuelled by a nexus of converging forces (social, mobile, cloud and information) that builds upon and transforms user behaviour while creating new business opportunities, letting people do extraordinary things and automating repetitive tasks and decision making at large. This implies that our vision of the future has to change.

Any system, any process must be able to handle the complexity of its elements and be active and adaptive to survive. This implies that any attempt to limit the existing variety will cause the system, the process, the organization to lose the ability to adapt. This is the reason business processes are no longer normalized and standardized, and are getting more difficult to analyze.

For sure there are research methods to tackle this kind of challenge; one example is Simplifying Discovered Process Models by Dirk Fahland and Wil M.P. van der Aalst. The thing is that variation and complexity cannot be predicted, and such methods only work in predefined or controlled settings, while organizations live in a world where interdependence, self-organization and emergence demand agility, adaptability and flexibility.

It is a networked world, expressed in the design of collaborative, networked organizations.

These networked configurations come together in the composition of complex systems, from cells to societies and enterprises (associations of individuals, technology and products). In such complex systems, the characteristics of emergence, order and self-organization develop a set of interdependent network actions that are not visible in the individual parts. This is why methods defined to analyze a domain fail when the domain and its parts change, which is what occurs most of the time, since we live in a world of variety.

The facts that are changing everything

There is a handful of facts that are changing the way we work; basically, there are two domains that are putting huge pressure on enterprises.

The technology factor

As communication costs drop and speeds increase, cost will no longer be a consideration in many parts of the world. As the cost of communication drops, the shift will be towards applications. Combined with increased computer capacity and speed, we will be able to engage with, and have access to, information in real time. The cloud will free organisations from fixed and limited availability and processing power. The way we are used to working will dramatically change.

The social factor

On the social factor, in leading GDP countries we are facing a displacement of “assembly line” people by aspiring ones, because work can be transferred to those who can do the same thing for less than half the cost. This shift occurs in industry sectors from manufacturing to services. But in the near future tiny tasks will be fully automated, and unfortunately those brave workers will be obliterated, unless there are new work opportunities or chances to execute more complex work. People will have to adapt and start pushing their capabilities to new boundaries.

This shift also has a profound implication for the type of people companies are sourcing in the labor market. As leading companies expand and operations are outsourced or transferred to low-wage economies, the future worker profile will be aimed at highly skilled people capable of embracing business dynamics.

The convergence of three important process dimensions

The complexity we are living with implies that we have to look at, and align, dimensions we were not used to looking at before in order to tackle the factors that are transforming the way processes are executed. The control-flow perspective does not provide much insight, because no two instances are alike and because, under social collaboration paradigms, the process is the conversation or the interaction, and there are infinite ways to do that. The time perspective is important and will continue to be, but it is definitely not the best way to understand behaviour.

In fact, today we have immense analytical capabilities, but how do we understand a fundamental challenge for organizations: how people socialize? How do they work? Does the configuration make sense? Is it too centralized, always depending on the same person and the same organizational units, or is it open, so that anyone can be invited to join? Is the knowledge applied abstract, i.e. can people apply recurrent solutions to daily problems in a multitude of situations, or do they only apply customized solutions (concrete knowledge)? Is knowledge reused? Does information flow naturally, or are processes so structured and best-practice oriented that they are turning organizations into fragile systems, unable to change, react to unpredictable facts and adapt?

This was the background of the workshop.

The quest

Our society is constructed around flows. This construction is also applied inside organizations and among its stakeholders. This is what we are made of.

Flows are the sequences of interactions between physically disjointed positions held by social actors that belong to a particular social network.

Dominant social structures are those arrangements of organizations whose internal logic plays a strategic role in shaping social practices.

Hence the trick is to align the network structure with the process type being executed and to evolve the network type according to circumstances. In other words, you need to introduce and maintain an adaptive social approach. But that is not enough: you can have the best social network configuration while knowledge is poorly used, or you can set people free when they are supposed to reuse solutions over and over again.

Social dimension – social networks configuration

Once the process is transformed into the conversation, we need to understand how people engage; in other words, what the network configuration is. It is somewhat accepted that network patterns can indicate the way people work and share information.
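As a minimal sketch of how such a network configuration could be discovered from recorded data (the event log layout, names and values below are my own illustration, not from any specific tool), a classic process mining technique builds a “handover of work” network by counting how often work passes from one person to the next within the same case:

```python
from collections import Counter, defaultdict

# Hypothetical event log: (case id, activity, performer), ordered by time.
event_log = [
    ("case1", "register claim", "Anna"),
    ("case1", "assess claim", "Ben"),
    ("case1", "pay compensation", "Clara"),
    ("case2", "register claim", "Anna"),
    ("case2", "assess claim", "Ben"),
    ("case2", "reject claim", "Ben"),
]

def handover_network(log):
    """Count how often work is handed over from one performer to the next
    within the same case (a basic 'handover of work' social network)."""
    by_case = defaultdict(list)
    for case, _activity, performer in log:
        by_case[case].append(performer)
    handovers = Counter()
    for performers in by_case.values():
        for a, b in zip(performers, performers[1:]):
            if a != b:  # ignore steps a person hands over to themselves
                handovers[(a, b)] += 1
    return handovers

print(handover_network(event_log))
# Counter({('Anna', 'Ben'): 2, ('Ben', 'Clara'): 1})
```

Real logs are, of course, far messier than this tidy list, which is precisely the point of the challenge discussed below.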

As a reference on social network patterns and social network discovery techniques, see this post.

Challenge #1
Centrality is used to measure degree distribution. But all measures are wrong and some are useful.

The discussion resulted in the following:

Information (logs) about social interaction spreads across e-mail and social tools like activity streams, messaging and video chat. Such logs can help to discover the way socialization occurs, but they are difficult to obtain, because:

  • The effort to obtain this information can be infinite, because it is recorded across multiple platforms and most of the records do not share a common key;
  • Some information is inaccessible if it is recorded inside systems whose administration the company (the entity interested in understanding what is happening) is not responsible for, even if it is administrated indirectly;

Privacy concerns. There is a clear division across different parts of the globe in how information is considered private. For example, in most European countries, data like e-mail stored on the employer’s devices is, at large, still personal, even if it is corporate e-mail. This challenge is amplified if data from a corporate source is stored in personal e-mail or on personal devices.

Building the complete log can be overwhelming if social interaction is spread across multiple systems. Without entering into technical details, it is much more difficult than joining different database tables.
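To illustrate why this is harder than a database join, here is a sketch that only works because it assumes a shared correlation key across systems, which, as noted above, real platforms often lack; all system names, fields and values are hypothetical:

```python
from datetime import datetime

# Hypothetical extracts from two systems; the shared "ticket" field plays the
# role of the common key that real records usually do not have.
email_events = [
    {"ticket": "T-1", "who": "anna@acme.example",
     "when": "2013-06-01T09:00", "what": "email sent"},
]
chat_events = [
    {"ticket": "T-1", "who": "ben",
     "when": "2013-06-01T09:05", "what": "chat reply"},
]

def merge_logs(*sources):
    """Merge events from several systems into one log, ordered by timestamp.
    This only works when a common correlation key exists; establishing that
    key across platforms is the genuinely hard part."""
    merged = [event for source in sources for event in source]
    merged.sort(key=lambda e: datetime.strptime(e["when"], "%Y-%m-%dT%H:%M"))
    return merged

log = merge_logs(email_events, chat_events)
print([e["what"] for e in log])  # ['email sent', 'chat reply']
```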

It would be more valuable if the social dimension could be embedded in the control flow rather than being analyzed separately. If the process is the flow and the process is social, the visualization should be integrated. I consider this point key for developers.

Knowledge types – What type of knowledge exists and how it’s applied

The healthcare industry has always been characterized by the involvement of multiple professionals in the diagnosis and treatment of patients, where information sharing plays a key role. Health professionals (as well as professionals from other industries) tend to work around problems, addressing the immediate needs of patients rather than resolving ambiguities. As a result, people face “the same issue, every day, indefinitely,” which results in inefficiency. In other words, people tend to design the same solutions over and over. How can you overcome this challenge, and what can be done so that knowledge use can become more abstract and knowledge itself can evolve within the organization?

Knowledge consumption should be aligned with the process design type. For example, a repetitive task is usually automated, turning into explicit knowledge use, documented and understood by all. There is often a temptation to simplify the existing complexity, automating and standardizing how to proceed to the point of “crystallizing” only a small part of the information that people have to process, making it difficult to cope with changing execution conditions and thus leaving no room for the tacit dimension.

Knowledge, then, does not come in just two flavors (explicit or tacit); it is more than that.

Challenge #2
How to discover and measure knowledge type?

There can be different types across parts of the process and measuring is not automatic.

The discussion resulted in the following:

People would like to spot the indispensables, the ones who make the difference when a solution is built. That could be measured by how many people in the company “like”, use or apply the knowledge that was created.

Many think that the problem with knowledge discovery and usage is related to the tools used to store and share it (portals, wikis and the like). Some examples were provided in an IT context: a patch or a pattern that was sent around the development team was considered handy, because everyone was involved in working on the solution, and as such the knowledge gets codified. Big knowledge repositories, however, are not considered useful.

The lynchpins, the indispensables, do not like to codify their knowledge, because it makes them … dispensable (I tend to agree, but there are generations that live under the sharing paradigm and make the others contribute to the company’s success).

An interesting side comment was made:

Knowledge-finding automation is highly requested. Even with a Swiss army knife of systems to manage knowledge, it is hard to find.

Discovering process types

Processes no longer come in a single flavor. Today it is possible to find a very pre-defined type, but also a blend of every available type across multiple process instances.

Today processes are blended. You can handle a claim with a customer in a loose manner and, in the end, pay a compensation using a by-the-book, best-practice, “every day the same thing” procedure.

Challenge #3

How to understand what process type we are looking at?

The structured ones are easy to find, but ad-hoc and adaptive ones pose extra challenges, particularly if parts of them are blended with structured ones.
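One crude heuristic for getting a first impression of the process type from recorded data is to look at how many distinct variants the instances exhibit; the traces below and the interpretation of the ratio are illustrative assumptions of mine, not an established method:

```python
from collections import Counter

# Hypothetical traces: each tuple is one case's sequence of activities.
traces = [
    ("register", "assess", "pay"),
    ("register", "assess", "pay"),
    ("register", "assess", "pay"),
    ("register", "call", "escalate", "assess", "pay"),
    ("register", "chat", "pay"),
]

def variant_ratio(traces):
    """Distinct variants divided by number of cases: a value near 0 suggests
    a structured process, near 1 an ad-hoc one (a crude heuristic only)."""
    return len(Counter(traces)) / len(traces)

print(variant_ratio(traces))  # 3 variants / 5 cases = 0.6
```

Blended processes would show up as a low ratio on some segments of the trace and a high ratio on others, which is exactly what makes them hard to classify as a whole.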

The discussion resulted in the following:

More important than having super algorithms to spot patterns and discover process types is, at this point in time, having access to recorded data to actually let people think.

Social Network Analysis – part two

In part one, I introduced the importance of understanding social networks, as the socialization of interactions is becoming a new working habit and, as such, the classic control-flow perspective no longer provides information about how work is done.

In this post, I will explore important points to look for when performing Social Network Analysis (SNA).

On properties:

Social networks have typically the following properties:

  • Emergence: agents that belong to the network interact in an apparently random way. This feeling is amplified if there are many agents and/or too many interactions, which make it difficult to extract patterns. Emergence is all about separating the signal from the noise and making those patterns emerge.
  • Adaptation: enterprises and communities exist confined in a particular environment which, when it changes, makes agents react. The environment can be external: interaction with customers, suppliers and government agencies, or influences like the publication of a new law or regulation, or competitor movements as they enter new markets or create new products or services. The environment can also be internal, related to the way agents interact, which is ultimately associated with how business processes were designed, how IT solutions were deployed, culture, hierarchy configuration and the formal recognition of authority, to provide some examples.
  • Variety: Ashby, one of the fathers of cybernetics, defined the Law of Requisite Variety: “variety absorbs variety; it defines the minimum number of states necessary for a controller to control a system of a given number of states.” For an organisation to be viable, it must be capable of coping with the variety (complexity) of the environment in which it operates. Managing complexity is the essence of a manager’s activity. Controlling a situation means being able to deal with its complexity, that is, its variety [1].
  • Connectivity: the way agents are connected, and how those connections are aligned with the process type that was designed or is being executed and with the type of knowledge that is necessary to support operations (more about this alignment here). The existing connections will unveil the emergent patterns that are necessary to identify and understand behaviour from a social point of view (tight or loose coupling between agents or groups of agents).

On network types:
Most of the time, when people refer to social networks they are expressing their beliefs about community networks like Facebook or subject-expert groups like enterprise wikis. Although those are important network types, they do not express the nature of organization operations, because they do not record the communication acts expressed in social activity; hence I will concentrate only on Coordination Networks.

A Coordination Network is a network formed by agents related to each other by recorded coordination acts.

Coordination acts are, for example, the interchange of e-mails, tasks as designed in enterprise systems, or activity streams. The above definition is an adaptation of [2], which does not include the importance of the coordination act (related to the nature of work) but only the connection itself. The former is the important dimension for business process management and will guide the remaining content.

A coordination act is, as defined in (and adapted from) [3], an act performed by one agent, directed to another agent, that contains an intention (request, promise, question, assertion) and a proposition (something that is or could be the case in the social world). In the intention, the agent proclaims its social attitude with respect to the proposition. In the proposition, the agent proclaims the fact, and the associated time, the intention is all about. Recorded by a system, such acts support the definition of coordination networks, whose configuration can ultimately be discovered, and whose patterns made to emerge, using discovery techniques such as process mining.
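A coordination act with this shape could be recorded with a minimal structure like the following (the field names and sample values are my own illustration, not taken from [3]):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class CoordinationAct:
    performer: str       # agent performing the act
    addressee: str       # agent the act is directed to
    intention: str       # request, promise, question, assertion
    proposition: str     # the fact in the social world the intention is about
    timestamp: datetime  # when the act was recorded (needed e.g. for closeness)

acts = [
    CoordinationAct("Anna", "Ben", "request", "approve purchase order 42",
                    datetime(2013, 6, 1, 9, 0)),
    CoordinationAct("Ben", "Anna", "promise", "approve purchase order 42",
                    datetime(2013, 6, 1, 9, 30)),
]

# A coordination network is then simply the set of (performer, addressee)
# edges induced by the recorded acts.
edges = {(a.performer, a.addressee) for a in acts}
print(edges)
```

Keeping the intention and proposition on the edge, rather than just the link, is what preserves the nature of the work for later analysis.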

Coordination Act

On analysis dimensions:

Social network analysis is not new. Actually, the first studies were done in the 1950s. Its refinement revolved around:

  • Degree distribution: studying the number of connections around a node of the network;
  • Clustering: finding groups with a connection density larger than average;
  • Community discovery: measuring the alignment of connections with the organization hierarchy.

There is an immense list of techniques to analyse each of the above dimensions, reflecting the high maturity level of each method, but the drawback is that SNA performed on each dimension alone can lead managers in the wrong direction. For example, studying community discovery can be important, because communities are collections of individuals who are linked by some sort of relation. But carrying out the analysis without taking into consideration the content of the conversation (the coordination act) that drove the creation of the link is absolutely wrong, because the conversation is all about the way we humans work. I tend to disagree with practitioners who hold that the conversation does not matter, only the network configuration (probably because they were influenced by Gordon Pask). The conversation (the process) is the matter of study.

Social networks are self-organizing systems, but there are important patterns emerging from the nature of the coordination acts that can be identified. Although there are random factors, and the types of patterns presented in most scientific papers are based on graph theory and tend to be very simple compared with reality (maybe one of the reasons they are not taken seriously), they are the only way, as an abstraction, to understand agent behaviour. Pattern recognition is critical to align process type (from structured to unstructured), knowledge domain (simple to chaotic) and network type (central to loosely coupled); in other words, to infer trends and help humans interact better with regard to the role they play in the process ecosystem. Having said that, I would like to invoke Stafford Beer on models: “in general we use models in order to learn something about the thing modelled (unless they are just for fun)” [5].

Centrality is used to measure degree distribution. Centrality [2] is described for a process participant, business unit, group (a set of process participants) or an enterprise system (do not forget the machines) within the context of a social network. Centrality is also related to discovering the key players in social networks.

Some measures that can be used for Centrality are:

  • Degree centrality: calculates how many links a node has to the remaining network nodes (highly linked nodes are commonly called network stars). Higher degree centrality means a higher probability of receiving information (but does not mean the node drives information flow inside the network).
  • Betweenness: measures the degree to which a process participant controls information flow; such participants act as brokers. The higher the value, the more of the information flow moving from each node to every other node in the network passes through this node. The importance of betweenness in social network analysis is that if nodes with high values stop processing coordination acts, they will block information from flowing properly.
  • Closeness: measures how close a node is to, or how isolated it is from, the other nodes in the network. Nodes with low closeness in this sense (a low total distance to the others) are able to reach, or be reached by, most of the other nodes; in other words, they are well positioned to receive information early, when it has more value. The closeness measure must be supported by the time dimension (see the timestamp attribute in the coordination act example); without it, it is useless.
  • Eigenvector centrality: used to calculate a node’s influence in the network. Higher scores mean a node can influence (touch) many other important nodes.
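As a rough sketch of how two of these measures can be computed on a small invented network (libraries such as NetworkX implement all four; the simplified stdlib versions below are for illustration only):

```python
from collections import deque

# Tiny undirected network as an adjacency dict (an invented example).
graph = {
    "Anna":  {"Ben", "Clara"},
    "Ben":   {"Anna"},
    "Clara": {"Anna", "Dan"},
    "Dan":   {"Clara"},
}

def degree_centrality(g):
    """Fraction of the other nodes each node is directly linked to."""
    n = len(g) - 1
    return {node: len(neigh) / n for node, neigh in g.items()}

def closeness_centrality(g, node):
    """Inverse of the average shortest-path distance from `node`,
    with distances found by breadth-first search."""
    dist = {node: 0}
    queue = deque([node])
    while queue:
        cur = queue.popleft()
        for nb in g[cur]:
            if nb not in dist:
                dist[nb] = dist[cur] + 1
                queue.append(nb)
    total = sum(dist.values())
    return (len(dist) - 1) / total if total else 0.0

print(degree_centrality(graph)["Anna"])     # 2/3: Anna links to 2 of 3 others
print(closeness_centrality(graph, "Anna"))  # 0.75: distances 1 + 1 + 2 = 4
```

Note that, as the text stresses, numbers like these say nothing by themselves about the coordination acts behind the links.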

To put it all together, it is worth considering the following self-explanatory picture [6]:

Diverse centrality measures

The challenge:

There is a lot of noise around which measure is best for performing SNA. As I learned at the User Modelling, Adaptation and Personalization Conference 2011, it is time to put the mathematical equations aside and practice their application.

At this moment in time there are plenty of ways to measure network centrality, but somehow they neglect that those algorithms may not be appropriate for the type of business process / information system interaction being played. For example, the eigenvector centrality measure is important in unstructured processes, where the path is defined per instance and it is necessary to create a team and involve others as the process progresses. Since SNA does not analyze the process type, only agent relations, applying it to a procure-to-pay process (a highly structured process type) is useless and can damage the interpretation of results, because in that case every agent, every process participant, receives and processes information in basically the same way to achieve the same outcome, every day. Maybe this is the reason it is not yet taken more seriously: these days the process is all about social interaction, and it can no longer be analyzed naively, ignoring the dispersion, complexity and interdependence of relationships. The same approach can also be applied to IT requirements elicitation or IT system operation, which allows us to understand community interaction in order to support emerging and unique processes under a techno-social systems approach [7].

Social Network Analysis IT

References:

[1] – Design and Diagnosis for Sustainable Organizations – José Pérez Ríos – Springer – ISBN 9783642223174
[2] – Large Scale Structure and Dynamics of Complex Networks – Guido Caldarelli, Alessandro Vespignani – World Scientific Publishing – ISBN 9789812706645
[3] – Enterprise Ontology – Jan Dietz – Springer – ISBN 3540291695
[4] – Complex Adaptive Systems Modeling – A Multidisciplinary Roadmap – Muaz A. Niazi
[5] – The Brain of the Firm – Stafford Beer – John Wiley & Sons – ISBN 047194839-X
[6] – Discovering Sets of Key Players in Social Networks – Daniel Ortiz-Arroyo – Springer 2010
[7] – José L.R. Sousa, Ricardo J. Machado, J.F.F. Mendes – Modeling Organizational Information Systems Using “Complex Networks” Concepts – IEEE Computer Society 2012 – ISBN 978-0-7695-4777

 

Social Network Analysis – part one – the importance of God on complexity

In the previous article, A Social Platform Definition, I presented a framework for the elements of such a platform. In the following articles, I will expand on each of the layers. This one is dedicated to the Search and Analysis component.

Before we dig into the component content, I would like to bring some background about its significance.

An important introduction to Social Network Analysis

Last week, I had a meeting with a college headmaster to figure out whether my expectations and values were aligned with the headmaster’s regarding how students will be prepared for the forthcoming decades, taking into consideration the shift we are facing in work patterns, information overload and technology disruption.

The institution is Catholic-oriented and has strong roots in the Catholic Church. Let me say that I do not consider myself Catholic by the book definition, but I am probably more Catholic than others who go to church every day and lack ethics and values. This means I did not choose to evaluate the institution because it is linked with my religious beliefs, but because it is the best institution according to the evaluation program created by the Portuguese Government some years ago.

During the interaction with the headmaster (a religious person), we talked about two vectors I introduced into the conversation: values, and student preparation for the forthcoming decades (how we prepare people to interpret and act on information, and how they improve reasoning in the knowledge era). When the headmaster was talking about values, he introduced an amazing characteristic of the human being (sorry for the religious background I am putting into the discussion, but I consider it worthwhile for the sake of clarifying social network analysis).

God created humans as single and unique entities. There are no equal human beings (not even identical twins), and God created animals and all the other living organisms differently; they belong to a system (let us call it planet Earth, which belongs to another system called the universe) made of diversity in constant balance and adaptation.

This point of view reinforces the main characteristic that we humans, who belong to families, communities and organizations, are part of a super-system called the universe, whose foundations rest on diversity and complexity, not on standardization. Somehow, we keep pushing towards an ordered regime because it is much simpler to understand concepts, interactions and our own existence in a controlled manner rather than in a complex one.

The world is complex, and we cannot change that as much as we would like to

Ashby’s law teaches us that any system must match the complexity of its elements in an active and adaptive way to survive and prosper.

In addition, Ashby pointed out another important conclusion: any attempt to limit part of the variety that constitutes the system (because humans consider it noise) will cause the system to lose the capacity to adapt, leading to implosion. This is reflected in the way some business processes cannot respond to exception handling, because the misleading adaptation consists of fighting against the process model rather than adapting to changing execution conditions. If we consider a different organizational layer, such as strategy management, think of what happens when external signs are ignored; that can lead the organization to bankruptcy or financial loss.

In the social era we are being misled about what Social Network Analysis is. One of the reasons is semantics: “social” is broadly understood as connected people, but a social network is much more than that. In very general terms, a social network can be described as a graph whose nodes (vertices) identify the elements of the system, and whose connecting links (edges) represent the presence of a relation or interaction among these elements. With such a high level of generality, it is easy to perceive that a wide array of systems can be approached within the framework of network theory [1].

Social networks can be made of organizational units, business units, roles and functions, individuals, data, technology consumption (what part of an IT solution is used), technology interaction (how IT solutions communicate), business processes, traffic, and biological and physical systems (these last two categories lend many of their properties to business analysis), etc.

All these networks are self-organizing systems, but important patterns can be identified within that self-organization. Despite randomness, patterns are critical for humans to understand how data can be transformed into information, which is ultimately transformed into knowledge used to understand the behavior of such networks (see the note below).

Self-organization refers to the fact that a system’s structure or organization appears without explicit control or constraints from outside the system. In other words, the organization is intrinsic to the self-organizing system and results from internal constraints or mechanisms, due to local interactions between its components [2] (which can be laid on top of a business process). These interactions are often indirect, mediated by the environment. The system dynamics also modifies its environment, and the modifications of the external environment influence the system in turn, but without disturbing the internal mechanisms leading to organization [2] (think, for example, of social interaction with customers that changes the course of the business process, or events during product research and development that alter characteristics and features). The system evolves dynamically in time or space; it can maintain a stable form or show transient phenomena. In fact, from these interactions, emergent properties appear that transcend the properties of all the individual sub-units of the system [2] (and these emergent properties are the ones that can be understood using a combined set of discovery techniques like process mining, social network analysis and data mining).

I tend to agree with the argument that looking for patterns in a complex landscape is a waste of time, in the sense that in complex domains any attempt to take a snapshot is a distorted version of reality. Nevertheless, the objective of pattern discovery and understanding is not to predict behavior but to infer trends, or, in Jason Silva’s words, “to understand is to perceive patterns” (http://vimeo.com/34182381).

The objective of Social Network Analysis is not to predict outcomes, but to understand, to construct knowledge around emergence, self-organization and adaptation in scenarios such as decision making or distributed systems, which are becoming real enterprise challenges as business complexity and interactions grow exponentially.

A huge amount of data is being recorded today (see the image below), which allows us to discover and analyze complex interactions. The argument that the data does not exist, or that it cannot be done, only fits categories like airport security information, which typically still relies on paper.

The Internet of Things – new infographics – Source: http://blog.bosch-si.com/the-internet-of-things-new-infographics/#more-6995

In part two, I will explore techniques to analyze social networks.

Note:
In Fast Company’s article “IBM’s Watson Is Learning Its Way To Saving Lives” it is said that “Watson is poised to change the way human beings make decisions about medicine, finance, and work” […] “They believed Watson could help doctors make diagnoses and, even more important, select treatments”. I argue that IT can help humans process and present data so that humans can make better decisions. Last weekend, a family member spent a day at a hospital undergoing tests for what could have been a heart attack. The diagnosis was automatic: they took a one-minute electrocardiogram (considered insufficient by experts) combined, among other things, with the measurement of troponin levels (a diagnostic marker for various heart disorders). A correlation was found between the results, and the family member was told a cardiologist should see him immediately. When the cardiologist looked at the results, he said there was no correlation at all: the electrocardiogram results were insufficient, and the troponin level was 1/100 of the danger threshold and was unlikely to rise suddenly. In the end the diagnosis was wrong, and the cause of the sickness was the nervous system. Evidence like this should make us think, as Einstein said: “Information is not knowledge, the only source of knowledge is experience”; I would add that information cannot be stored.

References:
[1] Preliminaries and Basic Definitions in Network Theory – Guido Caldarelli and Alessandro Vespignani – Large Scale Structure and Dynamics of Complex Networks: From Information Technology to Finance and Natural Science – World Scientific Publishing Company – ISBN 978-9812706645

[2] Self-Organisation: Paradigms and Applications – Giovanna Di Marzo Serugendo, Noria Foukia, Salima Hassas, Anthony Karageorgos, Soraya Kouadri Mostéfaoui, Omer F. Rana, Mihaela Ulieru, Paul Valckenaers, and Chris Van Aart – Engineering Self-Organising Systems – Springer – ISBN – 3-540-21201-9

A Social Platform definition

1. Introduction:

In a previous post, The three layers of social business, I discussed the importance of understanding a social business as an ecosystem rather than focusing only on the technology menu.

As the Social Business Community Group puts it in A CTO’s Guide to Social Business, a social business is:

A social business is an organization that applies social networking tools and culture to business roles, processes and outcomes. A social business enables people to engage productively in a business context through collaboration and interconnecting business activities with social content. The scope of a social business spans across internal organizational boundaries and can extend to partners and customers. A social business monitors and analyzes social data to discover new insights that, when acted on, can drive business advantage, for example faster problem solving, improved customer relations, predicting market opportunities, and improving processes both internal  and external. A social business recognizes that people do business with people and optimizes how people interact to accomplish organizational goals.

What is social business alignment?

  • Social activity is integrated into business processes across the organization and supported by social technology;
  • Processes need to handle social interaction at any point, at any time, because the customer is driving the process, not the company. A process designed in a predefined manner will fail to cope with these dynamics. Processes must be designed to be sparked from the “middle of nowhere” rather than have predefined touch points. A customer may be dealing with a complaint and at the same time want to be informed about invoice settling. If this is difficult to handle, one solution for process design is a business-rule-based approach, where the process focuses on what needs to be done, describing the activities available at each process stage, with the rules sitting on top, not allowing the process to deviate;
  • It’s an ecosystem;
  • It must have the right social network configuration (process bandwidth should reflect the knowledge dimensions that support the outcomes customers want to achieve);
  • It must have executive commitment and participation (not just sponsorship; as I’ve been told, it’s the executives actually putting their hands on the tools).
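The business-rule-based approach above can be sketched in a few lines. This is a hypothetical illustration (the activity names and rules are invented, not from any specific product): the process only lists the available activities, and rules sitting on top decide what may run for a given case.

```python
# Hypothetical sketch of a rule-based process: instead of a fixed flow,
# we list available activities and let rules decide what may run next.

def can_run(activity, case):
    """Rules sit on top of the activities, constraining what may happen."""
    rules = {
        "settle_invoice": lambda c: "invoice_id" in c,
        "handle_complaint": lambda c: c.get("complaint_open", False),
        "close_case": lambda c: not c.get("complaint_open", False),
    }
    return rules.get(activity, lambda c: False)(case)

# A customer can raise a complaint and ask about invoice settling at once:
case = {"invoice_id": "INV-42", "complaint_open": True}
available = [a for a in ("settle_invoice", "handle_complaint", "close_case")
             if can_run(a, case)]
print(available)  # settle_invoice and handle_complaint, but not close_case
```

The point of the sketch is that there is no predefined sequence: the rules constrain the process without dictating its path.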

2. The platform:

The idea of the platform below was developed based on previous work done by the W3C’s Social Web Headlights Task Force, later adopted by the Social Business Community Group I belong to, but it was kept in the closet for a while, because the target of this group is to foster standards development based on available technology.

There is no common definition of a social platform. Most of the community work around it focuses too much on the technology the platform provides rather than on the semantics, on what such a platform is.

A social platform can be understood as the capability provided to an organization to deploy and manage, on an infrastructure, artifacts in the layers that constitute the platform.

The layers reflect the internal and external social interactions an organization carries out within the environment it belongs to, and they evolve over time. These interactions, which constitute the social practice, are driven by the technology that best supports the nature of the work people execute.

The social platform is constituted by three main blocks, as the illustration below shows, and it is technology-agnostic:

  • About the Human: Who you are, how you identify yourself and how you present yourself; the person’s social graph.
  • Human Interactions: What you do. How you express yourself. How you engage. How you react. Where you belong. What work you execute.
  • Search and Analytics: Search for knowledge, gather feedback, get trends, spot patterns, sentiments, learn.
The Social Platform

About the human is constituted by:

  • Identity: Unique identification attributes about the human.
  • Profile: What attributes about the human are available to identify him. Presence status (on-line / off-line …).
  • Social Graph: The social network type and the network connections around particular domains (different networks may exist for different domains). The communication flow and how people are connected. People can be connected through some kind of relationship (works with, partner of), but most important is the information flow between people, i.e. the working practice and information sharing, because that is where action occurs.
  • Addressing: Contact details (e-mail, instant messaging … ).
  • Reputation: Perceived evaluation taken by members of the social graph.
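As a minimal sketch of the social graph idea above (the class and method names are illustrative assumptions, not from any standard), the structure below keeps both the relationship label and the information flow between people, since the flow is where the working practice shows up:

```python
from collections import defaultdict

# Minimal sketch: a social graph holding both the relationship type and
# the volume of information flowing between people.
class SocialGraph:
    def __init__(self):
        self.relations = {}               # (a, b) -> relationship type
        self.flow = defaultdict(int)      # (a, b) -> exchanges recorded

    def connect(self, a, b, relation):
        self.relations[(a, b)] = relation

    def record_exchange(self, a, b):
        self.flow[(a, b)] += 1

    def strongest_ties(self):
        # Ties ranked by information flow, not by relationship label.
        return sorted(self.flow, key=self.flow.get, reverse=True)

g = SocialGraph()
g.connect("ana", "rui", "works_with")
g.connect("ana", "eva", "partner_of")
for _ in range(3):
    g.record_exchange("ana", "rui")
g.record_exchange("ana", "eva")
print(g.strongest_ties()[0])  # the tie with the most information flow
```

The relationship labels alone would not reveal that most of the action happens between the first pair; the flow counters do.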

Human interactions are constituted by:

  • Messaging: Conversations can be synchronous or asynchronous around a topic.
  • Group dynamics: Communities of interest or practice, interacting around a particular domain.
  • Collaboration: Work being carried towards a goal that must be achieved. Work can be structured or unstructured.
  • Sharing: Making an object available so that others can take action on it if they want to.
  • Reactions: Expressing a feeling about an object, forming an opinion or making an evaluation.

Search and analytics is constituted by:

  • Search: Find the information necessary to reason, to judge and to execute tasks.
  • Business Intelligence: Getting and exploring data, finding trends and correlations, helping people make better decisions (if they know how to interpret them).
  • Mining: Extracting data and displaying it in a process-oriented way for business process analysis and improvement.
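The mining bullet can be made concrete with a toy sketch, assuming a minimal event log format (a list of traces, each an ordered list of activities): extracting a directly-follows graph is one of the simplest process-oriented views of such data.

```python
from collections import Counter

# Sketch of the mining idea: extract event data and display it in a
# process-oriented way, here as a directly-follows graph (pairs of
# consecutive activities with their frequencies).
def directly_follows(log):
    dfg = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

# Illustrative event log: each trace is the ordered activities of one case.
log = [
    ["register", "check", "approve", "pay"],
    ["register", "check", "reject"],
    ["register", "check", "approve", "pay"],
]
dfg = directly_follows(log)
print(dfg[("register", "check")])  # every case follows this edge
```

Real process mining tools do far more (filtering, conformance, performance overlays), but the directly-follows graph is the common starting point.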

3. Technology is important but it’s not enough

The problem with focusing on the technology that makes social business happen is that it creates tunnel vision: technology alone does not provide the foundation for a social business fully integrated into the organization’s value chain, including all the stakeholders.

For example, an airline can have a Twitter account to broadcast announcements about events like strikes that prevent flights from operating properly, and that is a plus. But if that Twitter account is not able to catch important feedback about traveling issues and the annoyances of passengers, and integrate that feedback into customer support, what is the value of using Twitter as a communication tool? By the way, I will come back to this issue when I have time to sit and write about how UPS is mastering complaint handling using social technologies. “Scooping” around it, I would like to tell you that in the US, UPS takes complaint handling seriously, unlike in Portugal, where they don’t even let you make one. The next day I received a phone call regarding a complaint I had sent to the headquarters (strange, I thought I had complained to the local office!). As you can see, even inside enterprises there is no common foundation of what a social business is, and most of the time managers worry about choosing a proper tool so they can say they use it, it is there, it is available, but it is not integrated in the value chain (this is a challenge for Enterprise Architects).

Semantic BPM – epilogue

In this reflection we analyzed the current challenges companies face in managing business processes. Currently people are fighting over the meaning of things. Knowledge is no longer in procedures and flowcharts; it is spread across systems, sensors and streams. Knowledge is no longer confined inside company boundaries.

The way people connected to companies interact is changing, or about to change, as quickly as systems for socializing spread. You cannot stop this paradigm. You can block social interaction for a while, but as new generations start working in the company you will need to adapt, because these bright new lads don’t know how to work differently. There will be no regression.

In part one we saw that people want information that systems cannot interpret because questions are put in natural language.

In part two we discussed the paradigms of ontological management and how it can help to overcome the challenges outlined above, but it must be adaptive, dynamic, and evolve as organization domain changes. Ontologies cannot be a one time snapshot.

In part three we defined a model for process, domain and organization ontology discovery.

In this last article we integrate semantics into the business process management cycle.

So far we have focused on a framework to build ontologies for process, domain and organization, and on how it can support enterprise architecture.

  • Process ontology: Identifies all the artifacts that describe a process, regardless of whether it is structured or not. It allows building all process elements clearly and unambiguously, linked with the domain ontologies that specify enterprise concepts, as well as the business rules, roles, outcomes and all the other interdependencies.
  • Domain ontology: Not a glossary of terms; it defines the company’s sphere and represents what the company does. This ontology provides a vocabulary of concepts and their relationships, about the activities performed and the theories and elementary principles governing that domain.
  • Organizational ontology: Identifies who participates in the work executed and how people are connected through the work and responsibilities assigned to them.
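To illustrate how the three ontologies can interlink, here is a hypothetical triple-based sketch (the triples and predicate names are invented for the example, not a standard vocabulary): keeping them in a single store lets a query cross process, domain and organizational boundaries.

```python
# Hypothetical triple store mixing the three ontologies, so process
# elements link to domain concepts and organizational roles.
triples = {
    # process ontology
    ("approve_claim", "is_a", "process_activity"),
    ("approve_claim", "produces", "claim_decision"),
    # domain ontology
    ("claim_decision", "is_a", "domain_concept"),
    # organizational ontology
    ("claims_officer", "performs", "approve_claim"),
}

def query(subject=None, predicate=None, obj=None):
    """Return triples matching the given pattern (None matches anything)."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# Who performs approve_claim? The answer crosses ontology boundaries:
print(query(predicate="performs", obj="approve_claim"))
```

Production ontologies would use RDF/OWL tooling rather than Python sets, but the cross-ontology linkage works the same way.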

Ontology construction and maintenance are beyond the scope of this article series, but these references are a good starting point [20], [21].

As companies increasingly rely on information systems to run their processes, those systems need semantic capabilities that allow them to use and apply ontologies, thus becoming intelligent. As highlighted before, systems still do not have fully developed intelligent capabilities to interpret information. They continue to be mere repositories of information in relational databases, and do not let people get the data they need to work and to reason. Paradoxically, systems turn out to be a barrier to understanding the enterprise as a connected system, unable to capture the characteristics of the organization in its various dimensions: process, domain and organizational.

As the dominance of unstructured work increases, integrating ontological management into enterprise architecture becomes more important for knowledge management challenges.

In 2009 the rise of knowledge work was already pointed out [22] (although Drucker predicted it back in the 1960s!), a trend coined under the term adaptive case management. As Deborah Miller points out [23], with the growth of unstructured work people have to adapt to the unexpected and overcome challenges given the unpredictable nature of our activities. To achieve our objectives in this context, human judgment, external events and business rules do not determine the path through a pre-defined route (as in a process flow). Rather, these factors determine in real time the activities that must be performed.

One of the key factors that support knowledge management is collaboration, but collaboration at its most intrinsic form is full of sources of waste. In the scope of ontology management two categories are worth identifying:

  • Interpretation: Time lost interpreting artifacts and their concepts;
  • Research: Time wasted searching for information and its interrelationships.

That is why ontology management plays a key role, allowing people to obtain the artifacts they need to work with, the business rules, and the remaining content in a facilitated manner.

How Ontology management integrates with BPM

Business Process Management is usually based on these four simplified stages. This model is independent of the nature of the process being run: structured or unstructured. I still don’t buy the claim that BPM does not apply to unstructured processes.

BPM Cycle

Design phase: Defines how the process is going to be executed. For a structured process this includes the process model everybody will have to follow. For an unstructured process, the process is constructed in real time: people define the set of activities and the information they need to work with.

At this stage the existence of a pre-defined ontology supports understanding the nature of the tasks to be performed and the concept behind each informational entity. It thus eliminates the ambiguity of natural language in favor of the concepts that are part of the ontology being used (process, domain or organizational). A process model built on an ontology can later be used for automatic Web Service linkage [26]: if the process is automated, all the elements that describe it can be expressed in a format that machines can interpret, using for example XML. Another advantage is the possibility of reusing process models, something that is already a reality in adaptive case management.
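As a sketch of expressing a process element in a machine-interpretable format (the element and attribute names here are illustrative, not from any standard schema), Python’s standard library can serialize an ontology-linked activity to XML:

```python
import xml.etree.ElementTree as ET

# Illustrative only: an activity element carrying references into the
# domain and organizational ontologies, serialized as XML.
activity = ET.Element("activity", name="approve_claim")
ET.SubElement(activity, "concept", ref="domain:claim_decision")
ET.SubElement(activity, "role", ref="org:claims_officer")
xml = ET.tostring(activity, encoding="unicode")
print(xml)
```

The ontology references are what make the description unambiguous: a machine resolving `domain:claim_decision` gets a concept, not a free-text label.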

Implementation: SOA has been around for a while, but companies still struggle to connect systems seamlessly. One of the biggest challenges of adaptive case management systems is how they connect in real time with data stored elsewhere; a user cannot figure out or manage such connections on their own.

To make this seamless connection capacity possible, it is necessary to semantically describe all aspects of the services that are available through the interfaces. The WSMO standard [28] sets out how services should be described.

Building an ontology to describe a particular context is usually time-consuming. To ease the introduction of ontological concepts, especially those internationally accepted in a particular industry, WSMO was designed to be modular to the point of importing existing ontologies. WSML [29], an extension of WSMO, defines a formal language to describe ontologies, goals and Web Services semantically, and a model for managing different language variants encompassing both description logics and logic programming.

Still, duplicate services can exist. Semantic Web Services provide the following additional benefits:

  • Discovery of Web Services: Web Services are usually stored in a registry in order to be discovered; with semantic capabilities it is possible to prevent duplication and ambiguity;
  • Process Automation: Putting a process into execution mode involves translating all the artifacts that define it. If there are no process models and Web Services that can be reused, they are created and stored in the shared registry; otherwise, discovery and binding to existing process models (or parts of them) happens when the process is first deployed, or while it is executing in adaptive case management mode. No adjustments to message communication protocols, transport or security are needed, because these features are supported automatically.
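A toy sketch of the duplication-prevention idea (matching here is by declared input/output concepts only; real WSMO matchmaking is far richer): a registry that refuses a second service offering the same semantic capability.

```python
# Illustrative service registry keyed by semantic capability, i.e. the
# concepts a service consumes and produces, rather than by name.
registry = {}

def register(name, inputs, outputs):
    capability = (frozenset(inputs), frozenset(outputs))
    for existing, cap in registry.items():
        if cap == capability:
            # Same capability already registered: flag instead of storing.
            return f"duplicate of {existing}"
    registry[name] = capability
    return "registered"

print(register("GetCustomer", {"customer_id"}, {"customer_record"}))
print(register("FetchCustomer", {"customer_id"}, {"customer_record"}))
```

Because matching happens on concepts rather than names, `FetchCustomer` is caught as a duplicate of `GetCustomer` even though the names differ.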

Execution: Under a semantic approach, services are automatically discovered and linked to the activities that invoke them in real time, as opposed to a classic, highly coupled design built specifically for a particular process implementation.

Analysis and monitoring: Both analysis and monitoring traditionally use predefined reports or dashboards, and are thus limited to fixed content. Semantic Business Process Management allows users to search and correlate information dynamically, supported by execution data semantically annotated by the engine put in place for execution. Bottleneck identification (tasks running more than once), workload balancing and KPIs provide important information about execution and improvement opportunities, but in the dawn of unstructured processes, or a blend of structured and unstructured where execution is driven by events, questions, curiosity and outcomes, it is particularly difficult to extract the information we need for our daily activities from predefined control mechanisms. People cannot wait a week anymore to get their hands on the data, or for an app that needs to be developed for a special purpose.

One last thought: Google Wave, the trigger of this article series, is now in read-only mode and no longer supported by Google. Funny how concepts evolve.

References:

[20]  – Tasks for Ontology Design and Maintenance – Domenico Lembo, Carsten Lutz, Boontawee Suntisrivaraporn

[21] – Techniques for Ontology Design and Maintenance – F. Baader, R. Bernardi, D. Calvanese, A. Calì, B. Cuenca Grau, M. Garcia, G. de Giacomo, A. Kaplunova, O. Kutz, D. Lembo, M. Lenzerini, L. Lubyte, C. Lutz, M. Milicic, R. Möller, B. Parsia, R. Rosati, U. Sattler, B. Sertkaya, S. Tessaris, C. Thorne, A.-Y. Turhan

[22] Using technology to improve workforce collaboration – James Manyika, Kara Sprague and Lareina Yee

[23] How Adaptive Case Management Helps Businesses Overcome Challenges and Improve Performance – Deborah Miller

[26] Service-Oriented Architecture Ontology – The Open Group – ISBN: 1-931624-88-7

[28] Web Service Modeling Ontology – Cristina Feier, John Domingue http://www.wsmo.org 

[29] Web Service Modeling Language – J. de Bruijn, H. Lausen, R. Krummenacher, A. Polleres, L. Predoiu, M. Kifer, D. Fensel http://www.wsmo.org .

Process Paths

This is a mirror of a discussion-group post, kept here for future reference before it enters the oblivion zone, regarding a <post> by Wil van der Aalst called Process Mining: Desire Lines or Cow Paths?

The challenge Wil is proposing goes far beyond choosing the right path. Today people are starting to realize that processes are not only structured (the cow path), as typically found in companies highly coupled with compliance (pharmaceuticals, for example), or unstructured (the desire line), as happens in hospitals. Actually there is a <blend> of how people work, and there is no black-or-white way to do the right thing.

Thus, I would like to put the challenge in another perspective: when decision makers need to choose the best path to implement, what matters most is that they can access all the possible paths (most used, often used, single use) and imagine others that were not discovered but could be implemented. In this evaluation of options, or scenarios if you want, the best path is the one that is aligned and balanced regarding efficiency (cost, time, resources), effectiveness (outcomes, not outputs), corporate human culture (how people like to execute and collaborate), company assets (systems, physical infrastructure) and strategy (business objectives).
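One way to make this evaluation concrete is a simple weighted scoring across the criteria just listed; the weights and path scores below are invented purely for illustration.

```python
# Illustrative sketch: scoring candidate process paths against balanced
# criteria. All numbers are made up for the example.
weights = {"efficiency": 0.3, "effectiveness": 0.3,
           "culture": 0.15, "assets": 0.1, "strategy": 0.15}

paths = {
    "most_used": {"efficiency": 0.6, "effectiveness": 0.7, "culture": 0.9,
                  "assets": 0.8, "strategy": 0.6},
    "detour":    {"efficiency": 0.9, "effectiveness": 0.8, "culture": 0.4,
                  "assets": 0.5, "strategy": 0.3},
}

def score(path):
    # Weighted sum across all criteria, not just speed.
    return sum(weights[c] * paths[path][c] for c in weights)

best = max(paths, key=score)
print(best)  # the balanced winner, even though "detour" is faster
```

The detour path wins on efficiency alone, but the balanced score favors the path that also fits culture, assets and strategy, which is exactly the point made above.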

Sometimes the detour path is excellent in speed but does not comply with regulation, and other times it is the best way to do it even though most people in the company have not discovered it. In this era of the rise of unstructured work (which has existed all along; what is new is the market entry of systems tailored to such environments), some argue there is no such thing as finding the best path, and we should just let people freely execute whatever best achieves the outcome within the business rules.

Well, I have learned from my experience that although there is no single way of reasoning, because we humans react differently to the multitude of sparks and signals (contrary to programmed bots or systems), we can adapt to a better way of executing while keeping our freedom of action and reasoning, learning from the different practices of different people. And for that we need to understand and consider all the possible paths.

BPM – a year in review – 2011

This is becoming a tradition, writing about what happened across the year. Since I’m not an analyst or a sorcerer, I cannot predict the future; thus I will concentrate on the rearview-mirror perspective, for reference only, and somehow monitor whether these trends ramp up or disappear into the fog.

I still believe that 2012 will definitively be the year of mobile BPM, given the availability of good mobile hardware with big screens, long battery life and ubiquitous internet connections.

This is the 2011 list, in alphabetical order.

Adaptation – The war of acronyms around advanced, adaptive and dynamic was either interesting or dull, depending on your perspective on the topic. I don’t know why people lost so much time around definitions (coining terms) rather than trying to use them in the real world.

The importance of adaptation is both close to and far from Darwin’s theory. It is close because without it your company will not be able to perform. It is far because this is not about restructuring a company or a line of products; it’s about execution.

I remember, some years ago, leading a team deploying a major process change. During the training phase, the participants asked me to make changes not to the process, but to the way they could behave and make decisions. If we go back to BPM theory, at this point you should collect all the feedback and make some final improvements before the process rollout. The thing was, in those days, and still today, it is not easy to change, or better, to adapt to stimulus or changing situations, because IT, or the process itself, does not let people do what is correct to do.

Contrary to what is thought, CIOs are worried about adaptation, because they know the consequences of execution inflexibility and of siloed applications that, despite being tied together by web services, keep users highly coupled to what they can perform.

Adaptation is about being able to respond to stimulus (an event, data, a notification, changing circumstances) in a given context to achieve a desirable outcome.

Think of it like your brain, which reacts to the interactions you have with the surrounding environment. If you can drive business processes under these principles, you are turning adaptive. In a world of hyper-connectivity, failing to be adaptive can bring some bitterness.

Big Data – This is a topic that was missing from most BPM discussions this year. Today most companies are data-to-report collectors, but they are missing the value of the remaining available data, some of it flowing freely on the internet (social interactions, for example), some in sensors, etc. Data matters because it can tell you in detail what you are doing, what your customers want and how people react to conditions. Data is no longer just how much you sell, how many sales leads you lost, or your marketing campaign results. It is a pity that most big data debates are about how the distribution industry (supermarkets) takes advantage of big data. Big data is about your company’s ecosystem behavior, and this is why it is important to business process management. It is actually a company asset that you, as a manager, should transform into goodwill.

Big data can help you be more outside-in, or faster to bring in outside-in perspectives and adapt, introduce or kill your products or services.

Big data supports process performance beyond predefined corporate reports.

Big data tells you about execution, action and reaction, and thus backs improvement.

Big data is here to stay. EU’s new open data strategy is just an example.

Process Mining – At last, something different has emerged in the BPM space. Process mining is a “new” technology that can take your improvement and adaptation efforts into a new dimension.

Process mining helps you understand the reality inside your company (or outside it, if you have the data to do inter-organizational mining). As systems spread across the enterprise, every piece of work ends up stored: your customer relationship interactions, your back-office support, your supply chain execution, the collaboration of your workforce.

Everything leaves a trace that is impossible for us humans to analyze by hand. On the other hand, we are shifting more and more to unstructured work, or predefined work with unstructured snippets, which makes it virtually impossible to figure out the roots of process execution, collaboration patterns and waste categories. In some companies people no longer work in company offices, outsourcing of low-added-value processes is becoming standard, and it costs too much time and money to carry out process walkthroughs as happened in the past.

Bias, hidden agendas and subjectivity kill improvement. Process mining is all about the facts, the reality, even if that hurts. Process mining can be used across the full process spectrum: structured to ad hoc.

I’m proud of belonging to the task force on process mining and of backing the manifesto. If you are interested in discovering more about process mining, follow these links:

A short key note – <here>.

The process mining manifesto for a serious introspection – <here>.

What is a process mining project – <here>.

Social BPM – This is a repeating category. Last year everybody was concentrating on the wrong topic: designing process control flow in collaboration mode. But this year the concept evolved toward the right area.

The importance of social BPM is that it enables the management of dynamic processes to deliver greater business responsiveness. In other words, it enables adaptation.

XaaS – Everything-as-a-Service. Cloud, and cloud-related business models: the term as we know it will disappear in the next couple of years, because it will no longer be an option; it will be part of the infrastructure. XaaS will not only shift how businesses and people use IT, but particularly the way people consume information, meaning that

If you are a service provider you no longer sell IT, you sell a result.

In the package you can put systems, professional services, whatever. This will also change the way decision makers choose and pay for it. Before, you could sell something that did not work as predicted in the demos, but after the investment was made you had some time to fine-tune and readjust. In the XaaS model it can be great to have a constant flow of revenue streams, but if from the start you cannot prove you can deliver results, you are out of the business, despite having eliminated your customer’s high upfront costs.

The great challenge XaaS will have to overcome (from the IT perspective) is connecting enterprise systems intelligently while hiding the technology from users. This means a blend of technologies, processes and knowledge management, integrating information access, events and alerts, and collaboration with strong reporting capabilities.

Interested in 2010 list? Click <here>.