Keywords: #EventLogs #ProcessImprovement #ProcessMining #HowTo #RACI #Governance
Event logs are a special type of data that store the events that happened in a system, usually a software system. With a little effort, event logs can be probed and harvested for hidden and valuable insights, since they reflect very accurately the actual events that occurred. The discovered insights are unique to your organisation and, as such, give you unparalleled knowledge about your operations.
The concept is very simple: you take a set of event logs, analyze them with the right analysis tool, and get hidden insights about your organisation.
Event Log + Analysis = Hidden Insight
The process of analyzing event logs is commonly known as Process Mining. It is a maturing field in the space of Process Intelligence and Data Science. Today, there are many different algorithms used for process mining; some are generic and others are built for specific use cases.
If the thought of tools and algorithms sounds very technical and complex, fear not. It is actually quite straightforward and simple.
A Brief Introduction to Event Logs
During the 1980s, as software development practices were maturing, teams replaced individuals and were building complex and integrated software. It became critical to have some form of built-in tracking to help with software debugging and problem resolution. Basically, to introduce a logging feature which records what was done, and when. The simplest form of an event log entry looked like this:
What was Done at When
| What | Done | When |
|------|------|------|
| Jane | New User Created | 2021 Sep 12 08:00am |
Over the following decade, with the proliferation of networking and networked systems, it became possible for more than one user to use the same system. Logging became common practice and was used to provide an official audit trail of the system and its users. It became imperative as a record of who was responsible for the actions in the system.
So Why Event Logs?
Event logs are useful for at least 2 reasons: 1) they are an accurate representation of what happened, and when; 2) when individual log entries are combined in a meaningful manner, they reveal links or relationships between entries, for example durations, utilization, implications and dependencies. In other words, you get very unique insights which are not directly visible in the log itself.
What problems can I solve with these insights? Well, briefly, you can:
- Discovery – Discover processes in your organisation and understand how it performs
- Conformance – Validate and verify that actions are conducted as defined in the organisation’s policies
- Performance – Easily identify usage patterns, bottlenecks and resource utilization
Sounds amazing, doesn’t it? If it sounds like you can do wonders with process mining, you can!
The 3 mandatory columns of an event log for process mining are the What, the Done and the When.
What was Done at When
In Process Mining, these columns translate to
| Column | Description | Known As | Also Known As |
|--------|-------------|----------|---------------|
| What | Describes an entity. It can be any form of identifier pointing to a subject of interest. It is not confined to a person and can be used to track other entity types such as a project, a service, a product or a customer. Commonly, this is some type of ID (identifier). | Case Id | ID, Identifier, Entity, Subject, Author, Creator, Person, Project, etc. |
| Done | The state you are interested in. Depending on the choice of entity, it translates to states describing the entity against a timestamp. | Activity | Event, State, Stage, Status, Milestone, Checkpoint, etc. |
| When | The most straightforward of the 3. Describes the date and time when the event happened. | Timestamp | Datetime, Date, Time, etc. |
For the rest of the article, we will use the terms Case Id, Activity and Timestamp to describe these 3 fields.
What -> Case Id
Done -> Activity
When -> Timestamp
As you can see, it is a simple data set. Take the following sample event log for example, which consists of only 4 rows.
| Case Id | Activity | Timestamp |
|---------|----------|-----------|
| Jane | New User Created | 2021 Sep 12 08:00am |
| Jane | Grant Access | 2021 Sep 12 08:25am |
| Jane | Make Administrator | 2021 Sep 12 08:29am |
| Jane | Save Document | 2021 Sep 12 08:32am |
You can see that the user Jane was created at 8.00am on 12 Sep 2021. 25 minutes later, at 8.25am, Jane was granted access, and another 4 minutes later, at 8.29am, Jane was made an administrator. Finally, 3 minutes later, at 8.32am, Jane saved a document.
From this log, we can easily deduce the time taken between the related activities. It also implies that one should Create New User BEFORE Grant Access, and Grant Access BEFORE making someone an administrator. By pure visual observation, you can already decipher so much.
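The deductions above can be sketched in a few lines of Python. This is a minimal illustration, not a process mining tool: it parses the 4-row sample log and measures the time between consecutive activities.

```python
from datetime import datetime

# The 4-row sample log above, as (Case Id, Activity, Timestamp) tuples.
log = [
    ("Jane", "New User Created",   "2021-09-12 08:00"),
    ("Jane", "Grant Access",       "2021-09-12 08:25"),
    ("Jane", "Make Administrator", "2021-09-12 08:29"),
    ("Jane", "Save Document",      "2021-09-12 08:32"),
]

# Parse the timestamps and sort the entries chronologically.
parsed = sorted(
    ((case, act, datetime.strptime(ts, "%Y-%m-%d %H:%M")) for case, act, ts in log),
    key=lambda row: row[2],
)

# Minutes elapsed between each pair of consecutive activities.
gaps = [
    (f"{a1} -> {a2}", (t2 - t1).total_seconds() / 60)
    for (_, a1, t1), (_, a2, t2) in zip(parsed, parsed[1:])
]

for step, minutes in gaps:
    print(f"{step}: {minutes:.0f} min")
```

Running this prints the 25, 4 and 3 minute gaps deduced by eye above, and the sorted order recovers the implied activity sequence.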
In a nutshell, this is what a process mining tool does. It works almost instantaneously on large volumes of data (so that you don’t have to), aggregates the results and crunches all the numbers for you, all at the click of a button!
Where and how to get Event Logs?
I hope you are sold on the idea of Event Logs and Process Mining.
Today, logging is common practice in software design. Most modern information systems such as Enterprise Resource Planning systems (ERPs), Customer Relationship Management systems (CRMs) and many others contain event logs.
Sourcing for event logs takes up most of the process improvement time. There are 2 areas to consider when sourcing for event logs:
- People factor – stakeholders, governance and policies in place to enforce data confidentiality, privacy and security.
- Systems factor – there is no single method to extract event logs, as each system is different.
From a RACI (Responsible, Accountable, Consulted, Informed) perspective, there are a number of stakeholders who should be informed or be able to provide authorization.
Each organisation is different, but I advocate seeking approval from the following 3 key stakeholders, in the order listed:
| Stakeholder | Description | Approves |
|-------------|-------------|----------|
| Process Improvement or Business Transformation Team | The manager of the Process Improvement or Business Transformation Team, to approve Process Mining as a feasible approach to process improvement. If the team is already embarking on automation and/or Robotic Process Automation (RPA), know that Process Mining and RPA together form what is known as Hyperautomation, which is significantly more impactful than traditional automation capabilities. | Approving Process Mining as a viable option for process improvement. |
| Business Units, Departments | The manager of the business unit or department, to approve their process as a pilot run or case study for process improvement. An important note here is for the improvement team to protect and secure the privacy and confidentiality of individuals involved in the process. | Approving the use of event logs relating to the selected process. |
| IT Manager | The manager of the IT department, Business as Usual (BAU), infrastructure or systems team, to approve the extraction of actual data artifacts. | Approving the extraction of data artifacts. |
The titles of these managers may differ in your organisation, and one person may double up in multiple roles. Even better if you already hold one or more of the roles.
A good way to introduce Process Mining is through a demo. ProcessChampion provides an easy way for you to demo the power of Process Mining on Power BI, using a neutral and sanitized event log.
The second challenge is the actual sourcing of Event Logs. As the data may reside within multiple systems, you will first need to identify the related systems which contain the event logs. Once these systems are identified, you may then consult the respective business owners and system owners for authorization.
Depending on the system, some event logs are easier to retrieve than others. Event logs are generally located in one of the following 3 locations:
| Where | Description | How To Extract |
|-------|-------------|----------------|
| 1) Built-in with export | Event log is provided and can be extracted by end users or power users, usually through a reporting function. | The easiest approach. Export the data yourself. Build a proof of concept with process mining and determine if other forms of automation are required. |
| 2) Built-in without export | Event log is available but cannot be extracted without technical involvement, i.e. secured folder access or API integration. | This approach can get technical. Engage the help of IT (administrators) to extract, or use a data connector. Power BI comes with a ton of good database connectors which can be useful in this scenario. |
| 3) Not built-in | Event log is not available. | The most technical approach. Engage the help of the Data or BI team to build a data ingestion pipeline, with each snapshot storing a state of the data against a timeline. The result becomes a data mart which can serve as an event log. |
The methods can be combined and are not mutually exclusive. Also, extraction depends on the roles you play in the organisation and how much access you have. There is no doubt that most process mining exercises spend more time extracting event logs than performing the actual analysis. But if you got this far, keep going. It’s worth it.
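For the "Not built-in" case, the snapshot idea can be sketched in a few lines: diff consecutive snapshots and emit an event whenever an entity appears or changes state. The snapshot contents, user names and states below are illustrative assumptions, not a real schema.

```python
# Hypothetical daily snapshots of a "user accounts" table, each
# mapping a user to their status on that date. Diffing consecutive
# snapshots yields (Case Id, Activity, Timestamp) events.
snapshots = [
    ("2021-09-12", {"Jane": "Created"}),
    ("2021-09-13", {"Jane": "Administrator", "Ali": "Created"}),
]

events = []
previous = {}
for ts, current in snapshots:
    for case, state in current.items():
        # Record an event whenever a case appears or changes state.
        if previous.get(case) != state:
            events.append((case, state, ts))
    previous = current

for event in events:
    print(event)
```

The result is exactly the flat Case Id / Activity / Timestamp shape that process mining needs, which is why the table above calls it a data mart that "can serve as an event log".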
An aspect to consider is when there is no system involved at all, meaning the process is purely manual. While not as common nowadays, introducing some form of automation may be a good start for process improvement.
Eventually, you will need to consider the frequency of extraction. I will cover this in another article. For a start, being able to get hold of any event log is a win, especially if you want to assess the validity of the data before progressing further.
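Whichever source you end up with, the goal is a flat table of Case Id, Activity and Timestamp. As a minimal sketch, here is how an exported CSV might be read into that shape; the column names (`case_id`, `activity`, `timestamp`) are assumptions and should be mapped to whatever your system actually calls these fields.

```python
import csv
import io

# A hypothetical CSV export from a system with built-in reporting.
# Real exports come from a file; io.StringIO stands in for one here.
export = io.StringIO(
    "case_id,activity,timestamp\n"
    "Jane,New User Created,2021-09-12 08:00\n"
    "Jane,Grant Access,2021-09-12 08:25\n"
)

# Reduce each row to the 3 mandatory columns.
events = [
    (row["case_id"], row["activity"], row["timestamp"])
    for row in csv.DictReader(export)
]
print(events)
```

In practice you would open the exported file instead, and possibly rename columns, but the end shape is the same triple per row.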
What Event Logs Should I get?
One of the biggest questions I am constantly asked is: What data should we get? I suggest 2 approaches here.
The first is the Top-down approach: the team has decided what to improve, and may have modeled the process in flow charts or BPMN. This means that you have decided where you want to look. The approach here is straightforward:
- Decide which process to improve (with or without process models)
- Identify the system(s) associated with this process
- Identify the key Case Ids (entities) you want to analyse for this process
- Extract event logs from the related tables in these systems
- Merge or link the data together using their Ids (i.e. Case Ids)
- Run process mining
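The merge step above can be sketched as follows. The two source systems and their entries are hypothetical; the point is that once every system's log is reduced to the same (Case Id, Activity, Timestamp) triples, merging is just concatenating and sorting per case.

```python
# Event logs from two hypothetical systems, already reduced to
# (Case Id, Activity, Timestamp) triples. Names are illustrative.
hr_system = [
    ("EMP-01", "New User Created", "2021-09-12 08:00"),
    ("EMP-02", "New User Created", "2021-09-13 09:00"),
]
doc_system = [
    ("EMP-01", "Save Document", "2021-09-12 08:32"),
]

# Link on Case Id: concatenate, then sort so that each case forms one
# chronological trace (ISO-style timestamps sort correctly as strings).
merged = sorted(hr_system + doc_system, key=lambda e: (e[0], e[2]))

for case, activity, ts in merged:
    print(case, activity, ts)
```

After the sort, all of EMP-01's activities sit together in time order, ready for a process mining tool to treat as a single trace.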
The Exploring approach is where you start without a predetermined process. You start by sourcing for event logs and analysing them with a process mining tool. Personally, I like this approach as I often get a good number of pleasant surprises. It is best used when you have completed a significant Top-down exercise, or simply could not secure any reliable event logs for a targeted analysis.
The approach is more random and less restrictive. Some examples of entities based on the industries are:
- Government – Services, Applications, Residents
- Healthcare – Patient, Specialties, Wards
- Education – Students, Lecturers, Courses, Room and Resources
- Finance – Customers, Loans, Investments
- Retail – Sales, Operations, Delivery
- Project Management – Projects, Resources
As you can see, entities are not restricted to people, and this approach is suitable when you are sourcing for places to improve.
What makes a good event log
Not all event logs are created equal. Some require more cleaning than others, while others can be used raw. Here are a few softer requirements that determine a good event log:
- Good Volume – The number of rows in the event log should be high to improve accuracy. The general concept is that higher volume yields better accuracy. However, this does not mean you cannot process mine a small event log.
- Number of Unique Case Ids – The number of unique Case Ids should be smaller than the total number of records. Imagine an event log with 10 rows and 10 unique Case Ids: each Case Id exists only once. This prevents the process map from being discovered properly, as there is no known relationship between activities (for each case).
- Number of Activities – The number of activities should be large enough for analysis but not too large. There is no hard and fast rule on this one; it goes with the complexity of the process. A soft cap of 50 should be more than enough to depict a process.
- Data Cleanliness – Event log files come with incomplete data. Such data is useful in some scenarios and may cause errors in others. For example, if you are discovering the maximum number of active cases, then any incomplete cases should be included and are relevant. However, if you are determining the longest and shortest execution times, the incomplete cases will distort the final figures.
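These soft checks are easy to compute before committing to a full analysis. The helper below is a sketch of my own, not part of any process mining tool, and the thresholds mentioned in its comments are the illustrative ones from the list above.

```python
from collections import Counter

def event_log_health(events):
    """Soft quality checks on a list of (case_id, activity, timestamp) rows."""
    cases = Counter(case for case, _, _ in events)
    activities = {act for _, act, _ in events}
    return {
        "rows": len(events),
        "unique_cases": len(cases),
        # If every case appears only once, no activity-to-activity
        # relationship can be discovered within a case.
        "cases_with_multiple_events": sum(1 for n in cases.values() if n > 1),
        # A soft cap of ~50 distinct activities keeps the map readable.
        "distinct_activities": len(activities),
    }

sample = [
    ("Jane", "New User Created", "2021-09-12 08:00"),
    ("Jane", "Grant Access",     "2021-09-12 08:25"),
    ("Ali",  "New User Created", "2021-09-13 09:00"),
]
print(event_log_health(sample))
```

A log where `cases_with_multiple_events` is zero, or where `distinct_activities` runs into the hundreds, is worth cleaning or rethinking before you mine it.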
Business Process vs Workflow
One thing to remember is that the terms business process and workflow are often used interchangeably. In reality, they overlap but are not identical.
A business process sits at a higher level of abstraction and describes the sequence of activities to accomplish an organisational goal. It can be creative, need not be automated, and focuses heavily on the choices, conditions and concurrent activities that lead to the fulfillment of the goal. The objective of a business process model is a concise and accurate representation of the business process, the choices made and the alignment to the process goals.
Workflows, on the other hand, are usually the target of automation. Each step (activity) describes an action leading to a tangible outcome. Some business semantics may be abstracted away when converting a business process to a workflow. For example, it is commonplace to implement a workflow sequentially even when the business process describes the activities as concurrent. The objective is mainly to ensure that the tasks, when executed in the defined sequence, achieve the desired outcome.
Event logs are more commonly the residue of workflow automation. However, when mined, they aim to describe the higher-level business process, where conditions, choices and parallelism take priority. This can cause some granularity issues when the mined process falls somewhere between a business process and a workflow. However, this phenomenon does not undermine the effectiveness of the process improvement initiative.
I got the event log!
Finally, congratulations! I envy the insights you are about to discover!
Let the fun begin.