TY - GEN
T1 - Measuring the Stability of Process Outcome Predictions in Online Settings
AU - Lee, Suhwan
AU - Comuzzi, Marco
AU - Lu, Xixi
AU - Reijers, Hajo A.
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023/10/27
Y1 - 2023/10/27
N2 - Predictive Process Monitoring aims to forecast the future progress of process instances using historical event data. As predictive process monitoring is increasingly applied in online settings to enable timely interventions, evaluating the performance of the underlying models becomes crucial for ensuring their consistency and reliability over time. This is especially important in high-risk business scenarios where incorrect predictions may have severe consequences. However, predictive models are typically evaluated using a single aggregated value or a time-series visualization, which makes it challenging to assess their performance and, specifically, their stability over time. This paper proposes an evaluation framework for assessing the stability of models for online predictive process monitoring. The framework introduces four performance meta-measures: the frequency of significant performance drops, the magnitude of such drops, the recovery rate, and the volatility of performance. To validate this framework, we applied it to two artificial and two real-world event logs. The results demonstrate that these meta-measures facilitate the comparison and selection of predictive models for different risk-taking scenarios. Such insights are particularly valuable for enhancing decision-making in dynamic business environments.
AB - Predictive Process Monitoring aims to forecast the future progress of process instances using historical event data. As predictive process monitoring is increasingly applied in online settings to enable timely interventions, evaluating the performance of the underlying models becomes crucial for ensuring their consistency and reliability over time. This is especially important in high-risk business scenarios where incorrect predictions may have severe consequences. However, predictive models are typically evaluated using a single aggregated value or a time-series visualization, which makes it challenging to assess their performance and, specifically, their stability over time. This paper proposes an evaluation framework for assessing the stability of models for online predictive process monitoring. The framework introduces four performance meta-measures: the frequency of significant performance drops, the magnitude of such drops, the recovery rate, and the volatility of performance. To validate this framework, we applied it to two artificial and two real-world event logs. The results demonstrate that these meta-measures facilitate the comparison and selection of predictive models for different risk-taking scenarios. Such insights are particularly valuable for enhancing decision-making in dynamic business environments.
KW - event stream
KW - performance evaluation
KW - predictive monitoring
KW - process outcome
UR - http://www.scopus.com/inward/record.url?scp=85175424484&partnerID=8YFLogxK
U2 - 10.1109/ICPM60904.2023.10271960
DO - 10.1109/ICPM60904.2023.10271960
M3 - Conference contribution
SN - 979-8-3503-5840-7
T3 - Proceedings - 2023 5th International Conference on Process Mining, ICPM 2023
SP - 105
EP - 112
BT - Proceedings - 2023 5th International Conference on Process Mining, ICPM 2023
A2 - Munoz-Gama, Jorge
A2 - Rinderle-Ma, Stefanie
A2 - Senderovich, Arik
PB - IEEE
T2 - 2023 5th International Conference on Process Mining (ICPM)
Y2 - 23 October 2023 through 27 October 2023
ER -