From Cow Paths to Superhighways




Sophisticated process analysis techniques can bring dramatic change to informatics implementations

Stuart M. Miller and John M. Petrakis

Even before it was called "lab informatics," process improvement was recognized and accepted as a necessary predecessor to most major lab informatics projects. Yet today, increasing pressure to complete informatics projects faster and cheaper, particularly in the life science industry, is resulting in abbreviated, aborted or completely ignored process analysis. IT teams are usually eager to get to the "fun stuff" of implementing the technology. Convincing them, and the budget approvers, of the need for time, resources and money to analyze and improve processes before implementing new informatics systems is not easy. The alternative, to use a common IT euphemism, is to continue "paving cow paths."

Figure 1: Advantages of dynamic process modeling and simulation over static process modeling (comparison chart)

This trend of ignoring process analysis and merely paving over the old, inefficient lab business process "cow paths" with new informatics tools, like LIMS, CDS, SDMS and ELN, is even more alarming when you consider the evolving nature of commercial lab informatics systems. Newer informatics tools, like ELN and some LIMS, provide very specific niche functionality that must be intimately interwoven with everyday work processes in order to be effective. Ignoring process analysis can ultimately cost more in dollars and time than any incremental savings temporarily realized by bypassing up-front analysis and improvement.

Budget approvers in many organizations have learned from prior mistakes of investing heavily in multiple state-of-the-art commercial informatics systems only to discover that the systems did not deliver the anticipated benefits. Consequently, informatics project teams are being challenged to develop hard financial and process improvement metrics as part of their business cases to demonstrate the predicted return on investment (ROI) before project funding is approved. They must provide budget approvers with the hard metrics needed to decide where to spend informatics program funds and how to monitor the improvements over time. Process modeling and simulation is one method that bridges the gap between traditional static workflow tools and the increasingly sophisticated demands of lab informatics projects.1

Process modeling and simulation

Like all industries, the life sciences industry is experiencing enormous pressure to reduce costs and improve efficiency. Until recently, R&D and quality labs have flown under the radar of process excellence initiatives in many companies, probably due to their relatively small size and complex business models, which are foreign to traditional business and process analysts. However, as industry executives become concerned about costs and efficiency, they are driving process excellence initiatives to greater depths within their organizations. Recent statements by some executives predict the cost of bringing a new drug to market will balloon to over $2 billion by 2010 unless the pharma industry can find better ways to improve efficiency and effectiveness in drug development,2 including the laboratory operations that support the development process. In order to examine the business process without significant risk, cost or disruption of business operations, labs should begin adopting a methodology that has been used successfully in other industries: process modeling and simulation.

Static workflow mapping vs. dynamic process modeling and simulation


The goal of process modeling is to create a simplified model of a business process. The models developed allow analysts to study the processes involved in a business in order to:

• uncover waste and inefficiency
• develop changes to a process to correct performance problems
• select the process designs that give the best results
• provide cost justification for the proposed changes
• establish performance metrics for the process.

Figure 2: Simplified model of a raw materials lab process and sub processes

But what exactly is modeling and simulation? Simulation uses a combination of dynamic modeling techniques and software tools to produce a software-based model of the flow and interaction of materials and information through a business process. It supports detailed analysis and forecasting of business process improvement outcomes, and it permits answers to questions that are often asked but rarely answered satisfactorily with traditional tools. For example:

• Capacity: What is the ideal throughput capacity of the business or system?
• Workload: Should workload (or activities) be performed at a single site or distributed?
• Technology: What is the effect of using automation or integrating systems?
• Optimization: What is the optimum number of resources to support the process?
• Resource scheduling: What personnel and equipment are needed, and when?
• Efficiency: How much time, labor or money can be saved?

Simulation can also alleviate functional, technical and management concerns around new technology or process improvement initiatives by giving stakeholders visual and quantitative proof that the new system will work, and it may even uncover process issues, bottlenecks or solutions not previously understood or realized from static mapping approaches.

The traditional approach of static workflow mapping is very different from simulation. Workflow mapping, which employs common flow-charting tools like Microsoft Visio, can be useful to a point, but it has major shortcomings compared to dynamic process modeling and simulation. Static workflow mapping is familiar, inexpensive and generally easy to use and understand. However, it can also be too familiar, visually underwhelming the business stakeholders who need to understand and evaluate the models. It rarely provides enough detail to fully describe the variability of the process, provide metrics or make an impact with the budget approvers evaluating your business case. It provides only a snapshot of the process and cannot account for the time-varying nature of a process.

Simulation, by comparison, has clear advantages. Simulation is dynamic and graphical, providing a high-impact visualization of the process being modeled. Although more complex to build, a simulation is intuitive and easy to understand; it uses the familiar pictorial representation of a process workflow but adds a graphical animation of the process in action. A well-constructed simulation model will simulate the flow of materials and information through a process, including entities such as samples, test data and approval status, and it accounts for random variations in how work is done and how materials and information flow through the real world. One unique and valuable advantage is the ability to perform quantifiable comparisons of 'what-if' scenarios, facilitating selection of the best 'to-be' model.
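To make the mechanics concrete, here is a minimal discrete-event simulation sketch in plain Python (standard library only). It is not the commercial simulation tool discussed in this article, and every number in it (arrival rate, test time, analyst count, horizon) is an invented assumption for illustration. It replays random sample arrivals against a pool of analysts and reports the cycle-time and utilization metrics on which the capacity and optimization questions above turn.

```python
# Minimal discrete-event simulation of a lab sample queue (stdlib only).
# All parameters are illustrative assumptions, not figures from the article.
import heapq
import random

def simulate(num_analysts=4, arrival_rate=1.0, test_time=3.0,
             horizon=2000.0, seed=42):
    """Simulate samples queueing for analysts; return average cycle time
    and (approximate) analyst utilization over the horizon (hours)."""
    rng = random.Random(seed)
    events = []                   # min-heap of (time, seq, kind, sample_id)
    seq = 0                       # tie-breaker so the heap never compares kinds
    free_analysts = num_analysts
    queue = []                    # FIFO of sample ids waiting for an analyst
    arrivals = {}                 # sample_id -> arrival time
    cycle_times = []
    busy_time = 0.0

    def push(t, kind, sid):
        nonlocal seq
        heapq.heappush(events, (t, seq, kind, sid))
        seq += 1

    push(rng.expovariate(arrival_rate), "arrive", 0)
    next_id = 1
    while events:
        now, _, kind, sid = heapq.heappop(events)
        if now > horizon:
            break
        if kind == "arrive":
            arrivals[sid] = now
            queue.append(sid)
            push(now + rng.expovariate(arrival_rate), "arrive", next_id)
            next_id += 1
        else:                     # "done": analyst finishes a sample, is freed
            cycle_times.append(now - arrivals.pop(sid))
            free_analysts += 1
        while free_analysts and queue:    # dispatch waiting samples
            free_analysts -= 1
            service = rng.expovariate(1.0 / test_time)
            busy_time += service
            push(now + service, "done", queue.pop(0))

    avg_cycle = sum(cycle_times) / len(cycle_times) if cycle_times else 0.0
    return avg_cycle, busy_time / (horizon * num_analysts)

for analysts in (3, 4, 5):
    cycle, util = simulate(num_analysts=analysts)
    print(f"{analysts} analysts: avg cycle {cycle:6.2f} h, utilization {util:.0%}")
```

Even a toy model like this speaks to the staffing question in the list above: for these invented rates, three analysts barely keep pace and cycle times balloon, while a fifth analyst mainly buys idle time.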
Simulation allows the analyst to evaluate, in quantifiable terms, the effects of modified or reengineered processes and helps to demonstrate the effect of important changes to the process, as well as modeling the effects of automation, new informatics systems, interfaces and so forth. As part of this evaluation, analysts can perform statistical analysis of process parameters and metrics. The simulation models produce data on key metrics such as cost, throughput, capacity and wait time that can be analyzed in order to optimize the process.

Another advantage of simulation is the ability to forecast the costs associated with a process using activity-based costing. Activity-based costing evaluates the cost of resources or equipment based on the activities performed in a discrete part of the process. The actual cost of different parts of the process can be analyzed in terms of value-added versus non-value-added activities, enabling better cost-based decisions.

Simulation also provides future benefits in the form of a tool to monitor the process improvements implemented. The models developed can be reused as templates or dynamic reference models. Actual values of key metrics from the improved process can be loaded into the model and used for ongoing monitoring of process improvements and for reporting actual results against forecasted improvements.
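As a companion sketch, activity-based costing reduces to multiplying activity time by resource rates and rolling the results up into value-added and non-value-added buckets. The activities, durations and hourly rates below are hypothetical; in a real model they would come from the simulation output rather than being hard-coded.

```python
# Sketch of activity-based costing over a hypothetical set of lab activities.
# Activities, hours and rates are invented for illustration.
from collections import defaultdict

HOURLY_RATE = {"analyst": 65.0, "qa_reviewer": 80.0, "instrument": 40.0}

# (activity, resource, hours per sample, value-added?)
ACTIVITIES = [
    ("log-in sample",      "analyst",     0.25, False),
    ("prepare sample",     "analyst",     0.50, True),
    ("run assay",          "instrument",  2.00, True),
    ("transcribe results", "analyst",     0.75, False),
    ("QA review",          "qa_reviewer", 0.50, False),
]

def activity_based_cost(samples):
    """Roll activity costs up per activity and into VA / NVA buckets."""
    per_activity = {}
    buckets = defaultdict(float)
    for name, resource, hours, value_added in ACTIVITIES:
        cost = samples * hours * HOURLY_RATE[resource]
        per_activity[name] = cost
        buckets["value-added" if value_added else "non-value-added"] += cost
    return per_activity, dict(buckets)

per_activity, buckets = activity_based_cost(samples=200)
for name, cost in per_activity.items():
    print(f"{name:<20} ${cost:>9,.2f}")
print("---")
for bucket, cost in buckets.items():
    print(f"{bucket:<20} ${cost:>9,.2f}")
```

The value-added/non-value-added split is the point of the exercise: it makes visible how much of the per-sample cost is spent on work, like transcription, that a candidate 'to-be' design could eliminate.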

Figure 3: Top-level screen shot of the simulation model with three methods of examining the simulation statistics

Finally, simulations are more engaging than static workflow models and, therefore, more likely to evoke a response. This stimulates employee participation in identifying areas for improvement and developing new process solutions, and it promotes the innovative thinking that helps ensure acceptance of the proposed process changes.

Ultimately, simulation is an extremely valuable tool for optimizing business processes. It reduces experimentation time and the risk of costly field implementations of incorrect solutions by modeling and validating both process changes and the benefits of technical solutions or automation. It also reduces the time required to collect process metrics by employing an easy-to-use tool and methodology.

A case study

Taratec recently performed a lab business process improvement (BPI) assessment for a large pharmaceutical R&D organization with three labs. A small piece of this assessment serves as a simple case study of how modeling and simulation were applied in a lab environment. The organization had recently implemented both LIMS and CDS and had an aggressive schedule of follow-on integration and new technology projects planned. A lab BPI process profile assessment found that overall acceptance and utilization of the LIMS and CDS was low, resulting in a poor return on these recent investments. The results of this lab process profile offered valuable insight into current operations, but any changes made would have been trial-and-error implementations. To avoid the risk associated with trial-and-error changes, we employed modeling and simulation.

The first step was to develop a clear focus for the modeling effort; in this case, the purpose of the model was to test process and technology changes. The next step was to map the main activity flow: defining the basic process flow, sub processes, activities, resources and time durations, workflow sequence, business rules and behaviors. A simulation was then developed to flush out mechanical mistakes in the process, to tune the performance of the model, and to check the activity-based costing metrics necessary to ensure robustness of the model. This established an 'as-is' baseline model that could be used to measure improvements and to test 'what-if' process alternatives, such as improved or standardized processes and the elimination of non-value-added activities, as well as to demonstrate the benefits of integrating new technology like ELN to improve the efficiency of the process.

Figure 2 illustrates a simplified model of a raw materials lab process and sub processes for pre-lab, within-lab and post-lab activities. In this dynamic model, the computer simulates the flow of samples and information through the process. The model accounts for the random variations in how work is done and the way samples flow through the real-world lab. By employing discrete event simulation to capture the time-varying nature of the process under study, we were able to correlate the data produced by the model with measurements taken from the real processes. This provided a good degree of certainty that the model had adequately captured the essential features of the real process. This approach and simulation tool allowed us to integrate process mapping, hierarchical event-driven simulation and activity-based costing into a single modeling and simulation environment.

Figure 3 is a top-level screen shot of the model with three methods of examining the simulation results:

1. Dynamic metrics are updated as the simulation runs; we could define virtually any set of metrics, such as samples initiated, analyzed and approved.
2. At either side of the main graphic are two real-time plots that show the number of samples in the system and the number of QA resources busy. We could isolate any entity (in our case, a sample is an entity), resource or activity to collect statistics such as counts, cycle time and units busy.
3. Displayed to the left (background) is a standard numerical report that captures all of the performance and cost statistics the simulation model was specified to collect.
These reports include volumes processed, cycle time, resource utilization and activity-based costs.

Finally, the numerical data was analyzed for both the 'as-is' and a number of 'what-if' simulations.

Figure 4: Side-by-side comparison of key metrics from the 'as-is' and 'what-if' simulations

Figure 4 shows that sample volume was held constant across the 'as-is' and 'what-if' simulation models, while cycle time was reduced from 25 to 14 days and actual hands-on processing time was reduced by nearly one-quarter of a day. In addition, resources were freed up, adding potential for increased capacity. Perhaps more importantly, costs were reduced from $62K to $36K for this very small piece of the process alone. This clearly demonstrates how activity-based costing can be extremely valuable in truly understanding the price of hands-on work. These metrics also help us decide which other alternatives should be examined: for instance, what happens if the number of analysts is reduced from four to three, or the sample volume is increased by 30 percent? A sketch of this kind of comparison follows.
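The sketch below, reusing the hypothetical simulate() function from the earlier code sketch, shows the shape of such a side-by-side comparison: run the identical model under an 'as-is' baseline and the two 'what-if' questions just posed, and tabulate the same metrics for each. The scenario parameters are invented; the point is the comparison structure, not the numbers.

```python
# Side-by-side 'what-if' comparison in the spirit of Figure 4, reusing the
# simulate() sketch defined earlier. Scenario parameters are hypothetical.
SCENARIOS = {
    "as-is (4 analysts)":   dict(num_analysts=4, arrival_rate=1.0),
    "what-if: 3 analysts":  dict(num_analysts=3, arrival_rate=1.0),
    "what-if: +30% volume": dict(num_analysts=4, arrival_rate=1.3),
}

print(f"{'scenario':<22}{'avg cycle (h)':>15}{'utilization':>13}")
for name, params in SCENARIOS.items():
    cycle, util = simulate(seed=7, **params)   # same seed: fair comparison
    print(f"{name:<22}{cycle:>15.2f}{util:>12.0%}")
```

Running every scenario with the same random seed (common random numbers) keeps the comparison fair: differences in the metrics then reflect the design changes rather than sampling noise.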

Conclusion

In summary, as companies continue to drive process improvement deeper into their organizations and operations, it is clear that labs and their lab informatics projects will be expected to provide more measurable benefits to the business. One of the most effective approaches to meeting these expectations is process modeling and simulation. As other industries have already discovered, simulation clearly adds value by providing a structured, repeatable process for developing, evaluating and comparing proposed solutions and implementation strategies that is far superior to traditional static workflow mapping techniques. Simulation also provides exceptional value by supplying cost and time information to support the business case (activity-based costing), as well as the ability to visualize the 'to-be' state and to gain buy-in for changes, including modeling the impact of informatics technology options.


It is exactly this type of sophisticated, value-added approach that is needed in lab informatics to improve the industry's record of success and to support the renewed interest in process efficiency and improvement. Modeling and simulation have the potential to dramatically transform lab business processes from meandering, low-speed "cow paths" into high-velocity process "superhighways," while at the same time establishing the hard business metrics to justify where to spend informatics dollars. As one colleague put it, we don't want to transform the cow paths into superhighways if they'll only be used by the farmer and his cow.

References

1. Petrakis, John M.; Miller, Stuart M. "Lab BPI III: Realizing Your Vision: Using Process Simulation to Model Your Future State." www.taratec.com/about/response.taratec.asp?id=labvision_webrequest_download
2. Davies, Kevin. "Drug Costs Nearing $2 Billion, Warns Lilly Executive." Bio-IT World, August 11, 2006. http://www.healthitworld.com/newsitems/2006/august/08-11-06-drug-costs

Stuart Miller is Practice Director, LIMS & Lab Informatics, and John Petrakis is Director of Business Process Improvement Services at Taratec Development. They may be contacted at [email protected]
