Proceedings of the 39th Hawaii International Conference on System Sciences - 2006

Towards a Framework for Health Information Systems Evaluation

Maryati Mohd. Yusof, Ray J. Paul, Lampros K. Stergioulas
School of Information Systems, Computing and Mathematics, Brunel University, Uxbridge, Middlesex, UB8 3PH, UK
[email protected]; [email protected]

Abstract

The evaluation of Health Information Systems (HIS) is crucial to ensure their effective implementation and positive impact on healthcare delivery. A review of HIS evaluation studies indicated that current methods require improvement. To satisfy this requirement, a new framework is introduced for the evaluation of Information Systems (IS) in healthcare settings. The proposed HOT-fit framework (Human, Organization and Technology-fit) was developed after critically appraising the existing findings of HIS evaluation studies. It also builds on previous models of IS evaluation, in particular the IS Success Model and the IT-Organization Fit Model. HOT-fit incorporates the concept of fit between technology, human and organization. The paper also outlines a proposed methodology for evaluating a Fundus Imaging System in a primary care setting in the UK. The framework can potentially be used as a tool to conduct better and more comprehensive HIS evaluation.

1. Introduction

The provision of health care is increasingly shaped by the adoption of Health Information Systems (HIS). A HIS is a set of processes implemented to help enhance the efficiency and effectiveness of a healthcare organization in performing its functions and attaining its objectives. HIS range from simple systems, such as transaction processing systems, to complex systems, such as clinical decision support systems (CDSS). To ensure the effective implementation and positive impact of HIS on healthcare delivery, evaluation of these systems is crucial. HIS evaluation has been highlighted as one of the most important topics of Health Informatics research [8,18]. HIS evaluation is defined as "the act of measuring or exploring attributes of a HIS (in planning, development, implementation, or

operation), the result of which informs a decision to be made concerning that system in a specific context" [3]. Decision making in the design, development, purchase and management of HIS requires evaluation; undertaking such evaluation is therefore a challenge [32]. Evaluation serves a number of purposes. Given the unpredictable characteristics of IS in general, and the aim of improving clinical performance and patient outcomes in particular, evaluation is undertaken to understand system performance [42]. Essentially, the evaluation of health informatics applications has the potential to improve the quality of care and its costs and to determine the safety and effectiveness of HIS [28,42]. Evaluation is also performed to improve HIS by using past experience to identify more effective techniques or methods, investigate failures and learn from previous mistakes [15]. HIS evaluation has a number of barriers and problems that pose challenges to its evaluators [2,15,46]. Research in Health Informatics evaluation is still in its infancy, and what constitutes a 'good' HIS is unclear [2]. It is argued that while HIS are increasingly being developed, the number of published evaluations is limited [3,43]. In addition, the evaluation of HIS is difficult to perform, particularly in selecting a framework to apply and methods to use [3]. However, it seems that such problems regarding evaluation methods in Health Informatics can be overcome [2]. New methods and extensions are needed to improve the existing ones [32]. The organizational and social issues of HIS should be analyzed before undertaking evaluation, as they are the main components of such systems [37]. Moreover, technology, human and organization should fit with each other to realize the potential of HIS. Therefore, a comprehensive evaluation framework that addresses technology, human and organization and the fit between them is essential to ensure effective evaluation. More work is needed in these areas, as most existing evaluation studies of HIS focus on

0-7695-2507-5/06/$20.00 (C) 2006 IEEE


technical issues or clinical processes, which do not explain why HIS work well or poorly for a specific user in a specific setting [8,9,24,25,26,27]. The aim of this paper is to propose a new framework for HIS evaluation that incorporates comprehensive dimensions and measures of HIS effectiveness and of the fit between technological, human and organizational factors. The proposed framework, HOT-fit (Human, Organization and Technology-fit), will potentially be useful in conducting better and more thorough evaluation studies. It may also assist researchers and practitioners in unravelling the complexity of HIS evaluation. The new framework builds on previous models of IS evaluation, namely the IS Success Model [10,12] and the IT-Organization Fit Model [35]. In developing this model, problems and methods of HIS evaluation highlighted in the selected Health Informatics literature are discussed. Further, the aforementioned models from the field of Information Systems are presented to explore their applicability in improving those of Health Informatics. This paper is organized into five sections. The following section provides the theoretical background of the proposed framework, including a brief description of early approaches to HIS evaluation. The proposed evaluation framework for HIS is presented in section three. Section four presents the research methodology for this research in progress. Finally, conclusions and recommendations for further research are given in section five.

2. Theoretical Background

This paper reviews selected literature from Health Informatics and Information Systems. First, early approaches to HIS evaluation are analyzed. This analysis is followed by a discussion of the IS Success Model. The limitations of this model relate to human and organizational factors and the concept of fit; therefore, these factors and this concept are presented next. The proposed model combines the strengths of the IT-Organization Fit Model with the IS Success Model; thus, the latter model is subsequently put forward.

2.1 Early Approaches to HIS Evaluation

Evaluation can be performed at the human or organizational level. Evaluation of human factors covers a wide range of issues, including training, personnel, personnel attitudes, ergonomics and regulations affecting employment [23]. Beyond this individual unit of analysis, larger organizational

issues arise that relate to its nature, structure, culture and politics [23]. The organizational aspects of evaluation are classified into organizational structure and organizational environment. The organizational structure constitutes organizational nature, hierarchy and functional divisions. The external environment includes regulations governing hospitals, the type of population being served, epidemics that may occur, government policy concerning the financing of health care, and the supply of physicians. Approaches to HIS evaluation were developed based on different domains, including technical, sociological, economic, human and organizational. Here, a number of frameworks which were explicitly designed for HIS evaluation are reviewed to identify the evaluation dimensions and measures used to evaluate systems in a healthcare setting (see Table 1).

Table 1: Selected HIS evaluation frameworks

Framework                  | Technology                     | Human                    | Organization
4Cs (Kaplan, 1997)         | HIS and its development impact | Communication            | Control, Care, Context
CHEATS (Shaw, 2002)        | Technical                      | Human, Education, Social | Clinical, Organization, Administration
TEAM (Grant, et al., 2002) | IS based on management level   | Role                     | Structure
ITAM (Dixon, 1999)         | IT adoption                    | Individual user          | -

4Cs is developed from Social Interactionist Theory and stands for Communication (interaction within departments), Care (medical care delivery), Control (control in the organization) and Context (clinical setting) [23]. The evaluation measures of this framework could be made clearer, and the Control aspect needs further explanation. CHEATS is a generic framework for evaluating IT in healthcare that has six evaluation aspects: Clinical, Human and organizational, Educational, Administrative, Technical and Social [36]. CHEATS attempts to provide more comprehensive evaluation aspects and more specific measures, especially in the clinical aspect. However, the dimensions within some of the aspects, such as the technical and the human and organizational aspects, are still inadequate and need further work. Further, a global framework known as TEAM (Total Evaluation and Acceptance Methodology) is constructed based on systemic and model theories [19]. It has three


dimensions: Role, Time (evaluation phase) and Structure (strategic, tactical and operational management levels). The 3D structure of this model illustrates the components of system evaluation clearly. However, apart from the Role and Time aspects, the Structure aspect can be challenging: selecting evaluation measures that match the management level is difficult because the same measures can be categorized into more than one management level. As a whole, this framework is quite broad for evaluating a specific type of IS. Meanwhile, an IT implementation and evaluation framework for the individual user, known as the IT Adoption Model (ITAM), is constructed to study the individual user perspective and users' potential IT adoption [14]. From the individual user perspective, this framework includes comprehensive evaluation criteria and relationships among them. This framework is clearly insufficient for a wider scope of evaluation that involves the organizational aspect. Overall, these frameworks complement each other in that they each evaluate different aspects of HIS. Thus, these different aspects can be combined in a single framework to allow comprehensive evaluation studies. However, these frameworks do not provide explicit evaluation categories to the evaluator. Hence, more specific measures within the dimensions of each aspect can be defined to facilitate HIS evaluation. In the search for an evaluation framework with comprehensive, specific measures, two models have been identified as complementary in addressing the limitations of existing HIS evaluation frameworks, namely the IS Success Model and the IT-Organization Fit Model. This research will make use of the IS Success Model on account of its comprehensive, specific evaluation categories, extensive validation and applicability to HIS evaluation. A review of success determinants of Inpatient Clinical IS indicates that the success categories of this model can be used to assess a specific type of HIS [43]. As mentioned above, the IT-Organization Fit Model complements the IS Success Model by featuring the concept of fit and organizational factors. The IS Success Model is discussed in the next section, followed by the IT-Organization Fit Model.

2.2 IS Success Model

In order to organise the numerous studies on IS success categories, a comprehensive taxonomy is introduced [10]. A model is constructed which consists of six success categories or dimensions; they are linked causally and temporally, as success is

viewed as a dynamic process instead of a static state. The multidimensional relationships among the measures of IS success have been tested extensively in a number of IS studies [11]. Based on these studies, an updated version of this model is presented [12] (see Figure 1). The measures fall into six dimensions: System Quality (measures of the information processing system itself), Information Quality (measures of IS output), Service Quality (measures of technical support or service), Information Use (recipient consumption of the output of IS), User Satisfaction (recipient response to the use of the output of IS) and Net Benefits (the overall IS impact).

Figure 1: Information Systems Success Model (Source: DeLone and McLean, 2004)

2.2.1 System Quality. Studies of System Quality are often associated with system performance. Examples of system quality measures are ease of use, ease of learning, response time, usefulness, availability, reliability, completeness, system flexibility and easy access to help [4,13,29,38,41].

2.2.2 Information Quality. Measures of Information Quality usually focus on the information produced by the system, such as a report. Some criteria used for information quality are information accuracy, output timeliness, reliability, completeness, relevancy, legibility, availability and consistency [4,15,29,39,41]. Most information quality measures are subjective, as they are derived from the user perspective.

2.2.3 Service Quality. The new addition to the IS Success Model, Service Quality, is concerned with the overall support delivered by the service provider, regardless of whether the service is delivered by an internal department or outsourced to external vendors. Quick responsiveness, assurance, empathy and follow-up service are examples of service quality measures.


2.2.4 System Use. The use of information output, such as reports, appears to be one of the most frequent measures used to assess the success of IS. The use of a system in terms of system inquiries and system functions is another example of the measures in this success dimension. System use is concerned with the frequency and breadth of system usage. The actual use of a system as a measure of IS success refers to voluntary rather than mandatory use. Other issues pertinent to system use are the people who use it and their levels of use and training [4,29]. System use is also related to individual knowledge and belief [5]. These measures are related to human acceptance and resistance. Jiang et al. [21] regard resistance as an important factor in system success. As different types of systems are usually associated with a particular type of function and user, the reasons for resistance might differ among system types. Resistance can be viewed from one of the following theories: (1) people-oriented, (2) system-oriented and (3) interaction-oriented. People-oriented theory attributes resistance to a system to users' (groups' or individuals') internal factors. Personal characteristics such as age, gender, background, values and beliefs have been suggested as influencing an individual's attitude towards technology. System-oriented theory suggests that resistance results from system design factors or the relevant technology, including the user interface and system characteristics. Interaction theory explains resistance through the interaction between people and system factors; thus, the assessment of a system varies across settings and users. Job insecurity and fear are some examples of interaction resistance.

2.2.5 User Satisfaction. User satisfaction is often used to measure system success. It is subjective in nature, as it depends on whose satisfaction is measured. Some studies relate user satisfaction to perceived usefulness and users' attitudes towards IS. User satisfaction is defined as the overall evaluation of the user's experience in using the system and the system's potential impact.

2.2.6 Net Benefits. A system can benefit a single user, a group of users, an organization or an entire industry. Individual impact is the effect of information on the behaviour of the recipient. It is associated with performance as well as changes in user tasks, such as changes in work activity and improved productivity. Examples of individual impact measures include time efficiency, work efficiency and effectiveness, decision quality and error reduction [4,9]. Organizational impact is the effect of information on organizational performance. Cost

effectiveness and organizational performance are some examples of organizational impact measures. In comparison with existing HIS frameworks, the DeLone and McLean model illustrates clear, specific dimensions of IS success or effectiveness and the relationships between them. However, it does not include organizational factors that are pertinent to IS evaluation. Van der Meijden et al. [43] discovered that a number of measures, such as user involvement during system development and organizational culture, do not match any of the dimensions of the framework. The extension of this framework is recommended by adding organizational factors, their dimensions and clinical measures related to the healthcare domain.
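As an aid to reading (not part of the published model), the causal and feedback links among the six dimensions described in sections 2.2.1 to 2.2.6 can be sketched as a small directed graph; the dimension names are taken from the text, while the representation itself is purely this document's illustration.

```python
# Illustrative sketch only: the causal links of the updated IS Success
# Model as described above, held as a directed graph. The dimension
# names come from the text; the code is not part of the model itself.
IS_SUCCESS_LINKS = {
    "System Quality": ["System Use", "User Satisfaction"],
    "Information Quality": ["System Use", "User Satisfaction"],
    "Service Quality": ["System Use", "User Satisfaction"],
    "System Use": ["User Satisfaction", "Net Benefits"],
    "User Satisfaction": ["System Use", "Net Benefits"],
    "Net Benefits": ["System Use", "User Satisfaction"],  # feedback loop
}

def antecedents(dimension):
    """Return the dimensions that directly influence `dimension`."""
    return sorted(src for src, targets in IS_SUCCESS_LINKS.items()
                  if dimension in targets)

print(antecedents("Net Benefits"))  # -> ['System Use', 'User Satisfaction']
```

Such a representation makes the causal and temporal reading of the model explicit: a dimension's antecedents are the dimensions that must be assessed before its own level can be explained.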

2.3 Human and Organizational Issues

The importance of human and organizational factors in the development and implementation of IS has been advocated in the IS and Health Informatics literature. A study conducted by Brender et al. [8] indicates that 'soft issues', including methods, human and organisational issues, are emphasised more in health care than 'technical issues'. HIS are not useful unless clinicians use them, and clinicians will not use them if there are barriers that impede system use. Barriers to using HIS are also important to consider in HIS evaluation, as they explain the failure and success of these systems. Culture and process changes are reported to be barriers to the wider use of health care systems [28]. Studies cited in Anderson [5] identified a number of barriers to direct physician use of HIS, including a low level of expertise, lack of acceptance, lack of medical staff sponsorship and alteration of traditional workflow patterns. Examples of organisational challenges include hospital culture, such as being risk averse, reluctant to invest much in IT, and resistant to change [7,30]. In short, human and organizational factors are as important as technical issues with regard to system effectiveness [26]. Human, organizational and technical elements should also have a mutual alignment or 'fit' in order to ensure successful HIS implementation. It is crucial that HIS fit organizational aspects as well as align with work routines, management assumptions, patient care philosophies and users' needs, as the introduction of a system affects different dimensions of fit in complex ways [25]. Further, the alignment of organisation (objectives, structures and processes), technology and human is an important starting point in IT


implementation as it is one of the strategies that affect IT investment [45].

2.4 The Concept of Fit

A number of studies in health informatics have included the concept of 'fit' [1,6]. Southon et al. [39] found that a lack of fit among main organizational elements contributes to a large number of system failures in public health. The fit between technical, organizational and social factors is analysed to identify gaps between current health care systems and new system features [20]. Kaplan [25] shows that a poor fit between system developers' goals and clinicians' cultural values contributes to user reluctance to use CDSS. Executive support, understanding of the business, IT-business relations and leadership are identified as both enablers and inhibitors of the fit between IT and business [31].

2.4.1 IT-Organization Fit Model. Management in the 90s (MIT90s) is a well-known IT-organization fit model, which includes both internal and external elements of fit [35]. Figure 2 illustrates the concept of fit between the main organizational elements. Internal fit is accomplished by a dynamic equilibrium of organizational components, including business strategy, organizational structure, management processes, and roles and skills. External fit is achieved by formulating organizational strategy based on environmental trends and changes such as market, industry and technology. With this internal and external fit as its enabler, IT is expected to affect organizational performance and strategy.

Figure 2: The MIT90s (IT-Organization Fit Model) (Adapted from Scott Morton, 1991)

In order to realize the benefits of IT, three prerequisites are required for successful IT transformation. First, the organizational vision and the reasons behind it have to be clear to organizational members to prepare them for organizational changes and hence reduce the challenges in managing transformation. Second, corporate strategy (business and IT), information technology and organizational dimensions have to be aligned with each other. Third, a robust IT infrastructure, such as an electronic network and understood standards, should be in place within the organization. These three prerequisites, as well as the internal and external fit, may be used to identify problems in IT implementation. This model is relatively new and has not been extensively utilized in healthcare [40]. The model was also identified as being capable of identifying the main organizational elements that can affect IS, as well as emphasizing the essential alignment or fit between them. Moreover, the model is comprehensive, as it includes the following factors: technology (IT), human (Roles and Skills) and organization (Strategy, Structure and Management Processes). However, these factors can be categorized into more detailed dimensions to provide more specific evaluation dimensions. For instance, IT can be further classified into system quality and information quality, as proposed by DeLone and McLean [10]. Roles and skills can be associated with use and user satisfaction. Based on the strengths and limitations pointed out in both models, the IT-Organization Fit Model and the IS Success Model complement each other in presenting a comprehensive evaluation framework. Organizational factors, which are lacking in the IS Success Model, are featured in the IT-Organization Fit Model. Similarly, specific evaluation dimensions and measures, which are lacking in the IT-Organization Fit Model, are featured in the IS Success Model. Based on the two models explained above, a new evaluation framework is proposed. This framework is referred to as Human-Organization-Technology Fit (HOT-fit) and is presented in the next section.

3. Proposed Evaluation Framework

The proposed evaluation framework was developed after critically appraising the existing findings of HIS and IS evaluation studies. It also makes use of the IS Success Model in categorizing its evaluation factors, dimensions and measures. In addition, the IT-Organization Fit Model is used to incorporate the concept of fit between the evaluation factors: human, organization and


technology. The IS Success Model was extended by the addition of the following features, which are explained in the remainder of this section (see Figure 3):

  • Organizational factors and their dimensions: Structure and Environment.

  • Fit between technological, human and organizational factors.

  • Two-way relationships between these dimensions: Information Quality and System Use, Information Quality and User Satisfaction, Structure and Environment, Structure and Net Benefits, Environment and Net Benefits.

Figure 3: Human-Organization-Technology Fit (HOT-fit) framework

An HIS should fit the humans (stakeholders) and organizations that use it, based on their needs. Therefore, HIS should work according to human needs and should assist humans in performing their tasks. Similarly, humans should possess the appropriate knowledge and attitudes in order to be able to use HIS in performing their tasks. Likewise, healthcare organizations should be equipped with the appropriate technology and infrastructure in order to realize the potential of HIS. Further, healthcare organizations should have the capacity to prepare their staff members to adapt to any changes resulting from the uptake of HIS, in order to reduce the challenges in managing transformation. This can be achieved through strategy and management, such as leadership support, teamwork and effective communication, that are formed using the roles and skills of the staff. Moreover, organizational and IT plans should be aligned with each other to ensure that IT supports the organizational objectives. The fit between human, organization and technology is illustrated by the thick

arrows in Figure 3. Fit can be measured and analyzed using a number of measures defined in these three factors, including ease of use, usefulness, relevancy, attitude, training, user satisfaction, culture, planning, strategy, management and communication. Human, organization and technology comprise the IS, whose impacts are assessed as net benefits. These factors correspond to eight interrelated dimensions of HIS success: System Quality, Information Quality, Service Quality, System Use, User Satisfaction, Organizational Structure, Organizational Environment and Net Benefits. Examples of each dimension are shown in Table 2 (page 10). These dimensions influence each other in a temporal and causal way. System Quality, Information Quality and Service Quality singularly and jointly affect both System Use and User Satisfaction. Some of these relationships are two-way; for instance, System Use, which relies on user knowledge and training, can influence information quality, since a user's knowledge in using the system can affect the reports, images and prescriptions produced by the system. The level of System Use can affect the degree of User Satisfaction and vice versa. Similarly, Environment factors such as government policy and politics can affect organizational Structure, while factors in organizational Structure will affect the population served in the Environment. System Use and User Satisfaction are direct antecedents of Net Benefits; Net Benefits will in turn affect System Use and User Satisfaction. Similarly, organizational Structure and Environment are direct antecedents of Net Benefits, and Net Benefits will in turn have an impact on organizational Structure and Environment. Based on its comprehensive dimensions and outcome measures, the framework could be used to evaluate the performance, effectiveness and impact of HIS or IT in healthcare settings [33]. Effectiveness refers to the accomplishment of specific goals with accuracy and completeness as well as the correct utilisation of appropriate resources [17]. In this research, effectiveness is defined as the ability of a healthcare organization to continuously accomplish its goals using optimum resources within a specified time.
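To make the grouping of the eight dimensions concrete, the framework can be laid out as a simple data structure. This is a sketch for illustration only: the factor and dimension names follow the text, but the example measures listed per dimension are a small, arbitrary subset of those discussed in sections 3.1 to 3.4, not an evaluation instrument.

```python
# Illustrative sketch: the HOT-fit factors, their dimensions and a few
# example measures named in the text. The selection of measures is a
# small subset chosen for illustration, not an exhaustive instrument.
HOT_FIT = {
    "Technology": {
        "System Quality": ["ease of use", "response time", "reliability"],
        "Information Quality": ["accuracy", "timeliness", "relevancy"],
        "Service Quality": ["responsiveness", "assurance", "empathy"],
    },
    "Human": {
        "System Use": ["frequency of use", "training", "acceptance"],
        "User Satisfaction": ["perceived usefulness", "attitude"],
    },
    "Organization": {
        "Structure": ["culture", "planning", "management"],
        "Environment": ["government policy", "population served"],
    },
    "Net Benefits": {  # outcome dimension rather than an input factor
        "Net Benefits": ["efficiency", "error reduction", "clinical outcomes"],
    },
}

def measures_for(factor):
    """Flatten the example measures recorded under one evaluation factor."""
    return [m for dims in HOT_FIT[factor].values() for m in dims]

print(measures_for("Organization"))
```

An evaluator could extend such a structure with the full set of measures for a given study and use it as a checklist when designing interview guides or observation protocols.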

3.1 Technology

System quality in a healthcare setting measures the inherent features of HIS, including system performance and the user interface. Examples of measures are ease of use, ease of learning, response time, usefulness, availability, reliability, flexibility, access to technical support and security. Ease of use


assesses whether healthcare professionals regard HIS as satisfactory, convenient and pleasant to use. Availability refers to the up time of HIS, while flexibility is concerned with the ability of HIS to adapt to a health care setting and integrate with other systems. Further, it is also important to determine whether the system meets the needs of the projected users and fits the work patterns of the professionals for whom it is intended as well as the overall health system [4,36]. Measures of information quality are concerned with the information produced by HIS, including patient records, reports, images and prescriptions. Criteria that can be used for HIS information quality are information completeness, accuracy, legibility, timeliness, availability, relevancy, consistency, reliability, and data entry methods and quality. Service quality is concerned with the overall support delivered by the service provider of the HIS or technology, regardless of whether the service is delivered by an internal department of the healthcare organisation or outsourced to external providers. Service quality can be measured through quick responsiveness, assurance, empathy and follow-up service.

3.2 Human

System Use refers to the frequency and breadth of HIS inquiries and functions. System Use also relates to the people who use the system and their levels of use, training, knowledge, expectation and acceptance or resistance. Knowledge is concerned with computer literacy and skills [5,9]. Expectation refers to the anticipation of improved patient care delivery from the use of HIS [5]. Resistance can be assessed from the following perspectives: user internal factors; system characteristics and technology factors; and the interaction between human and system factors. User satisfaction is the overall evaluation of the user's experience in using HIS and the potential impact of HIS. User satisfaction can be related to perceived usefulness and users' attitudes towards HIS, which are influenced by users' personal characteristics.

3.3 Organization

The nature of a healthcare institution can be examined from its structure and environment. Organizational structure consists of its nature, including type and size (number of beds), culture, politics, hierarchy, autonomy, planning and control systems, strategy, management and communication. Leadership, top management support and medical staff sponsorship are also important measures of HIS success [5,39]. The environment of a healthcare organization can be analysed through its financing source, government, politics, localization, competition, inter-organisational relationship, population served, and communication.

3.4 Net Benefits

Net benefits capture the balance of positive and negative impacts on users, who include clinicians, managers, IT staff, system developers, hospitals and the entire healthcare sector. Net benefits can be assessed using direct benefits, job effects, efficiency, effectiveness, error reduction, communication, clinical outcomes and cost. In the healthcare context, clinical outcomes can be used as a means of measurement. Examples of these measures include cost reduction due to fewer medication errors and Adverse Drug Effects (ADE); improved efficiency in patient care delivery, specifically pertaining to tests and drug orders; increased use of generic drug brands; and changes in the number of consultations and the length of waiting lists [9,44]. Clinical outcomes are also measured through two criteria: morbidity (the rate of incidence of a disease) and mortality (the death rate). Apart from these quantitative measures, clinical impacts can also be assessed qualitatively using measures such as quality of care and impact on patient care and communication, for example a change in communication style or the facilitation of information access [36].

4. Proposed Research Methodology

The planned approach is to use a subjectivist case study strategy with qualitative methods. A formative evaluation will be undertaken on the adoption of a Fundus Imaging System (FIS) in a primary care setting of the National Health Service (NHS) in the UK, chosen for its relative importance in diabetic care. A subjectivist approach will be employed in order to gain an extensive understanding of the healthcare context surrounding the FIS through a detailed, insightful explanation of the study [15]. Qualitative methods will be employed to generate a fuller description of healthcare settings and their cultural issues and to understand why systems function well or poorly in a particular setting. Further, a single, in-depth case study will be undertaken to obtain a comprehensive view and understanding of the development process of the FIS. A number of data collection methods will be employed, including in-depth interviews, participant observation and document/artefact analysis.


Proceedings of the 39th Hawaii International Conference on System Sciences - 2006

This research approach to FIS evaluation consists of six iterative phases: problem identification, development of an initial evaluation framework, selection of research strategy and methods, system evaluation, dissemination of outcomes and refinement of the evaluation framework [15,23]. The first three phases have been completed. Evaluation problems (issues, questions and concerns) were identified through a literature review as well as through observations made during an immersion period. The immersion was carried out to set the general context of the research and to establish rapport with the relevant stakeholders; initial data were also gathered at this stage. An initial evaluation framework was then constructed based on the findings of the first phase. This framework will be used as a guideline in the later phase, the evaluation of the FIS. The research strategy and methods were selected based on the research problem.

In the evaluation phase, the proposed evaluation framework will be appraised using the FIS case study. A pilot study will be conducted prior to the actual case study in order to improve the planning, monitoring and evaluation process of the research. Participant observations of daily clinical routines and meetings will take place in different units, departments and clinics. During observations and interviews, the individuals involved with the system, including users, clinicians and IT staff, will be asked to document or recall their system use and patient pathways. Patients will be asked about their perceptions of the system. Data will be collected both on planned occasions and spontaneously, in a number of iterative cycles, and will be recorded and interpreted based on the expert knowledge of the researchers. Four techniques will be used to analyse the results: coding; analytic memos such as reflection notes; displays such as matrices, flowcharts and concept maps; and contextual and narrative analysis [22].
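A minimal sketch of the first of these analysis techniques, coding, is given below. All transcript names and codes are invented for illustration; a first analytic step is simply tallying how often each code occurs across data sources, before building display matrices:

```python
# Hypothetical sketch (sources and codes are invented, not from the study):
# tallying qualitative codes across interview and observation records,
# a preparatory step for building the display matrices mentioned above.
from collections import Counter

# Each record has been tagged with one or more analytic codes.
coded_segments = [
    {"source": "clinician-interview-1", "codes": ["system use", "training"]},
    {"source": "IT-staff-interview-1",  "codes": ["service quality"]},
    {"source": "observation-note-3",    "codes": ["system use", "user satisfaction"]},
]

# Count every occurrence of every code across all records.
code_counts = Counter(c for seg in coded_segments for c in seg["codes"])
print(code_counts.most_common())
# [('system use', 2), ('training', 1), ('service quality', 1), ('user satisfaction', 1)]
```

In practice a dedicated qualitative analysis package would be used, but the underlying frequency view is the same.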
The evaluation plan and preliminary findings will be presented at a project stakeholders' meeting. The purposes of this presentation are to make further decisions about the system and its evaluation, to gather feedback and comments, to validate the initial findings, and to shape the evaluation plan and system development efforts. Finally, the evaluation will be improved and refined based on the ongoing literature review and the case study results. The potential bias inherent in a qualitative research approach will be mitigated by conducting a reliability test and data triangulation. Reliability will be achieved through detailed documentation of procedures and appropriate record keeping [34]. Triangulation will be accomplished through the use of multiple pieces of evidence from different sources to confirm the same fact or finding [47]. Participant validation (in which the interviewees review and approve their results) will also be used to reduce bias.

5. Conclusions

This paper has identified the problems, reviewed the existing methods and proposed a new framework for HIS evaluation. In the search for an appropriate, comprehensive approach to HIS evaluation, a number of existing frameworks for IS evaluation in the Health Informatics and IS literature were analysed. The review findings suggest that existing methods of HIS evaluation need further improvement. The strengths and limitations of these frameworks were discussed and used as a basis for the newly proposed framework, HOT-fit. In addition to the literature review, this framework builds on the IS Success Model and the IT-Organization Fit Model. HOT-fit addresses the essential components of IS, namely human, organization and technology, and the fit between them. In order to validate its usefulness, the framework needs to be tested in clinical settings. Findings from the fieldwork could then be used for its further improvement and refinement. The framework should be applied flexibly, depending on context and purpose; emphasis can be placed on the most important dimensions and measures.

The contribution of this paper is that it brings together disparate evaluation studies from both the Health Informatics and IS literature to provide a comprehensive picture of the state of the art, as well as the research needs, of HIS evaluation. It analyses a number of existing frameworks and models of evaluation in HIS and IS, and demonstrates the applicability of IS models to HIS evaluation. Moreover, it introduces a new framework for HIS evaluation that combines dimensions and measures from current evaluation frameworks in Health Informatics, models in IS, and a review of both literatures. Although focused on a specific setting, the evaluation framework of this study will potentially be useful for researchers and practitioners in conducting better, more comprehensive evaluation studies of other HIS or IT applications in healthcare settings.

6. References

[1] Aarts, J., Peel, V., Wright, G., "Organizational issues in health informatics: a model approach," International Journal of Medical Informatics, 52, 1998, 235-242.



[2] Ammenwerth, E., Graber, S., Hermann, G., Burkle, T., Konig, J., "Evaluation of health information systems: problems and challenges," International Journal of Medical Informatics, 71, 2003, 125-135.
[3] Ammenwerth, E., Brender, J., Nykanen, P., Prokosch, H.-U., Rigby, M., Talmon, J., HIS-EVAL Workshop Participants, "Visions and strategies to improve evaluation of health information systems: Reflections and lessons based on the HIS-EVAL workshop in Innsbruck," International Journal of Medical Informatics, 73, 2004, 479-491.
[4] Anderson, J.G., Aydin, C.E., "Overview: Theoretical Perspectives and Methodologies for the Evaluation of Health Care Information Systems," in Evaluating Health Care Information Systems: Methods and Applications, Anderson, J.G., Aydin, C.E., Jay, S.J., Eds. Thousand Oaks, CA: Sage, 5-29, 1994.
[5] Anderson, J.G., "Clearing the way for physicians' use of clinical information systems," Communications of the ACM, 40, 1997, 83-90.
[6] Berg, M., "Patient care information systems and health care work: a sociotechnical approach," International Journal of Medical Informatics, 55, 1999, 87-101.
[7] Bottles, K., "Critical Choices Face Healthcare in How to Use Information Technology," Medscape General Medicine, 1, 1999.
[8] Brender, J., Nohr, C., McNair, P., "Research needs and priorities in health informatics," International Journal of Medical Informatics, 58-59, 2000, 257-289.
[9] Coiera, E., Guide to Health Informatics, 2nd ed. Hodder Arnold, 2003.
[10] DeLone, W.H., McLean, E.R., "Information Systems Success: The Quest for the Dependent Variable," Information Systems Research, 3, 1992, 60-95.
[11] DeLone, W.H., McLean, E.R., "Information systems success revisited," Proceedings of the 35th Hawaii International Conference on System Sciences (HICSS), Big Island, HI, USA, 2002.
[12] DeLone, W.H., McLean, E.R., "Measuring e-commerce success: applying the DeLone & McLean Information Systems Success Model," International Journal of Electronic Commerce, 9, 2004, 31-47.
[13] Doebbeling, B., "Information Technology and Primary Care at VA: Interdisciplinary Partnership Opportunities for Providers, Managers, and Researchers," Health Service Research & Development Series. Retrieved 21 July, 2004 from http://www.hsrd.research.va.gov/publications/internal/forum10_03.pdf. 2003.
[14] Dixon, D.R., "The behavioral side of information technology," International Journal of Medical Informatics, 56, 1999, 117-123.
[15] Friedman, C.P., Wyatt, J.C., Evaluation Methods in Medical Informatics. New York: Springer-Verlag, 1997.
[16] Gawande, A.A., Bates, D.W., "The Use of Information Technology in Improving Medical Performance - Part II. Physician-Support Tools," Medscape General Medicine, 2, 2000.
[17] George, B., "A Framework for IT Evaluation Research," Americas Conference on Information Systems (AMCIS 2001), USA, 2001.
[18] Giuse, D.A., Kuhn, K.A., "Health information systems challenges: the Heidelberg conference and the future," International Journal of Medical Informatics, 69, 2003, 105-114.
[19] Grant, A., Plante, I., Leblanc, F., "The TEAM methodology for the evaluation of information systems in biomedicine," Computers in Biology and Medicine, 32, 2002, 195-20.
[20] Heeks, R., Mundy, D., Salazar, A., "Why Health Care Information Systems Succeed or Fail," Institute for Development Policy and Management, University of Manchester. Retrieved November 11, 2003 from http://www.man.ac.uk/idpm/idpm_dp.htm#isps_wp. 1999.
[21] Jiang, J.J., Muhanna, W.A., Klein, G., "User resistance and strategies for promoting acceptance across system types," Information & Management, 37, 2000, 25-36.
[22] Kaplan, B., Maxwell, J., "Qualitative Research Methods for Evaluating Computer Information Systems," in Evaluating Health Care Information Systems: Methods and Applications, Anderson, J.G., Aydin, C.E., Jay, S.J., Eds. Thousand Oaks, CA: Sage, 45-67, 1994.
[23] Kaplan, B., "Organizational Evaluation of Medical Information Resources," in Evaluation Methods in Medical Informatics, Friedman, C.P., Wyatt, J.C., Eds. New York: Springer-Verlag, 255-280, 1997.
[24] Kaplan, B., "Evaluating informatics applications - clinical decision support systems literature review," International Journal of Medical Informatics, 64, 2001a, 15-37.
[25] Kaplan, B., "Evaluating informatics applications - some alternative approaches: theory, social interactionism, and call for methodological pluralism," International Journal of Medical Informatics, 64, 2001b, 39-56.
[26] Kaplan, B., Shaw, N., "People, organizational and social issues: Evaluation as an Exemplar," Yearbook of Medical Informatics 2002, 2002, 63-76.
[27] Kaplan, B., Shaw, N.T., "Future directions in evaluation research: people, organizational, and social issues," Methods of Information in Medicine, 43, 2004, 215-231.
[28] Kuhn, K.A., Giuse, D.A., "From Hospital Information Systems to Health Information Systems - Problems, Challenges, Perspective," Yearbook of Medical Informatics 2001, 2001, 63-76.
[29] Lippeveld, T., "Routine Health Information Systems: The Glue of a Unified Health System," Workshop on Issues and Innovation in Routine Health Information in Developing Countries, Bolger Center for Leadership Development, Potomac, Maryland, 2001.
[30] Lorenzi, N.M., Riley, R.T., "Organizational issues = change," International Journal of Medical Informatics, 69, 2003, 197-203.
[31] Luftman, J., "Assessing Business Alignment Maturity," Communications of AIS, 4, 2000, 1-51.


[32] Moehr, J.R., "Evaluation: salvation or nemesis of medical informatics?" Computers in Biology and Medicine, 32, 2002, 113-125.
[33] Roderer, N., "Outcome measures in clinical information systems evaluation," Medinfo 2004, San Francisco, CA, USA, 2004.
[34] Rowley, J., "Using Case Studies in Research," Management Research News, 25, 2002, 16-27.
[35] Scott Morton, M.S., The Corporation of the 1990s. New York: Oxford University Press, 1991.
[36] Shaw, N.T., "'CHEATS': a generic information communication technology (ICT) evaluation framework," Computers in Biology and Medicine, 32, 2002, 209-220.
[37] Sittig, D.F., Hazlehurst, B.L., Palen, T., Hsu, J., Jimison, H., Hornbrook, M.C., "A Clinical Information System Research Agenda for Kaiser Permanente," The Permanente Journal, 6, 2002.
[38] Smith, J., Health Management Information Systems: A Handbook for Decision Makers. Buckingham: Open University Press, 2000.
[39] Southon, F.C.G., Sauer, C., Grant, C.G., "Information Technology in Complex Health Services: Organizational Impediments to Successful Technology Transfer and Diffusion," Journal of the American Medical Informatics Association, 4, 1997, 112-124.
[40] Southon, F.C.G., Sauer, C., Dampney, K., "Lessons from a failed information systems initiative: issues for complex organisations," International Journal of Medical Informatics, 55, 1999, 33-46.
[41] Stoop, A.P., Berg, M., "Integrating Quantitative and Qualitative Methods in Patient Care Information Systems Evaluation: Guidance for the Organizational Decision Maker," Methods of Information in Medicine, 42, 2004, 458-462.
[42] Van Bemmel, J.H., Musen, M.A., Eds., Handbook of Medical Informatics. Heidelberg: Springer-Verlag, 1997.
[43] Van Der Meijden, M.J., Tange, H.J., Troost, J., Hasman, A., "Determinants of Success of Inpatient Clinical Information Systems: A Literature Review," Journal of the American Medical Informatics Association, 10, 2003, 235-243.
[44] van't Riet, A., Berg, M., Hidemma, F., Sol, K., "Meeting patients' needs with patient information systems: potential benefits of qualitative research methods," International Journal of Medical Informatics, 64, 2001, 1-14.
[45] Willcocks, L., "Managing technology evaluation: techniques and processes," in Strategic Information Management: Challenges and Strategies in Managing Information Systems, Galliers, R.D., Baker, B.S.H., Eds. Oxford: Butterworth-Heinemann, 365-381, 1994.
[46] Wyatt, J.C., Wyatt, S.M., "When and how to evaluate health information systems?" International Journal of Medical Informatics, 69, 2003, 251-259.
[47] Yin, R.K., Case Study Research: Design and Methods, vol. 5, 3rd ed. Thousand Oaks: Sage Publications, 2003.

Table 2: Examples of the evaluation measures of the proposed framework

Technology
  System Quality: Ease of use; Ease of learning; Response time; Usefulness; Availability; Reliability; Flexibility; Access to technical support; Security
  Information Quality: Completeness; Accuracy; Legibility; Timeliness; Availability; Relevancy; Consistency; Reliability; Data entry methods; Quality
  Service Quality: Quick responsiveness; Assurance; Empathy; Follow-up service

Human
  System Use: Level of use (frequency, duration); Attitude; Expectations/beliefs; Knowledge/expertise; Acceptance; Resistance/reluctance; Training
  User Satisfaction: Perceived usefulness; User satisfaction

Organisation
  Structure: Nature; Culture; Planning; Strategy; Management; Autonomy; Communication; Leadership; Top management support; Medical sponsorship
  Environment: Financing source; Government; Politics; Localization; Competition; Inter-organisational relationship; Population served; Communication

Net Benefits: Direct benefits; Job effects; Efficiency; Effectiveness; Error reduction; Communication; Clinical outcomes; Cost
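As an illustrative sketch only, the HOT-fit dimensions and a subset of the example measures of Table 2 can be represented as a nested data structure from which an evaluation instrument could be generated. The `measures_for` helper is an invented convenience, not part of the framework itself:

```python
# Illustrative sketch: the HOT-fit evaluation dimensions of Table 2 as a
# nested dictionary (only a subset of measures is shown for brevity).

HOT_FIT = {
    "Technology": {
        "System Quality": ["Ease of use", "Response time", "Reliability"],
        "Information Quality": ["Completeness", "Accuracy", "Timeliness"],
        "Service Quality": ["Quick responsiveness", "Assurance", "Empathy"],
    },
    "Human": {
        "System Use": ["Level of use", "Attitude", "Training"],
        "User Satisfaction": ["Perceived usefulness", "User satisfaction"],
    },
    "Organization": {
        "Structure": ["Culture", "Planning", "Top management support"],
        "Environment": ["Financing source", "Politics", "Population served"],
    },
    "Net Benefits": {
        "Net Benefits": ["Efficiency", "Error reduction", "Clinical outcomes"],
    },
}

def measures_for(dimension: str) -> list[str]:
    """Flatten all example measures under one HOT-fit dimension."""
    return [m for measures in HOT_FIT[dimension].values() for m in measures]

print(measures_for("Human"))
# ['Level of use', 'Attitude', 'Training', 'Perceived usefulness', 'User satisfaction']
```

Such a structure makes it straightforward to emphasise or omit dimensions depending on the evaluation context, as the framework intends.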
