E-book: Handbook of Evaluation Methods for Health Informatics

(Institute of Health Science and Technology, Aalborg University, Aalborg, Denmark)
  • Length: 368 pages
  • Publication date: 17-Jan-2006
  • Publisher: Academic Press Inc
  • Language: English
  • ISBN-13: 9780080533452
  • Format: PDF+DRM
  • Price: 69,15 €*
  • * The price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software to read it. You must also create an Adobe ID (more information here). The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install this free app: PocketBook Reader (iOS / Android).

    To read on a PC or Mac, install Adobe Digital Editions. (This is a free application designed specifically for reading e-books; it should not be confused with Adobe Reader, which is probably already installed on your computer.)

    This e-book cannot be read on an Amazon Kindle.

This Handbook provides a complete compendium of methods for the evaluation of IT-based systems and solutions within healthcare. The emphasis is entirely on assessment of the IT system within its organizational environment. The author provides a coherent and complete assessment of methods addressing interactions with, and effects of, technology at the organizational, psychological, and social levels.

It offers an explanation of the terminology and theoretical foundations underlying the methodological analysis presented here. The author carefully guides the reader through the process of identifying methods that correspond to specific information needs and to the conditions for carrying out the evaluation study. The Handbook takes a critical view by focusing on assumptions for application and the tacit, built-in perspectives of the methods, as well as their perils and pitfalls.

* Collects a number of evaluation methods for medical informatics
* Addresses metrics and measures
* Includes an extensive list of annotated references, case studies, and a list of useful Web sites

Reviews

"...informative from a conceptual and analytical perspective, elucidating assessment and evaluation of systems from the user's point of view...this handbook is quite comprehensive in its coverage of concepts related to IT assessment." --DOODY ENTERPRISES, INC

"...a good general overview of evaluation and different evaluation techniques. We would encourage readers to buy this book especially if they want a reference guide of evaluation techniques." --Madhu Reddy, College of Information Sciences and Technology, The Pennsylvania State University, INFORMATION PROCESSING & MANAGEMENT

Table of Contents

Note to the Reader ix
The Aim of the Handbook ix
Target Group x
Criteria for Inclusion xii
Acknowledgments xiii
Additional Comments xiv
PART I: INTRODUCTION 1(50)
Introduction 3(6)
What Is Evaluation? 3(3)
Instructions to the Reader 6(1)
Metaphor for the Handbook 7(2)
Conceptual Apparatus 9(20)
Evaluation and Related Concepts 9(3)
Definitions 9(2)
Summative Assessment 11(1)
Constructive Assessment 11(1)
Methodology, Method, Technique, and Framework 12(4)
Method 13(1)
Technique 13(1)
Measures and Metrics 14(1)
Methodology 14(1)
Framework 15(1)
Quality Management 16(2)
Perspective 18(7)
Example: Cultural Dependence on Management Principles 19(1)
Example: Diagramming Techniques 20(1)
Example: Value Norms in Quality Development 20(1)
Example: Assumptions of People's Abilities 21(1)
Example: The User Concept 21(1)
Example: The Administrative Perspective 22(1)
Example: Interaction between Development and Assessment Activities 23(1)
Example: Interaction between Human and Technical Aspects 24(1)
Evaluation Viewed in the Light of the IT System's Life Cycle 25(4)
Types of User Assessments of IT-Based Solutions 29(8)
Types of User Assessment during the Phases of a System's Life Cycle 31(3)
The Explorative Phase 31(1)
The Technical Development Phase 32(1)
The Adaptation Phase 33(1)
The Evolution Phase 34(1)
Assessment Activities in a Holistic Perspective 34(3)
Choosing or Constructing Methods 37(14)
How Do You Do It? 37(7)
Where in Its Life Cycle Is the IT Project? 37(1)
What Is the Information Need? 38(1)
Establishing a Methodology 39(1)
Choosing a Method 40(2)
Choosing Metrics and Measures 42(1)
Execution of the Method 42(1)
Interpreting Results 43(1)
From Strategy to Tactics: An Example 44(1)
Another (Abstract) Example 45(1)
A Practical Example of a Procedure 46(1)
Planning at the Strategic Level 46(1)
Planning at the Tactical Level 46(1)
Planning at the Operational Level 47(1)
Frame of Reference for Assessment 47(1)
Perils and Pitfalls 48(3)
PART II: METHODS AND TECHNIQUES 51(192)
Introduction 53(8)
Signature Explanations 54(4)
Application Range within the IT System's Life Cycle 54(2)
Applicability in Different Contexts 56(1)
Type of Assessment 57(1)
Use of Italics in the Method Descriptions 58(1)
Structure of Methods' Descriptions 58(3)
Overview of Assessment Methods 61(12)
Overview of Assessment Methods: Explorative Phase 61(3)
Overview of Assessment Methods: Technical Development Phase 64(1)
Overview of Assessment Methods: Adaptation Phase 65(3)
Overview of Assessment Methods: Evolution Phase 68(4)
Other Useful Information 72(1)
Descriptions of Methods and Techniques 73(154)
Analysis of Work Procedures 73(5)
Assessment of Bids 78(7)
Balanced Scorecard 85(3)
BIKVA 88(3)
Clinical/Diagnostic Performance 91(5)
Cognitive Assessment 96(6)
Cognitive Walkthrough 102(4)
Delphi 106(3)
Equity Implementation Model 109(2)
Field Study 111(5)
Focus Group Interview 116(4)
Functionality Assessment 120(5)
Future Workshop 125(3)
Grounded Theory 128(4)
Heuristic Assessment 132(3)
Impact Assessment 135(7)
Interview 142(5)
KUBI 147(2)
Logical Framework Approach 149(5)
Organizational Readiness 154(2)
Pardizipp 156(3)
Prospective Time Series 159(4)
Questionnaire 163(9)
RCT, Randomized Controlled Trial 172(8)
Requirements Assessment 180(5)
Risk Assessment 185(3)
Root Causes Analysis 188(2)
Social Networks Analysis 190(2)
Stakeholder Analysis 192(4)
SWOT 196(3)
Technical Verification 199(5)
Think Aloud 204(3)
Usability 207(8)
User Acceptance and Satisfaction 215(4)
Videorecording 219(3)
WHO: Framework for Assessment of Strategies 222(5)
Other Useful Information 227(16)
Documentation in a Situation of Accreditation 227(5)
Measures and Metrics 232(6)
Standards 238(5)
PART III: METHODOLOGICAL AND METHODICAL PERILS AND PITFALLS AT ASSESSMENT 243(82)
Background Information 245(4)
Perspectives 246(3)
Approach to Identification of Pitfalls and Perils 249(4)
Framework for Meta-Assessment of Assessment Studies 253(68)
Types of (Design) Strengths 257(33)
Circumscription of Study Objectives 257(4)
Selecting the Methodology/Method 261(6)
Defining Methods and Materials 267(4)
(User) Recruitment 271(2)
(Case) Recruitment 273(3)
The Frame of Reference 276(8)
Outcome Measures or End-Points 284(5)
Aspects of Culture 289(1)
Types of (Experimental) Weaknesses 290(23)
The Developers' Actual Engagement 291(2)
Intra- and Interperson (or -Case) Variability 293(2)
Illicit Use 295(1)
Feed-back Effect 296(1)
Extra Work 297(1)
Judgmental Biases 298(2)
Postrationalization 300(1)
Verification of Implicit Assumptions 301(1)
Novelty of the Technology--Technophile or Technophobe 302(1)
Spontaneous Regress 303(1)
False Conclusions 303(1)
Incomplete Studies or Study Reports 304(3)
Hypothesis Fixation 307(4)
The Intention to Treat Principle 311(2)
Impact 313(1)
Types of Opportunities 313(4)
Retrospective Exploration of (Existing) Data Material 314(1)
Remedying Problems Identified -- beyond the Existing Data Material 315(2)
Types of Threats 317(4)
Compensation for Problems 317(1)
Pitfalls and Perils 318(1)
Validity of the Study Conclusion 319(2)
Discussion 321(4)
A Meta-View on the Study of Pitfalls and Perils 322(3)
LIST OF ABBREVIATIONS 325(2)
LIST OF REFERENCES 327(20)
Annotated, Generally Useful References, Including Case Studies 337(6)
Annotated World Wide Web Links 343(4)
Index 347


Jytte Brender McNair has an R&D background of fifteen years in a university hospital, nine years as an industrial researcher, and fifteen years as a full-time university researcher. Her expertise and experience cross-fertilise an M.Sc. in biochemistry (Copenhagen University, 1973), an M.Sc. in computer science (Copenhagen University, 1991), and a European Doctorate & PhD in Medical Informatics (Aalborg University, Technology & Science Faculty, 1997). Her most recent position, held since October 1995, was as a full-time researcher at the Department of Health Science and Technology, Aalborg University, Aalborg, Denmark.

Her research career focused on the theoretical and practical aspects of Quality Management and Technology Assessment, covering all aspects of evaluation and quality & risk management, ranging from constructive evaluation (dynamic, self-reflective, purpose-driven and corrective evaluation) to holistic analysis of organisational and behavioural aspects. Her expertise spans the breadth of organisational change and evolution, the theoretical aspects of the quality of semantic aspects of medical knowledge, and the modelling of the architectural logic of organisations. A computer scientist at the anthropocentric end of the scale, she is interested in all things human-centred, as well as in the multifaceted realm of asymmetric abstraction.

Prior to the Mereon Matrix book, her major recent publication was a handbook of methods for constructive evaluation of IT-based solutions [Brender 2006], emphasising the hidden aspects of methods, their assumptions for application, and their epistemological nature, as well as potential pitfalls and perils leading to bias. It includes a framework for meta-analysis of evaluation studies that is dedicated to pinpointing the weaknesses of such studies while scrutinizing experimental biases.

In August 2011 she took early retirement, becoming a professor emeritus, to focus on applications of the Mereon Matrix as a template for modelling information; more about the work related to the structure of Mereon can be found at www.mereon.org. Jytte's interest here is information modelling, striving to understand the internal workings of systems as systems, both social and biological. Her most recent significant contribution is modelling human molecular genetics (including clinical genetics) using the Mereon Matrix as the template information model. In this, she has drawn on her original training as a biochemist, together with her experience in computer science and health informatics, to develop the explanatory model in [Dennis et al. 2013] with her husband, Peter McNair, and others on the Mereon team.

Her professional experience ranges from project management and technical coordination, through global (project) quality and risk management, to task leadership in large multinational, cross-disciplinary R&D projects involving multiple teams of highly skilled academics.

She is the author or co-author of seven books, 68 publications in scientific peer-reviewed journals and books (about a third of them invited articles), more than 80 technical reports (almost all peer-reviewed), and 75 presentations with proceedings/abstracts (including 34 invited and 3 keynote presentations). She is also (co-)editor of seven proceedings of international congresses and workshops, including an interactive CD-ROM and one special issue of a scientific journal.